

TELPAS: Texas English Language Proficiency Holistic Assessment Critique


P. Bell

Dallas Baptist University



Texas English Language Proficiency Assessment Critique

According to the Technical Digest (2008), the Texas English Language Proficiency Assessment System (TELPAS) began in 1995 as a measure to "evaluate the progress of LEP students eligible by state law for exemption from the state-mandated assessments on the basis of limited English proficiency" (p. 37). The Technical Digest (2008) also notes that the TELPAS idea gained serious attention from the Texas Education Agency (TEA) and its board members. In time, the committee researched, created, and implemented the Reading Proficiency Test in English (RPTE) during the 1999-2000 school year, and the holistically rated components of the Texas Observational Protocol (TOP) followed in 2004. Today, both tests are collectively referred to as TELPAS. Although the two components are administered at different times, the results from the TELPAS holistically rated component often serve as an indicator for the TELPAS reading comprehension component. Furthermore, the Technical Digest (2008) suggests that there is a direct correlation between the TELPAS reading comprehension test and the TAKS test (see Table 1). According to TEA (2008), TELPAS uses test booklets, answer sheets, technical manuals, administration manuals, online training, and campus training to serve both students and test administrators throughout the administration process. TEA (2008) mandates that all districts are responsible for "developing a local schedule to administer TELPAS assessment during a four-week TELPAS testing window from March 17 – April 11" (p. 3).
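The passage cited does not reproduce the statistic behind this correlation claim. One standard way to quantify such a relationship, offered here purely as an illustration rather than as the measure the Technical Digest reports, is the Pearson correlation coefficient between paired scores on the two assessments:

r = \frac{\sum_{j=1}^{n} (x_j - \bar{x})(y_j - \bar{y})}{\sqrt{\sum_{j=1}^{n} (x_j - \bar{x})^{2}} \, \sqrt{\sum_{j=1}^{n} (y_j - \bar{y})^{2}}}

where x_j and y_j would be student j's TELPAS reading and TAKS reading/ELA results, \bar{x} and \bar{y} the corresponding means, and n the number of students who took both assessments. A value of r near 1 would indicate the direct, positive relationship the Technical Digest describes.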

According to the Technical Digest (2008), the TELPAS test is designed, under the oversight of district administrators, to measure and monitor the growth of English language learners by tracking adequate yearly growth, and it serves as a means of determining a student's learning index (p. 43). TEA (2008) also notes that by working with many experts, including bilingual/ESL consultants, state assessment directors, research institutions, and observational assessment specialists, the agency was able to develop criteria known as Proficiency Level Descriptors (PLDs) (p. 33). According to TEA (2008), PLDs are specific language traits used to determine and measure very specific stages of language development. Thus, a rubric for listening, speaking, and writing was created to identify language characteristics and traits that hold true for any child at any grade level in the scheme of language acquisition (p. 3).

According to the TEA (2008) instructional manual, the items on the PLDs are broken into four categories: Beginning, Intermediate, Advanced, and Advanced High. Moreover, there is a separate rubric for listening, speaking, and writing (p. 38). The listening rubric captures tendencies such as a student struggling to understand simple conversations; this behavior classifies the student as a beginning listener. Additionally, TEA (2008) explains that a student who consistently speaks mainly in short phrases and is unable to write or reflect personal responses is also a beginner in the areas of speaking and writing. On the other hand, a student who is able to understand elaborate conversations, speaks with extended discussion, and writes elaborately is considered advanced high in all categories: listening, speaking, and writing (pp. 38-42). Therefore, the PLDs serve as research-based indicators for matching students with the appropriate listening, speaking, and writing proficiency levels.

Because extensive research, including the input of language specialists, went into the creation of TELPAS, the Technical Digest (2008) considers it a highly reliable testing instrument (p. 167). The Technical Digest supports this finding with Kuder-Richardson internal consistency statistics, which place TELPAS reliability, on a scale where 1.0 is perfectly reliable, in the high .80s to low .90s (p. 167). Assessments that fall in the 0.80-0.90 range are considered better than those at 0.95 or higher, because a test that is too reliable may show signs of item redundancy (pp. 167-168). Additionally, because TELPAS uses a series of tests, the amount of bias involved is reduced. It is difficult to rid a test of all bias when observational components are involved and human error remains a possibility. Nevertheless, the Technical Digest (2008) strongly suggests that the key to minimizing test bias is effective, continuously evolving research that focuses on internal consistency estimates as well as sound sampling studies (pp. 125-129).
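For reference, the Kuder-Richardson Formula 20 (KR-20) that underlies internal consistency estimates of this kind can be written as follows; this is the standard statistical formula, not a reproduction from the Technical Digest:

r_{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^{2}}\right)

where k is the number of test items, p_i is the proportion of examinees answering item i correctly, q_i = 1 - p_i, and \sigma_X^{2} is the variance of the total test scores. The coefficient approaches 1.0 as the items measure the underlying trait more consistently, which is why values in the high .80s to low .90s are read as strong reliability.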

As far as test validity is concerned, the Technical Digest (2008) cites evidence that extensive research went into creating the test, including counsel from language experts, psychometricians, and observational assessment specialists (p. 134). Furthermore, the Technical Digest (2008) argues that valid assessments connect to the knowledge of a specific content area, such as the Texas Essential Knowledge and Skills (TEKS), by measuring student results that show a correlation between the students and the test data, thus revealing a clear and consistent understanding of the content or subject matter (p. 177). Moreover, TELPAS appears appropriate for the examinee because, according to the Technical Digest (2008), the test covers skills that teachers address from day to day in the classroom.

According to TEA (2008), the TELPAS exam must be administered only by certified, trained individuals (p. 7). Furthermore, TELPAS holistic raters and verifiers require training at three levels: state, district, and campus. Based on TEA (2008) rater credential information, raters must receive specialized training each year in order to perform their duties, and if a rater does not pass the exam, another rater with passing credentials can sign off on that rater's work (p. 6).

After the TELPAS is completed, counselors are responsible for submitting the following information: the Control Form, English language proficiency ratings for each student (beginning, intermediate, advanced, or advanced high) in each language domain, the marked TELPAS student roster, and the TELPAS answer documents (p. 5). Although the state does not provide feedback for the holistic testing component, students are expected to show progression each year. On the other hand, the state does provide testing feedback for the TELPAS reading comprehension component. Likewise, adequate yearly progress must be met for both assessments, both individually and district wide.

References

Technical Digest. (2008, February). Student Assessment Division – Technical Digest 2006-2007. Retrieved April 9, 2008, from http://www.tea.state.tx.us/student.assessment/resources/techdig07/index.html

Texas Education Agency. (2008). Texas Training Center. Retrieved April 28, 2008, from https://texas.pearson.desire2learn.com/

Table 1. 2007 TAKS Scale Score Performance by RPTE Proficiency Rating for Students Who Participated in Both Assessments

Grade Level   RPTE Proficiency Rating   N        Average TAKS Reading/ELA Scale Score
3             Beginning                 1,297    1976.27
3             Intermediate              5,378    2030.91
3             Advanced                  10,120   2126.23
3             Advanced High             31,606   2294.56
4             Beginning                 350      2007.81
4             Intermediate              1,886    1951.24
4             Advanced                  12,111   2066.76
4             Advanced High             17,797   2227.53
5             Beginning                 270      1971.81
5             Intermediate              846      1863.62
5             Advanced                  7,751    1979.33
5             Advanced High             20,374   2144.04
6             Beginning                 302      2025.27
6             Intermediate              601      1902.32
6             Advanced                  9,419    2071.00
6             Advanced High             11,935   2251.27

Note: The scale scores necessary for the TAKS Met Standard and Commended Performance levels are 2100 and 2400, respectively, per Technical Digest (2008).
