June 2011
THE VALIDITY OF NONNATIVE SPEAKER INPUT IN LISTENING COMPREHENSION TESTS
Priyanvada Abeywickrama, Assistant Professor, English (MA TESOL Program), San Francisco State University, San Francisco, CA, USA, abeywick@sfsu.edu

English is currently spoken by more nonnative speakers than native speakers (Crystal, 2004). Therefore, in many target language use (TLU) domains, English use is not limited to native speakers. In higher education in the United States, students encounter varieties of English spoken by international students, international teaching assistants (TAs), and faculty (Gorsuch, 2003; Major, Fitzmaurice, Bunta, & Balasubramanium, 2005). The existence of these different varieties raises questions about the usefulness (Bachman & Palmer, 1996) of listening comprehension (LC) tests whose input is limited to a native speaker variety: when test tasks do not correspond to TLU tasks in this respect, the generalizability of the test and the validity of inferences made about test takers’ listening ability are limited.

LITERATURE REVIEW

Major, Fitzmaurice, Bunta, and Balasubramanium (2002) pointed out that the relationship between accent and performance on LC tests is unclear. Nonnative varieties have been identified as the factor contributing most to comprehension difficulty (Goh, 1999). Flowerdew (1994, p. 24) provided a summary of studies, some of which demonstrated an advantage for listeners who were familiar with the speaker’s accent, while others showed that accent familiarity provides no advantage in comprehensibility.

While research has examined the effect of question preview (Sherman, 1997) and the influence of reading (Friedman & Ansley, 1990) on LC tests, the effect of accent (Ortmeyer & Boyle, 1985; Wilcox, 1978) has received limited attention. Most recently, Harding (2008) found no significant difference in test takers’ LC performance when accented speech was used.

Another factor that complicates the use of nonnative varieties in listening comprehension tests is listeners’ attitudes toward accented speech (Lippi-Green, 1997). Nonnative speakers are often categorized as learners, or as uneducated or deficient speakers, and are stereotyped solely on the basis of their accents (Brennan & Brennan, 1981; Cargile, 1997). Harding (2008), however, found that listeners’ views on accented speech differed according to the purpose of the listening activity.

THE STUDY

The following research questions guided this study:

  1. Does the use of nonnative varieties of English reduce comprehensibility and affect test performance?
  2. Is there any interaction between test takers’ native language and the variety of English used in test input?
  3. What are test takers’ perceptions and attitudes toward nonnative varieties of speech in LC tests?

Data for this study were obtained from 110 test takers in Korea, Brazil, and Sri Lanka. The Korean and Brazilian participants represented English as a foreign language (EFL) situations, and the Sri Lankan participants represented an English as a second language (ESL) situation typical of students entering U.S. universities. The students were taking courses in various disciplines and were simultaneously enrolled in English classes at universities in their respective countries. Based on their performance on placement tests and in English classes, the Korean (N = 36) and Brazilian (N = 33) test takers were identified as having high-intermediate English proficiency, and the Sri Lankan test takers (N = 38) as having low-intermediate proficiency.

The LC test consisted of eight listening texts, each followed by three or four comprehension questions, for a total of 31 multiple-choice items. The listening passages were retired passages from the TOEFL Institutional Testing Program, with reliability estimates of 0.9 (TOEFL Test and Score Manual).
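
As a purely illustrative aside, reliability figures of this kind for dichotomously scored multiple-choice items are typically internal-consistency estimates such as KR-20. The short Python sketch below shows the standard KR-20 calculation on an invented response matrix; it is not the study’s data and not how the manual’s figure was produced.

    import numpy as np

    # Hypothetical 0/1 response matrix: rows = test takers, columns = items.
    responses = np.array([
        [1, 1, 0, 1, 1],
        [1, 0, 1, 1, 0],
        [0, 1, 1, 0, 1],
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 1],
    ])

    k = responses.shape[1]                         # number of items
    p = responses.mean(axis=0)                     # proportion answering each item correctly
    q = 1 - p
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of test takers' total scores
    kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total_var)
    print(round(kr20, 2))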

The speakers were TAs at leading universities in the United States: two native speakers of Chinese, one Korean speaker, one Sri Lankan speaker, and four native speakers of American English. The nonnative-English-speaking TAs had a Test of Spoken English (TSE) score of 50 or higher and had passed the Speaking Proficiency English Assessment Kit (SPEAK) test at their respective universities.

Test takers were assigned to one of two groups according to their availability to take the test. Both groups heard Text 1, which was about American history and was read by an American TA; this passage served to familiarize students with the test. For the remaining passages, the speakers were counterbalanced across the groups: for example, Group 1 listened to Text 3 read by a Chinese speaker, while Group 2 listened to the same text read by a Korean speaker.

Other instruments included a background questionnaire, administered at the beginning of the test to gather demographic information. After each listening passage, participants evaluated the speaker’s comprehensibility on a 7-point Likert scale ranging from “strongly agree” to “strongly disagree.” After completing all eight passages, test takers responded to a final perception questionnaire designed to elicit their attitudes toward the use of nonnative speaker input in LC testing.

RESULTS

To answer research question 1, a one-way analysis of variance (ANOVA) was conducted with test takers’ scores as the dependent variable and group as the independent variable. The difference in mean performance between the two groups was not significant, F(1, 0.565) = .452, p = .095, indicating that the use of nonnative speaker input did not make a difference in the participants’ performance.
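
For readers unfamiliar with the procedure, a one-way ANOVA of this kind can be run with standard statistical software. The short Python sketch below illustrates the general approach; the score lists are invented placeholders, not the study’s data.

    from scipy import stats

    # Hypothetical total scores (out of 31 items) for the two counterbalanced groups.
    group1_scores = [24, 27, 22, 25, 26, 23, 28, 21]
    group2_scores = [25, 23, 26, 24, 22, 27, 20, 26]

    # One-way ANOVA with group as the single factor.
    f_stat, p_value = stats.f_oneway(group1_scores, group2_scores)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
    # A p value above .05 would mirror the study's finding of no significant
    # difference between the groups.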

Research question 2 asked whether test takers would perform better when the speaker shared their first language (L1). A two-way ANOVA was conducted with test takers’ scores as the dependent variable and the speaker’s L1 and the test takers’ L1 as the independent variables. The interaction between the speaker’s L1 and the test takers’ L1 was not significant, F(6, 0.103) = 0.996, p = .095, suggesting that, for example, Sri Lankan students did not perform significantly better than expected when the speaker was from Sri Lanka.
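
Again purely as an illustration, an interaction of this kind can be tested with a two-way ANOVA. The sketch below uses the statsmodels library with an invented data set and a simplified design (speaker variety collapsed to native vs. nonnative); the column names and scores are hypothetical, not the study’s.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical scores: two observations per cell of a
    # (speaker variety) x (test-taker L1) design.
    data = pd.DataFrame({
        "score": [24, 26, 23, 25, 22, 27, 25, 24, 23, 26, 24, 22],
        "speaker": ["native"] * 6 + ["nonnative"] * 6,
        "taker_l1": ["Korean", "Korean", "Portuguese", "Portuguese",
                     "Sinhala", "Sinhala"] * 2,
    })

    # Two-way ANOVA; the C(speaker):C(taker_l1) row tests the interaction.
    model = ols("score ~ C(speaker) * C(taker_l1)", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))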

For research question 3, responses to the comprehensibility questionnaire did not provide a clear picture of perceptions of nonnative speech: test takers were not consistent in their judgments and were often unable to identify the cause of comprehension difficulty. The attitude questionnaire, however, revealed that more than 62 percent of test takers preferred a native variety. Although nearly half of the test takers (46 percent) responded favorably to nonnative varieties because they also hear such varieties in real life, when asked whether using nonnative varieties in listening tests made a difference in test performance, 48 percent were certain that performance is better with native English input. When asked whether the type of speech input might cause test bias, almost 60 percent thought that Japanese test takers would understand Japanese-accented English better than test takers from other countries would. Finally, when asked if they would prefer a native variety on a listening test even if a nonnative variety were easy to understand, test takers overwhelmingly responded Yes (65 percent).

DISCUSSION

Test performance data in this study suggest that nonnative varieties of English can be used as listening test input, whereas the attitude questionnaire results point in the opposite direction. Although some test takers acknowledged the presence of nonnative or accented speech in their TLU domains, the great majority did not have a favorable attitude toward nonnative speech in tests. These findings are consistent with most of the previous research in this area.

Even though the test performance findings suggest the possibility of using nonnative varieties of English as test input, further research is necessary because of a number of limitations. First, the study was limited to three nonnative varieties of English. Second, although the items accompanying each listening text had nearly the same average difficulty (item difficulty values of 0.54 to 0.57), it was not possible to isolate the speaker input (nonnative or accented English) as the only within-subjects factor. A third limitation was the test takers’ level of language proficiency: the three groups in this study were fairly similar, but test takers at different proficiency levels may perform differently.
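
The item difficulty values mentioned above are simply the proportion of test takers who answered each item correctly. The sketch below shows this calculation on an invented response matrix; it is illustrative only and does not reproduce the study’s item analysis.

    import numpy as np

    # Hypothetical 0/1 responses: rows = test takers, columns = items.
    responses = np.array([
        [1, 0, 1, 1],
        [1, 1, 0, 1],
        [0, 1, 1, 0],
        [1, 1, 1, 1],
    ])

    item_difficulty = responses.mean(axis=0)   # proportion correct for each item
    print(item_difficulty)                     # per-item difficulty values
    print(item_difficulty.mean())              # average difficulty across items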

Until research shows that the use of nonnative English input does not introduce construct-irrelevant variance into the measurement of listening ability, we may be unable to use such input in testing, especially in high-stakes tests such as the TOEFL and IELTS. In order to address some of the issues of test usefulness, and of validity and authenticity in particular, we could begin by using nonnative varieties of English or accented English in low-stakes tests such as classroom placement, diagnostic, or achievement tests. Furthermore, introducing nonnative varieties of English in the teaching of listening and speaking skills would expose learners to a wider range of linguistic models and may help move learners away from the attraction of standard varieties. As nonnative speech gains recognition among language users and test takers, attitudes and perceptions toward the use of these varieties of English may also change.

REFERENCES

Bachman, L., & Palmer, A. (1996). Language testing in practice. Oxford, England: Oxford University Press.

Brennan, E., & Brennan, J. S. (1981). Measurements of accent and attitude toward Mexican-American speech. Journal of Psycholinguistic Research, 10, 487-501.

Cargile, A. C. (1997). Attitudes toward Chinese-accented speech: An investigation in two contexts. Journal of Language and Social Psychology, 16, 434-443.

Crystal, D. (2004). The language revolution. Cambridge, England: Polity.

Flowerdew, J. (Ed.). (1994). Academic listening: Research perspectives. New York: Cambridge University Press.

Friedman, S. J., & Ansley, T. N. (1990). The influence of reading on listening test scores. The Journal of Experimental Education, 58(4), 301-310.

Goh, C. (1999). How much do learners know about the factors that influence their listening comprehension? Hong Kong Journal of Applied Linguistics, 4(1), 17-41.

Gorsuch, G. (2003). The educational cultures of international teaching assistants and U.S. universities. TESL-EJ, 7(3), A-1.

Harding, L. (2008). Accent and academic listening assessment: A study of test-taker perceptions. Melbourne Papers in Language Testing, 13(1), 1-33.

Lippi-Green, R. (1997). English with an accent. London: Routledge.

Major, R. C., Fitzmaurice, S. F., Bunta, F., & Balasubramanium, C. (2002). The effects of nonnative accents on listening comprehension: Implications for assessment. TESOL Quarterly, 36(2), 173-190.

Major, R. C., Fitzmaurice, S. F., Bunta, F., & Balasubramanium, C. (2005). Testing the effects of regional, ethnic, and international dialects of English on listening comprehension. Language Learning, 55(1), 37-69.

Ortmeyer, C., & Boyle, J. P. (1985). The effect of accent differences on comprehension. RELC Journal, 16(2), 48-53.

Sherman, J. (1997). The effect of question preview in listening comprehension tests. Language Testing, 14(2), 185-213.

Wilcox, G. K. (1978). The effect of accent on listening comprehension: A Singapore study. English Language Teaching Journal, 25, 239-248.


A native of Sri Lanka, Priyanvada Abeywickrama teaches courses in TESOL methodology and curriculum and assessment in the MA TESOL Program at San Francisco State University. Her research in language assessment specifically examines issues of validity.
