Málfríður - 15.03.2010, page 16

proficiency may contribute towards the high drop-out rate at the University of Iceland; (2) Phillipson's claim that Danes are not as proficient in English as the status of the language in Denmark would lead one to believe; (3) Hellekjær's findings that many Norwegian university students have difficulty reading academic English; it is clear that we need to assess how well Icelandic university students are able to read academic English, and not be lulled by Icelanders' apparent comfort and skill in talking casually.

To test the reading proficiency of Icelandic university students, reading passages were used from students' own courses. I did this first because, as Peretz and Shoham (1990) found, "EFL students prefer texts on topics that are, or appear to be, related to their field of study", which they rate to be "easier". (Peretz and Shoham go on to point out, however, that students' "hunches" about what is easy "are not always a reliable index of their performance".) In other words, I wanted students to be motivated to read texts which they would consider relevant. Second, choosing reading material from students' courses meant that the tests would allow predictions to be made about students' success in dealing with their actual English reading requirements, although of course using different tests meant that comparisons could not be made between faculties.

The effectiveness of using reading tests that draw on students' background knowledge is disputed. Carrell (1983) found that "nonnative readers show virtually no significant effects of background knowledge," as did Clapham (1990). On the other hand, Just et al. (1982) found that if "readers are already familiar with the content, then the comprehension test may be probing previous knowledge rather than comprehension." In any case, I ensured that the reading test passages were texts that the students had not yet read, so although they were in the students' chosen field, they were likely to contain some unknown information or an unconsidered perspective.

I began by carrying out a small-scale pilot study with third-year undergraduate students in an American culture course. The reading section of the test included an off-the-shelf TOEFL-like test on slavery (a topic that the students had already encountered) and a reading comprehension test that I had developed, based on an academic text (and topic) that the class had not yet considered: the relative lack of social mobility in post-Reagan America (Weir, 2007). I also administered a listening test, on which I will not report here, except to mention that its short-answer items proved to be more reliable than its multiple-choice items.

The TOEFL-like reading test scores were shown to be unreliable, with many students scoring perfectly, so these scores were disregarded. (Of course, as a summative course test, this would be the result one would hope for, but as a research instrument, students' wholesale "acing" of this test on a familiar topic was not useful.) On the other hand, the results of the reading comprehension test on the rise of "Class in America" since the 1980s, a new concept for the students, were more interesting. The average score was 69%, with about half the students scoring above 60% and about half scoring below 60%. Considering that these were third-year students taking the pilot test, I was relieved that almost everyone passed.
It was clear from the pilot that it was in fact possible to administer a reading test based on a course-related text, as long as the topic had not been widely covered, and it also became clear that short-answer questions could yield reliable test results.

Incoming students in Science and Engineering engage in a combined orientation session spanning a couple of days in September. During this time, one of the many activities they carry out is reading and discussing an English text which highlights some of the differences between science and engineering, such as the different place that theory holds within each discipline and the two fields' different concepts of what "knowledge" is. This year's text was "The Wisdom of Engineers" (McCarthy, 2009), administered to the more than 300 students who attended on day two of the orientation. I cut two theoretical sections from the original text, largely to reduce its length from 2,500 words to just over 2,300. Students were allotted 50 minutes to read the text and answer 10 questions with short answers in English or Icelandic. Answers were worth 1 or 2 points, depending on the complexity of the question, and no half points were given.

As students began finishing the test (or appearing to finish it) after only 25 minutes, the orientation administrator suddenly announced to the students that they would have five more minutes to complete it. Unfortunately, having only 30 minutes to answer 10 questions on a rather complex academic text meant that too few students (just under 20%) completed the whole test. Therefore, in discussing these students' reading proficiency, only answers to the first six questions will be analysed here, since most participants did answer these first six questions.
