TY - JOUR
T1 - Linguistic dimensions of impromptu test essays compared with successful student disciplinary writing
T2 - Effects of language background, topic, and L2 proficiency
AU - Weigle, Sara Cushing
AU - Friginal, Eric
N1 - Funding Information:
NNS data for this paper were collected for a project sponsored by the TOEFL Committee of Examiners and the TOEFL program at Educational Testing Service entitled “Validation of Automated Scoring of TOEFL iBT Tasks Against Non-Test Indicators of Writing Ability.” Funding for NS data collection was provided by the Cleon C. Arrington Research Initiation Grant Program at Georgia State University. One prompt asks test takers to agree or disagree with the proposition that people focus too much on personal appearance (hereinafter: the Appearance topic), while the other prompt asks test takers to discuss the importance of planning for the future (hereinafter: the Future topic). Participants were given 30 min to respond to each topic on a computer; topics were counterbalanced so that half the participants responded to one topic first and the other half responded to the other topic first. Essays from 386 non-native speakers were collected by the first author as part of a larger study looking at automated scoring of TOEFL writing (Weigle, 2010, 2011). The participants in Weigle's study were approximately equally divided between graduate and undergraduate students, along with a small number of pre-matriculated students enrolled in an English language institute. The essays from 150 NS undergraduate students from a large public research university were collected by the second author for the purpose of making comparisons between native and non-native speaker writing on the same topics.
Publisher Copyright:
© 2015.
PY - 2015/6/1
Y1 - 2015/6/1
N2 - One important validity question with regard to writing assessment is the degree to which performance on a timed writing test can predict performance on future academic writing. Recent developments in corpus linguistics have allowed scholars to describe in detail the linguistic features of a variety of academic texts, including genres of disciplinary writing and writing on essay tests, which can aid in answering this question. The purpose of this paper is to compare the linguistic features of test essays written by native and non-native speakers with a comparison corpus of successful student writing across a range of disciplines using Biber's (1988) multidimensional analysis framework. Essays written on two different test prompts were analyzed along dimensions of successful student writing revealed by an analysis of the Michigan Corpus of Upper-level Student Writing (MICUSP) conducted by Hardy and Römer (2013). Results demonstrated that test essays differed in significant ways from disciplinary writing, particularly in the natural and health sciences. Furthermore, language background (native vs. non-native), prompt, and language proficiency (i.e., essay scores) were systematically related to scores on all four dimensions. Implications for pedagogy and language assessment are discussed.
AB - One important validity question with regard to writing assessment is the degree to which performance on a timed writing test can predict performance on future academic writing. Recent developments in corpus linguistics have allowed scholars to describe in detail the linguistic features of a variety of academic texts, including genres of disciplinary writing and writing on essay tests, which can aid in answering this question. The purpose of this paper is to compare the linguistic features of test essays written by native and non-native speakers with a comparison corpus of successful student writing across a range of disciplines using Biber's (1988) multidimensional analysis framework. Essays written on two different test prompts were analyzed along dimensions of successful student writing revealed by an analysis of the Michigan Corpus of Upper-level Student Writing (MICUSP) conducted by Hardy and Römer (2013). Results demonstrated that test essays differed in significant ways from disciplinary writing, particularly in the natural and health sciences. Furthermore, language background (native vs. non-native), prompt, and language proficiency (i.e., essay scores) were systematically related to scores on all four dimensions. Implications for pedagogy and language assessment are discussed.
KW - Multi-dimensional analysis
KW - Validation
KW - Writing assessment
UR - http://www.scopus.com/inward/record.url?scp=84946157683&partnerID=8YFLogxK
U2 - 10.1016/j.jeap.2015.03.006
DO - 10.1016/j.jeap.2015.03.006
M3 - Journal article
AN - SCOPUS:84946157683
SN - 1475-1585
VL - 18
SP - 25
EP - 39
JO - Journal of English for Academic Purposes
JF - Journal of English for Academic Purposes
ER -