A comparative study of the paper-based versus computer-based IELTS Academic Writing test among EFL IELTS candidates at the University of Tehran

Mehdi Dastpak, Mohammad Javad Riasati, Mohammad Sadegh Bagheri, Ehsan Hadipour

Abstract


The present study investigates whether students performed differently on the paper-based and computer-based versions of the International English Language Testing System (IELTS) writing test in terms of task achievement/response, coherence and cohesion, lexical resource, and grammatical range and accuracy. It also explores whether candidates' computer familiarity differed between the paper-based and computer-based groups. For this purpose, 108 candidates were selected from a pool of 144 at the University of Tehran, Iran, on the basis of their Oxford Placement Test (OPT) results. To collect the data, a retired IELTS Academic Writing sample and a computer-familiarity questionnaire were administered. The participants were divided into two equal groups. In the Paper Mode (PM) group, students took the test in the conventional paper format. In the Computer Mode (CM) group, students received the same test but were asked to complete it on computers provided in their classroom. In addition, all participants completed the computer-familiarity questionnaire. The data were analyzed using independent-samples t-tests. The findings revealed significant differences between the paper-based and computer-based modes on both writing tasks. Furthermore, the analysis of the questionnaire showed that candidates' computer familiarity affected their writing performance.
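As a minimal illustration only, the sketch below shows how an independent-samples t-test of the kind mentioned above could be computed in Python for two groups' writing band scores; the scores and variable names are hypothetical placeholders and are not drawn from the study's data.

    # Hypothetical sketch: independent-samples t-test comparing two groups' writing scores.
    # The score lists below are invented placeholders, not the study's actual data.
    from scipy import stats

    paper_mode_scores = [6.0, 6.5, 5.5, 7.0, 6.0, 6.5]      # hypothetical PM group band scores
    computer_mode_scores = [5.5, 6.0, 5.0, 6.5, 5.5, 6.0]   # hypothetical CM group band scores

    t_stat, p_value = stats.ttest_ind(paper_mode_scores, computer_mode_scores)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")  # p < .05 would suggest a significant mode difference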

Keywords


high-stakes writing test; paper-based mode; computer-based mode; computer familiarity

Full text:

PDF (English)





DOI: http://dx.doi.org/10.15645/Alabe2021.24.6