Blog post
Part of special issue: Advancing pedagogic research across disciplines: Innovations, challenges and best practice
Is the pen mightier than the keyboard? Handwritten and typed exams compared
With the increasing adoption of technology, electronic examination scripts are gradually becoming the norm and are likely to replace traditional handwritten ones in universities around the world (Chan, 2023). Teachers celebrate this change, which relieves them of the chore of deciphering illegible scripts during marking, while administrative colleagues appreciate the cost savings from reduced use and storage of paper. Students also seem to welcome the shift from handwritten examination scripts to electronic ones: over 90 per cent of the students in my faculty choose to type rather than write their examination answers in many courses. In this blog post, I report preliminary findings from a pilot study comparing handwritten and typed examination scripts.
I was fine with the change to electronic scripts, as I could do my marking in different locations without having to carry bundles of paper with me. However, while marking, I made some initial observations about the differences between the typed and handwritten scripts. At the time, I was focused on finishing my marking before the deadline, so I did not delve into the differences too deeply; I had a general feeling that the typed scripts tended to be much lengthier, but they did not seem to add much value to the answers provided. The handwritten scripts, although shorter, seemed to reflect that more thought had gone into the answers. As Aragón-Mendizábal et al. (2016) found, the differences between handwriting and typing depend on the task at hand: some tasks require a deeper level of analysis, while others involve only low-level processing, such as reciting information.
‘I had a general feeling that the typed scripts tended to be much lengthier, but they did not seem to add much value to the answers provided.’
To better understand this situation, I decided to conduct a pilot study on one of my courses to investigate whether there are any differences in the quality of typed answers compared with handwritten ones. A course on understanding financial statements and solicitors’ accounts in 2023/24 was selected for the study, as it was one of the few courses with sufficient handwritten examination scripts to allow a meaningful comparison with the typed ones. Usually, fewer than 10 per cent of students choose to sit the handwritten examination, which might lead to unrepresentative findings due to the low number of handwritten scripts. In this course, however, 43 of the 103 students (42 per cent) chose to handwrite their examination, providing a good sample size for the study.
The document analysis approach (Bowen, 2009) was adopted to explore the differences between the two script types. The preliminary findings on word count confirmed my initial observation: in the initial sample, the total number of words in typed scripts was around 30 per cent higher than in handwritten ones. When the numbers were broken down by question type, the word count for a rule-based question was around 90 per cent higher in typed scripts than in handwritten ones. Rule-based questions involve the analysis and interpretation of a particular rule, and students typing their answers might be prone to reproducing entire rules rather than summarising the main points, which might yield little or no benefit in an examination where time is limited. However, question type is only one factor that might influence the approach students adopt; the demands of a question, such as reflection, critical thinking or other higher-order thinking, can also come into play.
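For anyone wishing to replicate the comparison, the word counts themselves are straightforward to compute once the scripts are in digital form. Below is a minimal Python sketch, assuming each script has been exported or transcribed to a plain-text file; the folder names and file layout are hypothetical.

```python
# Minimal sketch of a word-count comparison between script types.
# Assumes each script is a plain-text file; folder names are hypothetical.
from pathlib import Path
from statistics import mean

def mean_word_count(folder: str) -> float:
    """Average word count across all .txt scripts in a folder."""
    counts = [len(p.read_text(encoding="utf-8").split())
              for p in Path(folder).glob("*.txt")]
    return mean(counts)

typed = mean_word_count("scripts/typed")
handwritten = mean_word_count("scripts/handwritten")

# Percentage by which typed scripts exceed handwritten ones on average
difference = (typed - handwritten) / handwritten * 100
print(f"Typed scripts are {difference:.0f}% longer on average")
```

The same loop can be rerun on subsets of the files (for example, answers to rule-based questions only) to produce the per-question-type breakdown described above.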
‘Although most students can type much faster than they can write, it would be more beneficial for them to spend extra time planning their answers (perhaps even with a pen and paper) before they start typing away at great speed.’
It has been a rewarding experience doing this pedagogic research, as I gained a better understanding of how my students tackle examination questions. Taking the time to analyse the data allowed me to conceptualise the factors that contribute to why some students performed well in certain areas while others did not. The evidence derived from the research can indicate whether a particular approach might yield better results for my students in examinations. For example, although most students can type much faster than they can write, it would be more beneficial for them to spend extra time planning their answers (perhaps even with a pen and paper) before they start typing away at great speed. As such, I believe pedagogic research can help to improve my teaching at the university and enhance the learning experience for my students.
References
Aragón-Mendizábal, E., Delgado-Casas, C., Navarro-Guzmán, J. I., Menacho-Jiménez, I., & Romero-Oliva, M. F. (2016). A comparative study of handwriting and computer typing in note-taking by university students. Comunicar: Media Education Research Journal, 24(48), 101–107. https://doi.org/10.3916/C48-2016-10
Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research Journal, 9(2), 27–40. https://doi.org/10.3316/QRJ0902027
Chan, C. K. Y. (2023). A systematic review – handwritten examinations are becoming outdated, is it time to change to typed examinations in our assessment policy? Assessment & Evaluation in Higher Education, 48(8), 1385–1401.