Reflections on using western survey techniques in Chinese contexts
This blog post presents an example of adapting western survey instruments to investigate the impact of gender and socioeconomic status (SES) on the career choices of engineering undergraduates in China. I designed a questionnaire by drawing on, and translating items from, the Sustainability and Gender in Engineering Survey (Godwin, 2014) and the Science Capital Index (Archer et al., 2015). However, because these two instruments were designed for western contexts (the US and the UK), applying them directly to Chinese participants risks confusion from wording that is unfamiliar in their culture. So in April 2022, I piloted the integrated questionnaire with a small group of Chinese students using a ‘think-aloud’ methodology in order to adapt it to Chinese contexts. Participants were asked to voice any confusion or hesitation they felt while taking the survey. ‘Think-aloud’ interviews can establish the cognitive validity of an instrument (Trenor et al., 2011).
As Virzi (1992) has suggested, four or five participants can uncover up to 80 per cent of the potential problems with an instrument such as a survey; I therefore recruited five participants. All are Chinese engineering undergraduates, male and female, from different SES backgrounds and in various years of study at Qingdao University and China University of Petroleum. Two rounds of think-aloud interviews were conducted. Three participants took part in the first round, and the questionnaire was refined based on their feedback. To check that these revisions had remedied the problems identified, the other two students took part in the second round of interviews, completing the revised questionnaire. Drawing on the analytical models developed by Willis et al. (1991), I categorised the problems participants encountered when interacting with the questionnaire as ‘structural’ or ‘cognitive’ issues, acknowledging that some issues touch upon elements of both. Structural issues include grammatical errors and problems in the design of the survey structure. Cognitive issues refer to problems that may cause misunderstanding, especially those arising from inappropriate translation or differences in context.
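One common way to see why a handful of participants goes such a long way is the problem-discovery model used in usability research, in which the expected proportion of problems found by n participants is 1 − (1 − p)^n, where p is the probability that a single participant uncovers a given problem. The short sketch below illustrates this with an assumed value of p; the figure is chosen for illustration only and is not taken from Virzi (1992) or from my pilot.

```python
# Illustrative sketch of the problem-discovery model 1 - (1 - p)^n often used
# in usability research. The detection probability p = 0.3 is an assumption
# chosen for illustration, not a figure from Virzi (1992) or from this pilot.
p = 0.3

for n in range(1, 6):
    expected_found = 1 - (1 - p) ** n
    print(f"{n} participant(s): ~{expected_found:.0%} of problems expected to surface")

# With p = 0.3, four participants surface roughly 76% of problems and five
# roughly 83%, consistent with the 'up to 80 per cent' observation.
```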
An example of a structural issue detected
All three participants in the first round of interviews regarded one question, which asked them to rank the various methods they used to deal with academic problems in engineering, as a waste of time. On the assumption that this might discourage some survey participants from finishing the survey, I deleted the question.
An example of a cognitive issue detected
Using ‘engineering’ (工科), an umbrella term for the study of applied techniques, throughout the survey confused three students, as they come from different sub-disciplines within engineering, such as bioengineering and civil engineering. In line with Chinese grammar and usage, I replaced ‘engineering’ (工科) with ‘your subject/major’ (专业).
By identifying structural and cognitive issues from participant feedback, I was able to optimise question types and use contextually relevant language when translating western survey tools into Chinese. Based on my experience, I would like to offer three suggestions for researchers interested in adapting a survey tool to a different context:
- Recruit four or five participants and divide them into two groups to take part in separate rounds of think-aloud interviews. This strategy lets the second group check that the amendments made after the first round are appropriate.
- During the interview, pay close attention to every small pause and frown from your participants as they complete the questionnaire; these unspoken cues help reveal whether the survey is fit for purpose.
- Consider the structural and cognitive problems of the translated instrument when working through your participants’ feedback. This is the most important step in identifying issues in the survey tool that might otherwise cause confusion across cultures; a minimal sketch of how such feedback might be recorded follows this list.
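The sketch below shows one hypothetical way to record and group think-aloud feedback by issue type. The record fields, participant labels and the two example entries (drawn from the issues described above) are illustrative only, not the coding scheme used in the actual study.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical record of one piece of think-aloud feedback; field names and
# categories are illustrative, not taken from the original study's coding sheet.
@dataclass
class FeedbackItem:
    participant: str        # e.g. "P1"
    survey_item: str        # the question or wording the comment refers to
    comment: str            # what the participant said or showed
    category: str           # "structural" or "cognitive"
    proposed_revision: str  # how the questionnaire was (or could be) changed

feedback = [
    FeedbackItem("P1", "Ranking of methods for handling academic problems",
                 "Ranking every option feels like a waste of time",
                 "structural", "Delete the ranking question"),
    FeedbackItem("P2", "Items referring to 工科 (engineering)",
                 "Unsure whether 工科 covers my sub-discipline",
                 "cognitive", "Replace 工科 with 专业 (your subject/major)"),
]

# Group feedback by category so structural and cognitive issues can be
# reviewed, and revised, separately after each interview round.
by_category = defaultdict(list)
for item in feedback:
    by_category[item.category].append(item)

for category, items in by_category.items():
    print(f"{category}: {len(items)} issue(s)")
    for item in items:
        print(f"  {item.survey_item} -> {item.proposed_revision}")
```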
References
Archer, L., Dawson, E., DeWitt, J., Seakins, A., & Wong, B. (2015). ‘Science capital’: A conceptual, methodological, and empirical argument for extending Bourdieusian notions of capital beyond the arts. Journal of Research in Science Teaching, 52(7), 922–948. https://doi.org/10.1002/tea.21227
Godwin, A. F. (2014). Understanding female engineering enrolment: Explaining choice with critical engineering agency [Doctoral dissertation, Clemson University]. https://tigerprints.clemson.edu/all_dissertations/1787
Trenor, J. M., Miller, M. K., & Gipson, K. G. (2011). Utilization of a think-aloud protocol to cognitively validate a survey instrument identifying social capital resources of engineering undergraduates. Paper presented at 2011 American Society for Engineering Education Annual Conference & Exposition, Vancouver, BC. https://peer.asee.org/18492
Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34(4), 457–468. https://doi.org/10.1177/001872089203400407
Willis, G. B., Royston, P., & Bercini, D. (1991). The use of verbal report methods in the development and testing of survey questions. Applied Cognitive Psychology, 5, 251–267. https://doi.org/10.1002/acp.2350050307