
In this blog post, we share our experiences of collaborating on a paper entitled ‘Unpacking Epistemic Insights of Artificial Intelligence (AI) in Science Education: A Systematic Review’, published in Science & Education. We argue that students need to develop ‘epistemic insights’ when they use AI tools, such as Generative Artificial Intelligence (GenAI) tools, to learn discipline-specific knowledge. ‘Epistemic insights’ refer to knowledge about knowledge, specifically ways of understanding how disciplines interact (Billingsley et al., 2018).

Much of the recent educational literature has focused on developing a generic set of AI literacy competencies in students (Ng et al., 2021; Su et al., 2023). In science, there are common features shared among disciplines such as chemistry, biology, physics and earth science. In this blog post, we discuss some of these general features of science and compare them with features of AI. When students use ChatGPT in different academic subjects, such as science and mathematics, they need to appreciate both the characteristics of the subject and the characteristics of GenAI. We argue that students should question how scientific knowledge is generated in GenAI tools (such as ChatGPT, Sora and Google Bard). Asking these big questions can help students become aware of the strengths and limitations of GenAI tools, and develop their agency in using these tools for making meaning of scientific knowledge.

In terms of aims and values, students need to recognise that AI and science both strive to produce high-quality intellectual outcomes. The application of AI technologies in the field of science also fosters interdisciplinary thinking. A number of human qualities are involved when students apply AI to scientific knowledge, including problem-solving, creativity, co-operation and algorithmic thinking.

‘Students should question how scientific knowledge is generated in GenAI tools … Asking these big questions can help students become aware of the strengths and limitations of GenAI tools, and develop their agency in using these tools for making meaning of scientific knowledge.’

In relation to methods, the disciplines of AI and science both involve observation and classification. AI, or more specifically GenAI, does not gather empirical scientific evidence itself, but it can serve as a tool that helps scientists analyse data more accurately through mathematical prediction and modelling. GenAI analyses the occurrence of word patterns and generates outputs based on statistical probabilities. Scientists, in turn, can use these AI tools to identify patterns in scientific data and draw scientific conclusions. For example, new machine-learning techniques have emerged to classify phases of physical systems, helping scientists identify novel materials.
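Our paper does not prescribe any particular algorithm, but for readers who want a concrete picture of what ‘classifying phases of a physical system’ can look like in practice, the short Python sketch below is a minimal, purely illustrative example. It trains an off-the-shelf classifier on synthetic, invented features (mean magnetisation and energy per site); the data and feature choices are our own assumptions, not taken from the studies we reviewed.

```python
# Illustrative sketch only: a classifier distinguishing 'ordered' vs 'disordered'
# phases of a toy physical system from two synthetic stand-in features
# (mean magnetisation and energy per site). The numbers are invented,
# not real simulation output.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Synthetic "measurements": ordered phase -> high magnetisation, low energy;
# disordered phase -> magnetisation near zero, higher energy.
ordered = np.column_stack([rng.normal(0.9, 0.05, n), rng.normal(-1.8, 0.1, n)])
disordered = np.column_stack([rng.normal(0.0, 0.15, n), rng.normal(-0.9, 0.2, n)])

X = np.vstack([ordered, disordered])
y = np.array([1] * n + [0] * n)  # 1 = ordered, 0 = disordered

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```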

According to the latest curriculum reform in science education (NRC, 2012), students need to engage in a range of practices for generating, validating and revising claims. Examples of these scientific practices include asking questions, making hypotheses and analysing data. AI shares some of these practices with science. AI can help scientists visualise data and analyse large-scale datasets using machine learning algorithms. It can also help scientists identify patterns, trends and anomalies in the data. When scientists collect empirical evidence about a patient’s symptoms, for example, they can input this evidence into AI models to project disease progression, treatment effectiveness and treatment outcomes.
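Again purely as an illustration rather than anything drawn from the reviewed studies, the sketch below shows one common way of flagging anomalies in tabular measurements with a standard machine-learning algorithm (Isolation Forest). The ‘patient readings’ are invented numbers used only to make the code runnable; a real study would use genuine measurements and domain-specific validation.

```python
# Illustrative sketch only: flagging anomalous readings in a synthetic dataset
# with an off-the-shelf anomaly detector (Isolation Forest).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Mostly typical readings (temperature, heart rate), plus a few injected outliers.
typical = rng.normal(loc=[37.0, 72.0], scale=[0.3, 5.0], size=(200, 2))
outliers = np.array([[39.8, 130.0], [35.0, 40.0], [40.5, 145.0]])
readings = np.vstack([typical, outliers])

model = IsolationForest(contamination=0.02, random_state=0).fit(readings)
flags = model.predict(readings)  # -1 marks points the model treats as anomalous
print("Flagged readings:\n", readings[flags == -1])
```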

Social-institutional dimensions refer to the political, economic and social implications of a discipline. In these dimensions, both AI designers and scientists are subject to political and economic influences. AI itself can lack sensitivity regarding equity issues related to gender and race. When asked to show images of people of different races, some GenAI tools have refused to generate images of people of a certain race. In another case, an AI writing tool trained on a male-authored dataset produced outputs that reinforced gender bias. On the science side, scientists conducting medical research funded by pharmaceutical companies might be pressured to produce results that promote the sales of pharmaceutical products.

In our latest paper (Cheung et al., 2024), we systematically reviewed the literature on the application of AI in science learning and developed a pedagogical framework to guide the teaching and learning of science with AI technologies in K-12 education. At the centre of this pedagogical framework, students are guided to identify similarities and differences between AI technologies and science, as well as applications of AI to scientific knowledge. To achieve this, we listed five aspects of the interaction between science and AI technologies: aims and values, methods, practices, knowledge, and social-institutional aspects. In the classroom, teachers can raise questions about the similarities, differences and relationships between AI technologies and science with regard to these five key aspects of epistemic insight.


Figure 1: Pedagogical framework of the application of AI in science learning

Source: Cheung et al. (2024)

As a group of early career researchers who are fascinated by the use of (Gen)AI tools in education, we are calling for more research examining the efforts of educators, researchers and policymakers to develop students’ epistemic insights into the relationship between a discipline and GenAI. Future research could explore students’ epistemic insights into the relationship between GenAI and a specific discipline through a range of methods, including think-aloud interviews and analysis of classroom discourse.


References

Billingsley, B., Nassaji, M., Fraser, S., & Lawson, F. (2018). A framework for teaching epistemic insight in schools. Research in Science Education, 48(6), 1115–1131. https://doi.org/10.1007/s11165-018-9788-6

Cheung, K. K. C., Long, Y., Liu, Q., & Chan, H. Y. (2024). Unpacking epistemic insights of artificial intelligence (AI) in science education: A systematic review. Science & Education. Advance online publication. https://doi.org/10.1007/s11191-024-00511-5

National Research Council [NRC]. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. The National Academies Press.

Ng, D. T. K., Leung, J. K. L., Chu, S. K. W., & Qiao, M. S. (2021). Conceptualizing AI literacy: An exploratory review. Computers and Education: Artificial Intelligence, 2, 100041. https://doi.org/10.1016/j.caeai.2021.100041

Su, J., Ng, D. T. K., & Chu, S. K. W. (2023). Artificial intelligence (AI) literacy in early childhood education: The challenges and opportunities. Computers and Education: Artificial Intelligence, 4, 100124. https://doi.org/10.1016/j.caeai.2023.100124