      Relationships between adolescent students’ reading skills, historical content knowledge and historical reasoning ability


            Abstract

            The ability to apply various reading skills is an important prerequisite to comprehend expository texts commonly found in history textbooks, but it is unclear which specific skills contribute to students’ historical content knowledge and historical reasoning abilities. This study used a digital learning environment (DLE) to measure and support lower secondary students’ subject-specific reading skills, and explored the relationships with students’ historical content knowledge and historical reasoning ability. Results showed that subject-specific reading skills, such as explaining historical events, correlated significantly with both historical content knowledge and historical reasoning ability, but not all skills were significant predictors. These findings indicate that to promote the advanced practice of historical reasoning, history teachers should pay attention to students’ reading comprehension skills.

            Main article text

            Introduction

            Many authors concerned with history education have advocated for a specific disciplinary approach to literacy instruction in history classrooms, emphasising the importance of practices such as ‘reading like a historian’ and promoting historical thinking and reasoning (for example, perspective taking, contextualisation or sourcing information) for students’ comprehension of historical texts (Moje, 2015; Monte-Sano, 2011; Reisman, 2012; Shanahan and Shanahan, 2008; Wineburg and Reisman, 2015). According to Wineburg and Reisman (2015: 636), students who are only able to implement basic reading comprehension strategies ‘will remain spectators, passively gazing at the arena of knowledge production’. Although we agree that a disciplinary literacy approach offers a valuable framework for sophisticated reading and understanding of historical texts, we also stress the importance of combining this approach with generic reading skills for reading and comprehending the expository format often found in regular history textbooks that are used in secondary education. In Dutch secondary education, expository texts differ from the narrative texts that are more common in primary education. Reading these expository texts can be challenging for adolescent students who recently transferred from primary to secondary education (Fry and Gosky, 2007), or for those who struggle with this type of text (Faggella-Luby et al., 2012; Okkinga et al., 2018), resulting in lower performance scores and limited reading motivation. This study aims to uncover how students can benefit from support with regard to reading skills in their history lessons.

            A literacy-oriented approach to history education

            In the lower secondary education history curriculum of many countries, the transfer of information relies heavily on the expository format commonly found in textbooks. More specifically, in the Dutch history curriculum, human history – from prehistoric times until today’s information age – is covered in three years. Therefore, Dutch history textbooks often include fact-dense texts that contain an abundance of novel concepts, perspectives and vocabulary (Afflerbach and VanSledright, 2001; Mastropieri et al., 2003; Ramsay et al., 2010). Students who recently transferred from primary to secondary education will need to adapt their reading process accordingly and continue their development towards subject-specific reading proficiency for the subject of history (Alexander, 2003; VanSledright, 2004).

            Since the 1980s, it has been widely agreed that students’ comprehension or knowledge of a subject can be fostered through reading strategy instruction. Strategies such as activating prior knowledge, identifying main ideas and reflecting on the reading process have been found to be effective for students’ academic performance in general (National Reading Panel, 2000; Okkinga et al., 2018; Palincsar and Brown, 1984), as well as for the subject of history (McKeown et al., 2009; Ramsay et al., 2010; Vaughn et al., 2013). However, despite the effectiveness of reading strategy instruction, researchers have adopted a more critical stance towards the instruction of reading strategies as a goal in itself, advocating instead for embedding reading strategy instruction in the curriculum of a specific discipline (McKeown et al., 2009; Moje, 2015). By doing so, it is assumed that reading texts will be more purposeful and effective.

            The ability to apply relevant strategies when involved in reading texts for a specific school subject is commonly known as content-area literacy or disciplinary literacy. The concept of disciplinary literacy follows the assumption that students need to apply different reading strategies and heuristics for different text subjects (Goldman et al., 2016; Moje, 2008, 2015; Shanahan and Shanahan, 2008). For the subject of history, disciplinary literacy practices are often based on how historians (Wineburg, 1991, 1998) or expert readers (Shanahan et al., 2011) read and interpret historical texts. These texts often consist of primary or secondary source material, which enables students to apply expert reading practices such as sourcing (Where does this information come from?), contextualisation (What were the characteristics of the time and society in which this was written?) and corroborating (Is there similar or contradictory information available in other sources on this topic?) – which are advanced skills that students need to develop throughout their academic career (Shanahan and Shanahan, 2008; Wineburg, 1991).

            This focus on expert historians’ reading processes led to the development of several research programmes that focused on disciplinary literacy practices and their effectiveness for students’ knowledge and comprehension of historical texts (Girard and McArthur Harris, 2012; Hynd et al., 2004; Learned, 2018; Monte-Sano, 2011; Monte-Sano et al., 2014; Nokes et al., 2007; Reisman, 2012; Wineburg and Reisman, 2015). For example, the text-based method called Questioning the Author, in which students had to identify and critically evaluate the author’s background and stance, proved effective for students’ comprehension and the self-monitoring thereof (McKeown et al., 2009). Although the disciplinary approach often has been shown to be beneficial, some researchers dispute its value, emphasising the importance of generic strategy instruction for adolescents who struggle with reading (Faggella-Luby et al., 2012). Other research points to the possible barriers for students in (lower) secondary history classes, such as students’ lack of background knowledge, lack of experience in using heuristics to reason critically about historical texts, or their susceptibility towards presentism – which occurs when people use their own contemporary frame of reference to explain events from the past (Duhaylongsod et al., 2015; Nokes, 2011; Perfetti et al., 1995; Wineburg, 2001).

            Even though learning from text is a complex process, there seems to be a bidirectional relationship between reading strategy instruction and disciplinary literacy. Mastering generic reading strategies, or ‘knowing how to read’, is an important prerequisite for the application of disciplinary literacy practices. Conversely, reading history texts provides students with the opportunity to develop their knowledge of historical facts, concepts and chronology, as well as their critical reading skills, such as determining an author’s perspective. This implies that ‘getting good at history reading may significantly contribute to students’ general ability to read critically’ (Afflerbach and VanSledright, 2001: 697), a skill deemed important in the age of endless resources, information overload and fake news (Wineburg, 2018).

            Researchers recently called for a more literacy-oriented approach towards history education (Wissinger et al., 2018). Following this approach, students develop interpretations of the past while embedding historical thinking, reading and writing: ‘The central components of these processes involve understanding the historical context (or background), questioning and weighing evidence found in primary sources, constructing interpretations of the past based on analysis of evidence, and conveying interpretations into written historical argument’ (Wissinger et al., 2020: 3). This implies that history education should not only focus on the (textual) transmission of knowledge, but also on the process of historical reasoning and its underlying competencies, of which historical content knowledge is one.

            Historical reasoning and historical content knowledge

            Historical reasoning, often also referred to as historical thinking, refers to the complex learning process in which students engage when confronted with historical texts or sources. Although it is widely studied, especially in the Netherlands, there is no homogeneous conception of what students’ historical reasoning competency entails, nor of how it should be operationalised in classrooms (Luís and Rapanta, 2020), even though Van Boxtel and Van Drie (2018) offer some suggestions in their work. The concept of historical reasoning is closely related to disciplinary literacy, since expert historians often use heuristics to reason historically while reading texts (Nokes et al., 2007; Wineburg, 1991). According to Van Boxtel and Van Drie (2018; see also Van Drie and Van Boxtel, 2008), historical reasoning practices enhance students’ knowledge and understanding of historical events, ideas and developments. The core of their well-known historical reasoning framework (see Figure 1) consists of three elements that students can reason historically about: (1) continuity and change; (2) causes and consequences; and (3) similarities and differences. In addition, their framework defines six components of historical reasoning, which can be translated into concrete learning activities: (1) historical contextualisation; (2) using historical concepts; (3) using metahistorical concepts; (4) using historical sources; (5) providing (counter)arguments; and (6) asking historical questions.

            Figure 1.

            Types and components of historical reasoning and individual and sociocultural resources for historical reasoning (Source: Van Boxtel and Van Drie, 2018; reproduced with permission of the authors)

            Although this historical reasoning framework neither distinguishes difficulty levels of the six components nor suggests a certain order in which they should be developed, some components are, in our view, more suitable for lower secondary education practice. Lee and Ashby (2000) investigated the development and changes in students’ ideas about the past, focusing on students between 7 and 14 years old. Their results show that the developmental progress of students’ historical reasoning ability is rather complex. Even though older students were well able to reason historically about concepts such as ‘evidence’, which relates to the practice of argumentation and critical analysis in the historical reasoning framework of Van Boxtel and Van Drie (2018), there were major differences between and within different age groups. Lee and Ashby (2000: 216) argue that the history curriculum should focus more on students’ reasoning ability, and thereby improve students’ intellectual toolkit, but that this entails ‘a complex of multitrack understandings’; that is, students need to learn how to make claims about the past, but also how to substantiate or overturn these claims. To do so, they need a knowledge base of historical facts, concepts and chronology.

            While it is evident that historical reasoning and the reading of historical texts are closely related (Van Boxtel and Van Drie, 2018), the question remains as to how we can support students’ historical reasoning ability through the reading of expository history texts in lower secondary education.

            Although there is an ongoing debate about what history education should entail, and whether teachers’ instruction should focus more on substantive historical knowledge or on the ability to reason historically, it is important to note that these two concepts cannot completely be separated in practice (Gestsdóttir et al., 2018; Lee and Ashby, 2000). Reading historical texts involves the integration of prior knowledge with new information found in these texts (Wissinger et al., 2020). Students will need a sufficient prior knowledge base to be able to apply higher-order thinking skills, such as reasoning historically about causes and effects (Kirschner et al., 2006; Van Boxtel and Van Drie, 2018). This knowledge base includes both first-order knowledge (for example, facts, dates and figures) and second-order knowledge, which is needed to construct a narrative of the past (for example, cause, evidence or change; VanSledright and Limón, 2006). Subsequent dimensions of historical expertise include strategy knowledge, epistemological beliefs and situational interest (Stoel et al., 2015; VanSledright and Limón, 2006), but these are beyond the scope of the current study. However, ‘in most conceptualizations of historical reasoning … the role of first-order knowledge is barely explicated’ (Van Boxtel and Van Drie, 2018: 156). Therefore, the current study focuses on both students’ historical content knowledge and their historical reasoning ability, investigating their relationship with students’ subject-specific reading skills.

            Reading texts in a digital learning environment

            Technology-enhanced learning environments are increasingly used to support students’ reading and learning processes. Meta-analyses and other studies have shown that the use of digital technology has a positive effect on students’ reading performance in both primary and secondary education (Cheung and Slavin, 2012; Lan et al., 2014; Lynch et al., 2000; Moran et al., 2008), including research focusing on history education (for a literature review, see O’Neill and Weiler, 2006 on cognitive tools, and Poitras et al., 2012 on metacognitive tools). Digital text formats, which, for example, enable the use of hyperlinks, enhance students’ autonomy and provide individual flexibility and support. Devolder et al. (2012) conclude from their systematic review on scaffolding in computer-based learning environments that digital hints appear to be effective scaffolds, especially as support to stimulate the use of learning strategies. For example, hints can improve students’ effort regulation by suggesting what actions to perform when confronted with difficulties while reading texts.

            Digital environments provide researchers with the possibility to mine and translate data to detect, analyse and foster students’ learning processes – an approach commonly known as learning analytics (Azevedo and Gašević, 2019). However, while educational technology develops rapidly, the research on the effectiveness of these technological developments on students’ reading performance progresses at a slightly slower pace. Reported effect sizes are often small, and few studies are aimed at secondary grade levels or content subjects such as history (Moran et al., 2008). Moreover, it remains somewhat unclear which support characteristics in computer-supported learning environments contribute to secondary students’ reading process, because studies often lack a detailed description of the actual content or focus of the provided support (Devolder et al., 2012; ter Beek et al., 2018a).

            The current study

            Research has shown that both general reading skills and disciplinary literacy skills can contribute to students’ understanding of the text and their content knowledge (Learned, 2018; Nokes et al., 2007). We refer to a combination of these two concepts as ‘subject-specific reading skills’. The cognitive approach towards historical reasoning emphasises the role of mental resources, one of which is students’ reading ability (Van Boxtel and Van Drie, 2018). For example, recognising causal relations is a generic comprehension skill that enables students to identify how one sentence relates to another (for example, by focusing on connectives), but for the specific subject of history it also enables students to reason causally about how certain historical events are related – an important skill in history education (Stoel et al., 2015). The same applies to skills such as finding explanations, generating questions, finding main ideas and perspective taking (Huijgen et al., 2018; Van Boxtel and Van Drie, 2018; Wineburg, 1991).

            As noted above, technology-enhanced learning environments are increasingly used to support students’ reading and learning processes, including in history education. The current study integrated the aforementioned subject-specific reading skills in a digital learning environment (DLE) that was used by lower secondary students to read expository history texts (ter Beek et al., 2018b). We subsequently used its log file data to analyse students’ performance on subject-specific reading skills.

            We expect that all subject-specific reading skills will be positively related to students’ historical content knowledge and historical reasoning ability. To test this hypothesis, the main research question this study addresses is: What is the relationship between students’ subject-specific reading skills, as measured in the DLE, and their historical content knowledge and historical reasoning ability?

            Method

            Design and context

            Students from seventh grade (the first year of Dutch secondary education) read six expository texts for the subject of history in a DLE. The main theme of all texts was ‘Ancient Greece’, and each text contained approximately 550 words. All texts were written in an expository, non-fiction format, and had equal difficulty levels. Themes of the texts were ‘The tough Spartans’, ‘Democracy in Athens’, ‘Ancient Greek trading systems’, ‘Ancient Greek religion’, ‘Doctors in ancient Greece’ and ‘The world of Homer’. We carefully analysed the contents of students’ regular textbooks to prevent overlap or duplication of information. Figure 2 shows a screenshot of the DLE used in this study.

            Figure 2.

            Screenshot of the DLE showing the contents of a metacognitive hint (Source: University of Groningen)

            The DLE offered reading strategy instruction during reading in the form of hints, which students could choose to access whenever they thought they needed them. There were three types of hints: cognitive, metacognitive and motivational. Cognitive hints consisted of strategy instruction or explanations about the content of the text. Metacognitive hints were aimed at students’ regulation of their learning process. Motivational hints pointed out the value of the reading task (the ‘why’ of the task) and what students might learn by reading the text (the usefulness of the task). However, the majority of students made little to no use of these strategic hints; hence, these hints were not examined further in the remainder of this study.
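
            The article does not document the DLE’s internal data model, but, as a minimal sketch assuming a simple event log, the three hint types and a logged hint access might be represented as follows (all names are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class HintType(Enum):
    COGNITIVE = "cognitive"          # strategy instruction or content explanations
    METACOGNITIVE = "metacognitive"  # regulation of the learning process
    MOTIVATIONAL = "motivational"    # value and usefulness of the reading task

@dataclass
class HintEvent:
    """One hint access by one student, as it might appear in DLE log files."""
    student_id: str
    lesson: int           # 1-6, one lesson per text
    hint_type: HintType
    accessed_at: datetime
```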

            The lessons in which students read texts in the DLE replaced six regular lessons in the seventh-grade curriculum. All classrooms contained students with a mixed educational level of general secondary and pre-university education. This mixed educational level is common in the lower years of Dutch secondary education; when students are in ninth grade (about age 14), classes are split into general secondary classes (called havo) and pre-university classes (called vwo), based on students’ performance during the early secondary years. The six-week intervention started shortly after students entered secondary education. During the six lessons, which each lasted approximately fifty minutes, students read one text about the ancient Greeks and then answered 10 multiple-choice questions about the text contents. There was no difference or gradual increase in difficulty levels of these questions. The text remained on-screen while the questions were answered. A delayed Historical Content Knowledge test (HICK) was administered four to six weeks after completion of the last lesson in the DLE.

            Participants

            Five history teachers – who taught eight different classrooms (M = 25.25 students per class, SD = 2.99) – used the DLE in their history lessons. Parents or carers of all participating students were informed via a personal letter, and were given the option to refuse the use of their child’s data. This was the case for two students, whose data were deleted from all data sets. Therefore, at the start of the intervention, the total sample consisted of 197 students, of whom 48.7 per cent were female (n = 96) and 51.3 per cent were male (n = 101); their average age was 12.5 years (SD = 0.44). All personal data were handled in line with General Data Protection Regulation requirements.

            Measures
            Subject-specific reading skills

            Each of the six texts was accompanied by 10 multiple-choice questions. These questions were divided into five categories of subject-specific reading skills. Based on the official requirements for the state national history examination (College for Exams, 2014), we selected five skills that are of importance in the general domain of reading comprehension, as well as for a disciplinary approach to reading history texts: (1) recognising causal relations; (2) explaining historical events; (3) generating suitable research questions; (4) ordering of concepts; and (5) perspective taking. Hence, students received 2 multiple-choice questions per skills category each week, making a total of 12 questions per skills category throughout the six-week intervention. Table 1 shows the five subject-specific reading skills and corresponding exemplary multiple-choice questions.

            Table 1.

            Subject-specific reading skills and exemplary multiple-choice questions (Source: Authors, 2022)

            Question category | Skill | Example
            Cause and effect | Recognising (direct and indirect) causal relations | ‘What can be considered a direct cause of the demise of the powerful city state of Sparta?’
            Explaining | Explaining historical events or developments | ‘Explain why the 300 Spartan soldiers went into battle against 10,000 Persians.’
            Generating questions | Generating or selecting suitable research questions | ‘Imagine you are researching the status of women in ancient Greece. For which of the following questions can you find an answer in the current text?’
            Ordering of concepts | Identifying chronology or important text elements | ‘Look at the following four elements from the text. Which of these are main ideas?’
            Perspective taking | Contextualisation of concepts described in texts, or actors’ points of view | ‘What could have been a reason for the Spartans to leave sickly babies in the mountains to die?’

            The skills defined in each question category resemble the components from the historical reasoning framework by Van Drie and Van Boxtel (2008). For example, the included subject-specific reading skill defined as ‘recognising causal relations’ is also apparent at the core of this framework, which includes ‘historical reasoning about causes and consequences’ as one of its core elements. When asked ‘What can be considered a direct cause of the demise of the powerful city state of Sparta?’ (see Table 1), students had the option to choose from: ‘(A) Too many sickly babies were left to die, reducing the number of soldiers; (B) They could not fight as much anymore, since they had already conquered a lot of cities; (C) Spartan women had become economically independent and no longer wanted to fight; or (D) The Persians defeated the Spartans and conquered Spartan territories’ – the correct answer being D. The multiple-choice questions are aimed at both first-order knowledge (for example, the Spartans lost the battle of Thermopylae) and second-order knowledge (for example, following this event, the power status of Sparta changed). The log files from the DLE enabled us to compute students’ average scores on each of the five categories by using their first attempt at answering the multiple-choice questions (0 = incorrect, 1 = correct) from Weeks 1–6.
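
            As an illustration of this scoring step, the following pandas sketch computes per-student average first-attempt accuracy per skill category from a long-format log export; the file name and column names are assumptions, not the study’s actual log schema:

```python
import pandas as pd

# Hypothetical long-format export of the DLE log files: one row per answer
# attempt on a multiple-choice question, tagged with its skill category.
log = pd.read_csv("dle_log.csv")
# expected columns: student_id, week, question_id, skill, attempt, correct (0/1)

# Keep each student's first attempt per question, Weeks 1-6.
first_attempts = log[(log["attempt"] == 1) & (log["week"].between(1, 6))]

# Average first-attempt accuracy per subject-specific reading skill.
skill_scores = (
    first_attempts
    .groupby(["student_id", "skill"])["correct"]
    .mean()
    .unstack("skill")  # one column per skill category
)
print(skill_scores.describe())
```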

            Historical content knowledge and historical reasoning ability

            To measure students’ historical content knowledge and historical reasoning ability, the first author composed the Historical Content Knowledge (HICK) instrument, based on the Assessment of Social Studies Knowledge (ASK) instrument by Vaughn et al. (2013). The original ASK instrument consists of two subtests: a content knowledge test composed of 46 multiple-choice items, and a reading comprehension test with 21 multiple-choice items based upon three text passages. The ASK instrument was modified so that its contents matched the historical content covered in the current study: ancient Greece.

            The final HICK instrument consisted of two components. The first component consisted of 30 multiple-choice questions (MCQ) about the contents of the six texts. The HICK-MCQ items assessed first-order knowledge that was present in the text that students had read in the DLE. It measured what students learned or remembered in terms of historical facts, concepts and chronology; therefore, we consider this as a measure of students’ historical content knowledge. An example of such a question is: ‘A different name for the Greek term ekklesia is: (A) Democracy; (B) Voting rights; (C) Legislation; (D) Public assembly’. The second component included a short expository text about the Olympic Games followed by three open-ended questions (OEQ): students were asked to: (1) Identify differences between the Olympic Games in ancient Greece and the present-day Olympic Games; (2) Explain why wars were paused during the ancient Olympic Games; and (3) Connect Spartan and Athenian views on women in society to the fact that women were not allowed to compete in this event. By doing so, the OEQ component of the HICK incorporated elements from the historical reasoning framework by Van Boxtel and Van Drie (2018), such as reasoning about similarities and differences, or causes and consequences, using historical concepts and historical contextualisation. Therefore, we consider the OEQ component as a measure of students’ historical reasoning ability.

            Prior to administration, three pre-service history teachers checked the HICK instrument and found no issues with regard to its contents. We administered the HICK approximately four to six weeks after completing the last lesson. Subsequently, we analysed the internal consistency of the 30 MCQ items using the Kuder–Richardson Formula 20 (Kuder and Richardson, 1937). The resulting KR-20 value of 0.73 indicated acceptable reliability.
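
            For reference, KR-20 can be computed directly from a students-by-items matrix of dichotomous scores. The sketch below is a Python equivalent of this reliability check, not the authors’ own code:

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson Formula 20 for dichotomous (0/1) test items.

    items: one row per student, one column per item (here, a 181 x 30
    matrix of HICK-MCQ scores would yield the reported value of 0.73).
    """
    k = items.shape[1]                              # number of items
    p = items.mean(axis=0)                          # proportion correct per item
    item_variance = (p * (1 - p)).sum()             # sum of item variances
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variance / total_variance)
```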

            The first author coded the open questions following a predetermined answer model with a maximum of 4 points for the first and third OEQ question, and 2 points for the second question, adding up to a maximum score of 10. For example, the first OEQ question was ‘Name three differences between the first Olympic Games in ancient Greece and the Olympic Games we know today’. Each valid difference was awarded 1 point, and another point was added if students explicitly mentioned the differences between the past and the present (for example, ‘In ancient Greece, women were not allowed to participate in the Games, while nowadays this is possible’). To ensure inter-rater reliability, a research assistant also coded students’ answers on all three open questions of one classroom. Cohen’s Kappa was 0.71, indicating sufficient agreement (Cohen, 1960).
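
            Cohen’s kappa for such a double-coded classroom can be computed with scikit-learn; the two rating vectors below are toy values for illustration only, not study data:

```python
from sklearn.metrics import cohen_kappa_score

# Point codes assigned to the same ten answers by the two raters
# (illustrative values only).
first_author_codes = [4, 3, 2, 4, 1, 0, 3, 2, 4, 2]
assistant_codes    = [4, 3, 2, 3, 1, 0, 3, 2, 4, 1]

kappa = cohen_kappa_score(first_author_codes, assistant_codes)
print(f"Cohen's kappa = {kappa:.2f}")  # the study reports 0.71
```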

            Analyses

            We analysed the data from the DLE using IBM SPSS Statistics 25. We used descriptive statistics, bivariate correlations and multiple regression analysis with forced entry to explore the relations between subject-specific reading skills, historical content knowledge and historical reasoning ability.
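
            These analyses were run in SPSS; an equivalent sketch in Python, assuming a per-student data file with hypothetical column names, could look like this:

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("student_scores.csv")  # hypothetical per-student data file
skills = ["cause_effect", "explaining", "generating_questions",
          "ordering_concepts", "perspective_taking"]

# Descriptive statistics and bivariate correlations (cf. Table 2).
print(df[skills + ["hick_mcq", "hick_oeq"]].describe())
print(df[skills + ["hick_mcq", "hick_oeq"]].corr())

# Multiple regression with forced entry: all five skills entered at once.
X = sm.add_constant(df[skills])
for outcome in ["hick_mcq", "hick_oeq"]:
    model = sm.OLS(df[outcome], X, missing="drop").fit()
    print(model.summary())  # B, 95% CIs, R-squared and F (cf. Table 3)
```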

            Of the students, 8 did not complete all six lessons in the DLE, mostly due to sickness or classroom transfers. For these students, the average score on subject-specific reading skills could not be based on all six lessons, which could distort the analyses. Therefore, we excluded these 8 students (4.1 per cent of the total sample) from all analyses. Since the number of excluded students did not exceed 5 per cent of the total sample, we found it acceptable to apply listwise deletion (Graham, 2009). Furthermore, due to absence, a further 8 students missed the administration of the HICK questionnaire, and an additional 2 students only completed the MCQ component of the HICK questionnaire, but not the OEQ component. These 10 students did not significantly differ from the included students in terms of their scores on the five subject-specific reading skills, p > 0.05 for each skill, which substantiated our decision to exclude them from the analyses. Therefore, the final sample for the MCQ component consists of 181 students (91.8 per cent of the total sample), whereas the final sample for the OEQ component consists of 179 students (90.9 per cent of the total sample).
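
            The text does not name the test used for this attrition check; a plausible sketch, assuming independent-samples t-tests on each skill score and the same hypothetical columns as above:

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("student_scores.csv")  # hypothetical per-student data file
skills = ["cause_effect", "explaining", "generating_questions",
          "ordering_concepts", "perspective_taking"]

# Listwise deletion: keep only students with complete DLE and HICK data.
complete = df.dropna(subset=skills + ["hick_mcq"])
excluded = df.loc[~df.index.isin(complete.index)]

# Compare excluded and included students on each subject-specific reading skill.
for skill in skills:
    t, p = stats.ttest_ind(complete[skill], excluded[skill].dropna())
    print(f"{skill}: t = {t:.2f}, p = {p:.3f}")  # the study reports p > 0.05 throughout
```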

            Results

            Table 2 shows the descriptive statistics and the correlations between the five subject-specific reading skills and the MCQ and OEQ components of the HICK. For all subject-specific reading skills, except for perspective taking, students on average answered half of the multiple-choice questions correctly at the first attempt in the DLE. The average score was 16.31 out of 30 (SD = 4.76) on the MCQ component and 4.64 out of 10 (SD = 2.09) on the OEQ component. It is important to note that many students missed points on the OEQ because they did not read the question properly (for example, not mentioning both past and present when writing down differences, mentioning two differences when three were required, or not substantiating their answer with an explanation when the question explicitly prompted them to do so).

            Table 2.

            Descriptive statistics and bivariate correlations (N = 181) (Source: Authors, 2022)

            Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7
            1. Cause and effect | – | | | | | |
            2. Explaining | 0.41** | – | | | | |
            3. Generating questions | 0.38** | 0.34** | – | | | |
            4. Ordering of concepts | 0.32** | 0.22** | 0.26** | – | | |
            5. Perspective taking | 0.36** | 0.43** | 0.37** | 0.28** | – | |
            6. HICK-MCQ | 0.38** | 0.37** | 0.41** | 0.31** | 0.34** | – |
            7. HICK-OEQ | 0.28** | 0.35** | 0.32** | 0.19** | 0.30** | 0.45** | –
            M | 0.48 | 0.53 | 0.49 | 0.48 | 0.57 | 16.31 | 4.64
            SD | 0.15 | 0.16 | 0.16 | 0.17 | 0.18 | 4.76 | 2.09

            Note: HICK = historical content knowledge; MCQ = multiple-choice questions; OEQ = open-ended questions.

            N = 179 for HICK-OEQ. *p < 0.05, two-tailed. **p < 0.01, two-tailed.

            There were moderate, significant positive correlations (p < 0.01) between all five subject-specific reading skills, as well as between each of these skills and students’ performance on both the MCQ and OEQ components of the HICK. Although all correlations were significant, it should be noted that the correlations with the MCQ component were somewhat stronger than those with the OEQ component.

            To test whether each subject-specific reading skill adds something unique to the explanation of students’ historical content knowledge and historical reasoning ability, we used multiple regression analysis with forced entry of the five predictive skills. Table 3 shows the results and the predictors of the MCQ and OEQ components. For the MCQ component, the skills we defined as ‘explaining historical events’, ‘generating historical questions’ and ‘ordering of concepts’ were all significant unique predictors, p = 0.036, p = 0.002 and p = 0.050, respectively. For the OEQ component, only ‘explaining’ and ‘generating questions’ were significant unique predictors, p = 0.012 and p = 0.041. Although there were positive correlations with the MCQ and OEQ components, the multiple regression analysis showed that the subject-specific reading skills we defined as ‘identifying cause and effect’ and ‘perspective taking’ uniquely predicted neither students’ historical content knowledge nor their historical reasoning ability.

            Table 3.

            Unique predictors of HICK-MCQ and HICK-OEQ components (N = 181) (Source: Authors, 2022)

            Variable | HICK-MCQ: B | HICK-MCQ: 95% CI | HICK-OEQ: B | HICK-OEQ: 95% CI
            Constant | 4.87** | [2.06, 7.68] | 0.62 | [–0.69, 1.94]
            Cause and effect | 4.55 | [–0.50, 9.14] | 1.04 | [–1.11, 3.20]
            Explaining | 4.67* | [0.32, 9.00] | 2.63* | [0.59, 4.68]
            Generating questions | 6.87** | [2.49, 11.24] | 2.14* | [0.09, 4.19]
            Ordering of concepts | 3.80* | [0.01, 7.58] | 0.62 | [–1.18, 2.42]
            Perspective taking | 2.71 | [–1.29, 6.70] | 1.33 | [–0.54, 3.20]
            R² | 0.28 | | 0.19 |
            F | 13.86*** | | 8.06*** |

            Note: HICK = historical content knowledge; MCQ = multiple-choice questions; OEQ = open-ended questions; CI = confidence interval. N = 179 for HICK-OEQ. *p < 0.05. **p < 0.01. ***p < 0.001.

            Conclusion and discussion

            The current study explored which subject-specific reading skills contribute to students’ historical content knowledge and historical reasoning ability. Bivariate correlations showed moderate, significant positive correlations between all five subject-specific reading skills (‘cause and effect’, ‘explaining’, ‘generating questions’, ‘ordering of concepts’ and ‘perspective taking’). Although it could be expected that these skills would correlate, since they are all part of the general construct of reading comprehension, the individual subject-specific skills seemed to measure separate skills within the process of reading historical texts. For example, items measuring ‘perspective taking’ did not measure ‘generating questions’ at the same time. Additionally, all five subject-specific reading skills correlated positively with students’ historical content knowledge (the HICK-MCQ component) and historical reasoning ability (the HICK-OEQ component). These findings suggest that students’ subject-specific reading skills, which are related to general reading skills and disciplinary literacy skills, contribute to students’ historical content knowledge and historical reasoning ability, as indicated in earlier studies (Learned, 2018; Nokes et al., 2007).

            Multiple regression analyses showed that the subject-specific reading skills ‘explaining’, ‘generating questions’ and ‘ordering of concepts’ were significant unique predictors of students’ historical content knowledge, whereas only the first two skills were significant unique predictors of students’ historical reasoning ability. Earlier findings indicate the importance of causal reasoning (Stoel et al., 2015) and contextualisation (Huijgen et al., 2018; Van Boxtel and Van Drie, 2018); however, the skills we defined as ‘identifying cause and effect’ and ‘perspective taking’ did not significantly and uniquely predict historical content knowledge or historical reasoning ability. A possible explanation might be that when all five subject-specific reading skills are combined in a reading task, the multiple-choice questions addressing ‘explaining’, ‘generating questions’ and ‘ordering of concepts’ require more higher-order thinking and reasoning skills compared with, for example, recognising cause-and-effect relations in a text. Thus, a question about the explanation of a historical event is more closely related to historical content knowledge and historical reasoning ability, because students use not only the textual information to answer the question, but also their prior knowledge and reasoning skills.

            On average, students answered half of the multiple-choice questions related to the five subject-specific skills correctly. Similarly, students answered on average half of the items on the delayed historical content knowledge test correctly. One might consider these results poor, since in general in the Dutch educational system, a score of about 50 per cent would result in an insufficient grade. However, given the fact that the HICK instrument was administered unannounced, and four to six weeks after the last lesson in the DLE, these results were in line with our expectations. In addition, the digital nature of the reading task implied that there was no possibility to collaborate, discuss or take notes during reading – which are processes that have been proven effective for student learning in history classes (Reisman, 2012). Nevertheless, students remembered quite a few dates, facts and definitions, even though there was no grading or reward system involved. Since there are no other studies that used the same HICK instrument to measure delayed historical content knowledge and historical reasoning ability, we cannot compare our results with earlier studies, and we do not know whether students improved their historical content knowledge or historical reasoning ability. Future studies might, for example, adopt a quasi-experimental design that includes a pre-test measurement of students’ historical content knowledge and historical reasoning ability to analyse the effectiveness of students’ subject-specific reading skills in more detail.

            Limitations and suggestions for future research

            Due to the practice-oriented approach of this study, in which teachers used the DLE in authentic classroom settings, there are some limitations in relation to the statistical analyses. Although we carefully selected the contents of the texts and assignments in the DLE, we could not control for the possible influence of teachers’ regular instruction on students’ historical content knowledge. It is likely that students’ performance on the historical content knowledge test was not only related to the practising of their subject-specific reading skills in the DLE, but also to the regular classroom instruction they received. Moreover, as Cheung and Slavin (2012) state, the use of technology to support and facilitate teachers’ instruction is probably more effective than the use of technology in itself. However, teachers did not integrate the texts from the DLE with their regular curriculum, nor, as lesson observations showed, did they provide elaborate reading strategy instruction to their students (ter Beek et al., 2019). Nevertheless, future research could control for the possible influence of classroom instruction by including quantified observations of teachers’ instructional practices as a background variable.

            The same applies to the possible influence of students’ engagement with the digital reading task at hand. Greene et al. (2010) showed that in hypermedia learning environments used to read historical texts, students often found it difficult to engage in learning processes that could foster their historical knowledge. The students in the current study used a DLE to read historical texts, but it is unclear whether this format caused a distraction that may have hampered their reading process, and, consequently, their historical reasoning ability. Future studies could investigate the role of student engagement when using a DLE in relation to their reading skills and historical reasoning ability. Furthermore, given that the DLE focused on individual reading, and only included texts, assessments and optional hints, the DLE might be enhanced with features that stimulate the (collaborative) reading and learning process of students, such as chat options or discussion boards.

            The HICK instrument used to measure both historical content knowledge and historical reasoning ability introduced another limitation: it was specifically designed for the current study, and research has shown that constructing a valid and reliable instrument to measure historical reasoning processes is a highly challenging task (see Huijgen et al., 2018). Although the approach used in the current study provided us with relevant results, additional tests, factor analyses and larger samples are needed to further validate the instrument and to determine more robustly the influence of subject-specific reading skills on students’ historical content knowledge and historical reasoning ability. Moreover, even though the HICK was essentially based on the ASK instrument by Vaughn et al. (2013), the OEQ component aims to measure historical reasoning, whereas the original open-ended questions in the ASK instrument focus more on text comprehension. The correlations between the five subject-specific reading skills and the OEQ component were weaker than those with the MCQ component, probably because the MCQ component resembled the multiple-choice questions in the DLE. In addition, the format of the OEQ also appealed to students’ writing skills. Earlier studies have shown that students’ initial writing ability is related to the quality of their written text, which might have led to better answers on the OEQ items, and, subsequently, higher scores (see De La Paz and Felton, 2010; Van Drie et al., 2015). Future research could consider using sophisticated measures of students’ historical reasoning ability that are not based solely on written answers, for example, think-aloud protocols, to uncover students’ reasoning process while or after reading historical texts.

            Finally, it is important to bear in mind other skills that contribute to students’ historical reasoning competency, such as their understanding of metahistorical concepts and historical strategies, knowledge of the nature and construction of historical knowledge, and knowledge of the use of historical knowledge (Van Boxtel and Van Drie, 2018) – in other words, to focus on how students progress from ‘know what’ and ‘know how’ skills to the level of ‘know to be’ skills, in which they use the subject matter to behave or think in certain contexts or situations (Luís and Rapanta, 2020).

            Scientific and practical implications

            This study shows the applicability of the historical reasoning framework by Van Boxtel and Van Drie (2018) within the context of reading historical texts in lower secondary education. When core components from this framework are translated into subject-specific reading skills, for example when students digitally practise with reading texts, it is possible to investigate which components are attainable for students of a specific age group. Our results show that the skills we defined as ‘explaining historical events’ and ‘generating historical questions’ contributed both to the historical content knowledge and to the historical reasoning ability of seventh-grade students.

            The findings indicate that the history curriculum in lower secondary education should foster students’ reading skills in order to promote students’ historical content knowledge and historical reasoning ability throughout their academic careers. For example, history teachers could use expository texts as a basis for their instruction, followed by classroom discussions (see Wanzek et al., 2015) about possible explanations for historical events, relevant historical questions to ask about a text, or different perspectives encountered in these texts (McKeown et al., 2009). In addition, history teachers and language teachers could join forces with regard to reading strategy instruction, to ensure that the skills students learn in language classrooms (for example, Dutch or English language courses) apply to their history course. Since primary and secondary textual sources are crucial for history education, training in these subject-specific reading skills in lower secondary education provides students with a sound base for the higher secondary levels and beyond.

            Funding

            This work was supported by the Netherlands Initiative for Education Research (NRO; grant number 405-15-551). The funding source was involved in neither the execution of the research nor the preparation of the manuscript.

            Acknowledgements

            We thank research assistants Hidde Ozinga and Marida Prins from the Faculty of Behavioural and Social Sciences, University of Groningen, The Netherlands, for their help in obtaining the data. Additionally, we thank all of the history teachers, students and school coordinators for their participation in this research project.

            Declarations and conflicts of interest

            Research ethics statement

            This study obtained ethical approval from a governmental review board involved in assessing the grant application.

            Consent for publication statement

            The authors declare that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.

            Conflicts of interest statement

            The authors declare no conflicts of interest with this work. All efforts to sufficiently anonymise the authors during peer review of this article have been made. The authors declare no further conflicts with this article.

            References

            1. Afflerbach P, VanSledright B. 2001. Hath! Doth! What?: Middle graders reading innovative history text. Journal of Adolescent and Adult Literacy. Vol. 44(8):696–707. https://www.jstor.org/stable/40018742

            2. Alexander PA. 2003. The development of expertise: The journey from acclimation to proficiency. Educational Researcher. Vol. 32(8):10–14. [Cross Ref]

            3. Azevedo R, Gašević D. 2019. Analyzing multimodal multichannel data about self-regulated learning with advanced learning technologies: Issues and challenges. Computers in Human Behavior. Vol. 96:207–10. [Cross Ref]

            4. Cheung AC, Slavin RE. 2012. How features of educational technology applications affect student reading outcomes: A meta-analysis. Educational Research Review. Vol. 7(3):198–215. [Cross Ref]

            5. Cohen J. 1960. A coefficient of agreement for nominal scales. Educational and Psychological Measurement. Vol. 20(1):37–46. [Cross Ref]

            6. College for Exams. 2014. Geschiedenis vwo: Syllabus centraal examen 2016 op basis van domein A en B van het examenprogramma (dus geen tijdelijke afwijking meer) [Dutch central state exam syllabus, 2016]. Accessed 1 February 2022 https://www.examenblad.nl/examenstof/syllabus-2016-geschiedenis-vwo/2016/vwo/f=/geschiedenis_vwo_def_versie_2016_aanpassing_Verona.pdf

            7. De La Paz S, Felton MK. 2010. Reading and writing from multiple source documents in history: Effects of strategy instruction with low to average high school writers. Contemporary Educational Psychology. Vol. 35(3):174–92. [Cross Ref]

            8. Devolder A, Van Braak J, Tondeur J. 2012. Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. Journal of Computer Assisted Learning. Vol. 28(6):557–73. [Cross Ref]

            9. Duhaylongsod L, Snow CE, Selman RL, Donovan MS. 2015. Toward disciplinary literacy: Dilemmas and challenges in designing history curriculum to support middle school students. Harvard Educational Review. Vol. 85(4):587–608. [Cross Ref]

            10. Faggella-Luby MN, Graner PS, Deshler DD, Drew SV. 2012. Building a house on sand: Why disciplinary literacy is not sufficient to replace general strategies for adolescent learners who struggle. Topics in Language Disorders. Vol. 32(1):69–84. [Cross Ref]

            11. Fry SW, Gosky R. 2007. Supporting social studies reading comprehension with an electronic pop-up dictionary. Journal of Research on Technology in Education. Vol. 40(2):127–39. [Cross Ref]

            12. Gestsdóttir SM, Van Boxtel C, Van Drie J. 2018. Teaching historical thinking and reasoning: Construction of an observation instrument. British Educational Research Journal. Vol. 44(6):960–81. [Cross Ref]

            13. Girard B, McArthur Harris L. 2012. Striving for disciplinary literacy instruction: Cognitive tools in a world history course. Theory and Research in Social Education. Vol. 40(3):230–59. [Cross Ref]

            14. Goldman SR, Britt MA, Brown W, Cribb G, George M, Greenleaf C, Lee CD, Shanahan C; Project READI. 2016. Disciplinary literacies and learning to read for understanding: A conceptual framework for disciplinary literacy. Educational Psychologist. Vol. 51(2):219–46. [Cross Ref]

            15. Graham JW. 2009. Missing data analysis: Making it work in the real world. Annual Review of Psychology. Vol. 60:549–76. [Cross Ref]

            16. Greene JA, Bolick CM, Robertson J. 2010. Fostering historical knowledge and thinking skills using hypermedia learning environments: The role of self-regulated learning. Computers and Education. Vol. 54(1):230–43. [Cross Ref]

            17. Huijgen T, Van de Grift W, Van Boxtel C, Holthuis P. 2018. Promoting historical contextualization: The development and testing of a pedagogy. Journal of Curriculum Studies. Vol. 50(3):410–34. [Cross Ref]

            18. Hynd C, Holschuh JP, Hubbard BP. 2004. Thinking like a historian: College students’ reading of multiple historical documents. Journal of Literacy Research. Vol. 36(2):141–76. [Cross Ref]

            19. Kirschner PA, Sweller J, Clark RE. 2006. Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist. Vol. 41(2):75–86. [Cross Ref]

            20. Kuder GF, Richardson MW. 1937. The theory of the estimation of test reliability. Psychometrika. Vol. 2(3):151–60. [Cross Ref]

            21. Lan Y-C, Lo Y-L, Hsu Y-S. 2014. The effects of meta-cognitive instruction on students’ reading comprehension in computerized reading contexts: A quantitative meta-analysis. Journal of Educational Technology & Society. Vol. 17(4):186–202. https://www.jstor.org/stable/jeductechsoci.17.4.186

            22. Learned JE. 2018. Doing history: A study of disciplinary literacy and readers labeled as struggling. Journal of Literacy Research. Vol. 50(2):190–216. [Cross Ref]

            23. Lee P, Ashby R. 2000. Progression in historical understanding among students aged 7–14. In: Stearns PN, Seixas P, Wineburg S (eds) Knowing, Teaching, and Learning History: National and international perspectives. New York: New York University Press. p. 199–222

            24. Luís R, Rapanta C. 2020. Towards (re-)defining historical reasoning competence: A review of theoretical and empirical research. Educational Research Review. Vol. 31:100336. [Cross Ref]

            25. Lynch L, Fawcett AJ, Nicolson RI. 2000. Computer-assisted reading intervention in a secondary school: An evaluation study. British Journal of Educational Technology. Vol. 31(4):333–48. [Cross Ref]

            26. Mastropieri MA, Scruggs TE, Graetz JE. 2003. Reading comprehension instruction for secondary students: Challenges for struggling students and teachers. Learning Disability Quarterly. Vol. 26(2):103–16. [Cross Ref]

            27. McKeown MG, Beck IL, Blake RGK. 2009. Rethinking reading comprehension instruction: A comparison of instruction for strategies and content approaches. Reading Research Quarterly. Vol. 44(3):218–53. https://www.jstor.org/stable/25655454

            28. Moje EB. 2008. Foregrounding the disciplines in secondary literacy teaching and learning: A call for change. Journal of Adolescent and Adult Literacy. Vol. 52(2):96–107. [Cross Ref]

            29. Moje EB. 2015. Doing and teaching disciplinary literacy with adolescent learners: A social and cultural enterprise. Harvard Educational Review. Vol. 85(2):254–78. [Cross Ref]

            30. Monte-Sano C. 2011. Beyond reading comprehension and summary: Learning to read and write in history by focusing on evidence, perspective, and interpretation. Curriculum Inquiry. Vol. 41(2):212–49. [Cross Ref]

            31. Monte-Sano C, De La Paz S, Felton M. 2014. Implementing a disciplinary-literacy curriculum for US history: Learning from expert middle school teachers in diverse classrooms. Journal of Curriculum Studies. Vol. 46(4):540–75. [Cross Ref]

            32. Moran J, Ferdig RE, Pearson PD, Wardrop J, Blomeyer RL Jr. 2008. Technology and reading performance in the middle-school grades: A meta-analysis with recommendations for policy and practice. Journal of Literacy Research. Vol. 40(1):6–58. [Cross Ref]

            33. National Reading Panel. 2000. Teaching Children to Read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. National Institute of Child Health and Human Development. Accessed 6 February 2022 https://www.nichd.nih.gov/sites/default/files/publications/pubs/nrp/Documents/report.pdf

            34. Nokes JD. 2011. Recognizing and addressing the barriers to adolescents’ “reading like historians”. The History Teacher. Vol. 44(3):379–404. https://www.jstor.org/stable/41303991

            35. Nokes JD, Dole JA, Hacker DJ. 2007. Teaching high school students to use heuristics while reading historical texts. Journal of Educational Psychology. Vol. 99(3):492–504. [Cross Ref]

            36. Okkinga M, Van Steensel R, Van Gelderen AJS, Van Schooten E, Sleegers PJC, Arends LR. 2018. Effectiveness of reading-strategy interventions in whole classrooms: A meta-analysis. Educational Psychology Review. Vol. 30(4):1215–39. [Cross Ref]

            37. O’Neill DK, Weiler MJ. 2006. Cognitive tools for understanding history: What more do we need? Journal of Educational Computing Research. Vol. 35(2):181–97. [Cross Ref]

            38. Palincsar AS, Brown AL. 1984. Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction. Vol. 1(2):117–75. [Cross Ref]

            39. Perfetti CA, Britt MA, Georgi MC. 1995. Text-Based Learning and Reasoning: Studies in history. Erlbaum.

            40. Poitras E, Lajoie S, Hong Y-J. 2012. The design of technology-rich learning environments as metacognitive tools in history education. Instructional Science. Vol. 40(6):1033–61. [Cross Ref]

            41. Ramsay CM, Sperling RA, Dornisch MM. 2010. A comparison of the effects of students’ expository text comprehension strategies. Instructional Science. Vol. 38(6):551–70. [Cross Ref]

            42. Reisman A. 2012. Reading like a historian: A document-based history curriculum intervention in urban high schools. Cognition and Instruction. Vol. 30(1):86–112. [Cross Ref]

            43. Shanahan T, Shanahan C. 2008. Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review. Vol. 78(1):40–59. [Cross Ref]

            44. Shanahan C, Shanahan T, Misischia C. 2011. Analysis of expert readers in three disciplines: History, mathematics, and chemistry. Journal of Literacy Research. Vol. 43(4):393–429. [Cross Ref]

            45. Stoel GL, Van Drie JP, Van Boxtel CAM. 2015. Teaching towards historical expertise: Developing a pedagogy for fostering causal reasoning in history. Journal of Curriculum Studies. Vol. 47(1):49–76. [Cross Ref]

            46. ter Beek M, Brummer L, Donker AS, Opdenakker M-CJL. 2018a. Supporting secondary school students’ reading comprehension in computer environments: A systematic review. Journal of Computer Assisted Learning. Vol. 34(5):557–66. [Cross Ref]

            47. ter Beek M, Spijkerboer AW, Brummer L, Opdenakker M-C. 2018b. Gemotiveerd, actief en zelfstandig lezen: Hoe een digitale leeromgeving zowel de leerling als de docent kan ondersteunen bij het begrijpend lezen van informatieve zaakvakteksten in het voortgezet onderwijs [Motivated, active and independent reading: How a digital learning environment can support both students and teachers in the reading comprehension of expository content-area texts in secondary education]. The Netherlands: Rijksuniversiteit Groningen. Accessed 6 February 2022 https://research.rug.nl/en/publications/motivated-active-and-independent-eading-how-a-digital-learning-e

            48. ter Beek M, Opdenakker M-C, Deunk MI, Strijbos JW. 2019. Teaching reading strategies in history lessons: A micro-level analysis of professional development training and its practical challenges. Studies in Educational Evaluation. Vol. 63:26–40. [Cross Ref]

            49. VanSledright BA. 2004. What does it mean to read history?: Fertile ground for cross-disciplinary collaborations? Reading Research Quarterly. Vol. 39(3):342–6. https://www.jstor.org/stable/4151775

            50. VanSledright B, Limón M. 2006. Learning and teaching social studies: A review of cognitive research in history and geography. In: Alexander PA, Winne PH (eds) Handbook of Educational Psychology. Mahwah, NJ: Erlbaum. p. 545–70

            51. Van Boxtel C, Van Drie J. 2018. Historical reasoning: Conceptualizations and educational applications. In: Metzger SA, McArthur Harris L (eds) The Wiley International Handbook of History Teaching and Learning. Hoboken, NJ: Wiley. p. 149–76. [Cross Ref]

            52. Van Drie J, Van Boxtel C. 2008. Historical reasoning: Towards a framework for analyzing students’ reasoning about the past. Educational Psychology Review. Vol. 20(2):87–110. [Cross Ref]

            53. Van Drie J, Braaksma M, Van Boxtel C. 2015. Writing in history: Effects of writing instruction on historical reasoning and text quality. Journal of Writing Research. Vol. 7(1):123–56. [Cross Ref]

            54. Vaughn S, Swanson EA, Roberts G, Wanzek J, Stillman-Spisak SJ, Solis M, Simmons D. 2013. Improving reading comprehension and social studies knowledge in middle school. Reading Research Quarterly. Vol. 48(1):77–93. [Cross Ref]

            55. Wanzek J, Swanson EA, Roberts G, Vaughn S, Kent SC. 2015. Promoting acceleration of comprehension and content through text in high school social studies classes. Journal of Research on Educational Effectiveness. Vol. 8(2):169–88. [Cross Ref]

            56. Wineburg SS. 1991. On the reading of historical texts: Notes on the breach between school and academy. American Educational Research Journal. Vol. 28(3):495–519. [Cross Ref]

            57. Wineburg SS. 1998. Reading Abraham Lincoln: An expert/expert study in the interpretation of historical texts. Cognitive Science. Vol. 22(3):319–46. [Cross Ref]

            58. Wineburg SS. 2001. Historical Thinking and Other Unnatural Acts: Charting the future of teaching the past. Philadelphia, PA: Temple University Press.

            59. Wineburg SS. 2018. Why Learn History (When It’s Already on Your Phone). Chicago: University of Chicago Press.

            60. Wineburg S, Reisman A. 2015. Disciplinary literacy in history: A toolkit for digital citizenship. Journal of Adolescent and Adult Literacy. Vol. 58(8):636–9. [Cross Ref]

            61. Wissinger D, Ciullo S, Shiring E. 2018. Historical literacy instruction for all learners: Evidence from a design experiment. Reading and Writing Quarterly. Vol. 34(6):568–86. [Cross Ref]

            62. Wissinger DR, De La Paz S, Jackson C. 2020. The effects of historical reading and writing strategy instruction with fourth- through sixth-grade students. Journal of Educational Psychology. Vol. 113(1):49–67. [Cross Ref]

            Author and article information

            Journal
            herj
            History Education Research Journal
            UCL Press (UK )
            2631-9713
            24 February 2022
            : 19
            : 1
            : e19102
            Affiliations
            [1] Researcher, Educational Support and Innovation Department, University of Groningen, The Netherlands
            [2] Associate Professor of Education, Chair Group Education, University of Humanistic Studies, The Netherlands, and Associate Professor and Rosalind Franklin Fellow, GION Institute for Educational Research, Faculty of Behavioural and Social Sciences, University of Groningen, The Netherlands
            [3] Assistant Professor, GION Institute for Educational Research, Faculty of Behavioural and Social Sciences, University of Groningen, The Netherlands
            [4] Full Professor of Learning and Instruction, GION Institute for Educational Research, Faculty of Behavioural and Social Sciences, University of Groningen, The Netherlands
            [5] Assistant Professor and Teacher Educator, Department of Teacher Education, Faculty of Behavioural and Social Sciences, University of Groningen, The Netherlands
            ORCID iDs:
            https://orcid.org/0000-0002-8021-0109
            https://orcid.org/0000-0001-7079-0245
            https://orcid.org/0000-0003-4057-559X
            https://orcid.org/0000-0003-4180-688X
            https://orcid.org/0000-0003-4426-6903
            DOI: 10.14324/HERJ.19.1.02
            Copyright © 2022, Marlies ter Beek, Marie-Christine Opdenakker, Marjolein I. Deunk, Jan-Willem Strijbos and Tim Huijgen

            This is an open-access article distributed under the terms of the Creative Commons Attribution Licence (CC BY) 4.0 https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.

            History: Received 21 September 2021; accepted 30 January 2022.
            Page count: Figures: 2, Tables: 3, References: 62, Pages: 16.
            Categories: Article.

            Subjects: Educational research & statistics; General education; History.
            Keywords: secondary education; content knowledge; historical reasoning; subject-specific reading skills; history education; digital learning environment.
