Multiple Perspectives on Digital Literacies Research Methods in Canada

Michelle Hagerman, Pamela Beach, Megan Cotnam-Kappel, and Cristyne Hébert

Vol. 35, No. 1, 2020

Abstract: In this article, we call for Canadian digital literacies researchers to invest in designs and research methods that centralise in-the-moment insights, embrace complexity, and are informed by a deep commitment to authentic, ethical reciprocity that serves the communities in which our work is placed. We present three cases that offer multiple perspectives on how we might operationalise these principles, and we consider implications for the use of data collected through new approaches to digital literacies assessment, virtual retrospective think alouds, eye tracking, and spy glasses video. As the first co-authored article by The Digital Literacies Collective, this article contributes our shared position on the methodological priorities that will enable Canadian digital literacies researchers to construct new, contextually situated frameworks that inform digital literacies policies and practices in Canadian systems of schooling.

Keywords: digital literacies; research methods; Canada; think aloud; eye tracking; spy glasses; assessment

Résumé: Dans cet article, nous appelons les chercheurs canadiens en littératie numérique à investir dans des conceptions et des méthodes de recherche qui centralisent les connaissances instantanées, embrassent la complexité et sont éclairées par un engagement profond envers une réciprocité authentique et éthique au service des communautés dans lesquelles se situe notre travail. Nous présentons trois cas qui offrent de multiples perspectives sur la façon dont nous pourrions opérationnaliser ces principes, et considérons les implications pour l'utilisation des données collectées avec de nouvelles approches d'évaluation des littératies numériques, avec des rétrospectives virtuelles de réflexion à haute voix, des suivis du mouvement des yeux, des vidéos enregistrées par lunettes d'espionnage. En tant que premier article co-écrit par le Collectif des littératies numériques, cet article défend notre position commune concernant les priorités méthodologiques qui permettront aux chercheurs canadiens en littératies numériques de construire de nouveaux cadres contextuels qui éclairent les politiques et les pratiques des littératies numériques dans les systèmes scolaires canadiens.

Mots-clés: littératies numériques; méthodes de recherche; Canada; réfléchir à haute voix; suivi de l'oeil; lunettes d'espion; évaluation

This work is licensed under a Creative Commons Attribution 3.0 Unported License.

Introduction

Current digital literacies scholarship in diverse contexts of Canadian schooling is advancing conceptions of digital literacies practices and pedagogies (e.g., Rowsell et al., 2013; Watt, 2019). Although scholars in this field leverage a range of methods in the construction of new knowledges (Brownell & Wargo, 2017; Coiro, 2020; Knobel & Lankshear, 2017; Wargo, 2019), opportunities for deep discussion on the methodological innovations, limitations, and possibilities that frame our research designs, data collection, and analytical decision-making processes are rare. Indeed, as a group of emerging Canadian scholars with research interests at the intersections of literacies, technologies, teaching, and learning, the initial plan for this work was to come together at the pre-conference meeting of the Technology and Teacher Education (TATE) Special Interest Group at the Canadian Society for the Study of Education (CSSE) Annual Conference (May 30–June 4, 2020 in London, ON) to explore current conceptions of digital literacies research methods in Canada. Although the 2020 conference was cancelled because of COVID-19, our hope for this article is to inspire others to contribute to the evidence base of digital literacies research methods from which we all might draw. Ideally, a shared investment in digital literacies research methods should advance research in this field in Canada in ways that move us beyond continued reliance on the presumed newness of digital literacy research (Stornaiuolo et al., 2014) and also beyond the methodological, epistemological, and theoretical assumptions grounded in the practical and cultural contexts of other places. To construct datasets in and for Canadian classrooms, and with Canadian teachers and students, means that we must centralize the ways that digital tools, digital methods, and digitally-mediated meaning-making are shaped by Canadian places, and by the uniquely rhizomatic nature of sense-making with multiple languages, and across the diverse contexts and geographies of Canadian schooling (Bhatt et al., 2015; Marshall & Marr, 2018; McLean & Rowsell, 2020; Prinsloo & Krause, 2019).

Our work focuses on digital literacies teaching and learning in Canadian K-12 contexts. Collectively, we view digital literacies as socially situated practices for meaning making with and from texts, in all of their forms, in digital environments (Lankshear & Knobel, 2008; Spires et al., 2012). We understand that purposes (e.g., finding information, sharing or curating information, creating a video), technologies (e.g., search engines, social media platforms, digital gaming environments), and modalities (e.g., audio, video, images) (Kress, 2003) frame meaning-making practices (Hartman et al., 2018), including the skills and strategies that individuals are likely to use as they construct understandings that often include multiple, multimodal information sources (Cho & Afflerbach, 2017; Hartman et al., 2010). Informed by Bhatt et al. (2015), we also assert that digital literacies practices “go beyond what has traditionally been understood as mere activities where text has a role, to ones where ‘digital codification’ and ‘digital enculturation’ are central” (p. 481). To construct deeper and more expansive understandings of digital literacies practices as taught, lived, and enculturated in contexts of Canadian schooling, we identify the need for an expanded set of research methods that enable us to capture, analyze, and situate digitally encoded literacies practices.

As researchers, we often work directly with students and teachers in schools. In this article, we speak to the importance of methodological approaches that centre on learning in-context; that respond to the needs of students in particular classroom environments under the guidance of teachers who know them best; that capture a wide range of multimodal practices, competencies, and strategies as students (and their teachers) engage in digital literacies learning; and that focus on process, allowing for a rich and nuanced understanding of digital literacies learning and teaching in action. In this way, our call for an expanded set of digital literacies research methods—which we understand to be ways of gathering and analyzing data using digital tools—also implies the need for research designs that are epistemologically diverse and that might also therefore reflect a range of methodological orientations and approaches (cf., Duke & Mallette, 2001).

Ultimately, we argue that at this moment in the field of digital literacies research in Canada, we need to leverage multiple methods and methodological approaches that are informed by context; are responsive to in-the-moment needs of participants; are fluid, flexible, and authentic; and enable researchers, teachers, and students to embrace the messiness of complex learning processes. Although the methods we explore in this article have been used in other contexts to answer a range of research questions, we see that a collective effort to deepen understandings of these methods and their uses in Canadian contexts could advance important, new, place-based understandings of what it means to become digitally literate in systems of Canadian schooling today.

As researchers engaged in three separate projects, but retrospectively thinking and working through some methodological questions together, we present three different cases and explore the implications of three methods for future research in digital literacies teaching and learning. First, Cristyne Hébert details the limitations of traditional research paradigms for capturing the complex digital literacies learning that takes place in K-12 classroom spaces. She advocates for the use of methods that centralise authenticity and flexibility, and that are grounded in ethical commitments to reciprocity, collaboration, and service. Second, Pamela Beach discusses how retrospective virtual think aloud and eye tracking methods can be used to capture elementary teachers’ digital literacies practices as they use and learn from online and multimedia resources. These methods capture the moment-to-moment variations of teachers’ literacies practices as they seek out information related to their professional learning. Third, Megan Cotnam-Kappel and Michelle Hagerman present the innovations and puzzles that arose in their use of wearable video “spy glasses” to capture students’ literacies practices during online inquiry and maker-based activities. In our discussion, we synthesize the implications and transformative potential of these methods for digital literacies research in Canada.

Three Cases of Digital Literacies Research Methods

Authentic Assessments: Evaluating Student Learning in the Digital Literacies Classroom

          Background/Context. The research discussed in this section focuses on evaluating students’ digital literacies learning in K-12 classroom contexts. In my own research, I, Cristyne, have worked with teachers to support students as they engage with maker education projects, digital game-based learning, or digital storytelling (see Bergstrom, Jenson, Flynn-Jones, & Hébert, 2018; Fong, Jenson, & Hébert, 2018; Hébert & Jenson, 2019, 2020). Aligned with the work of other digital literacies researchers who use multiple methods of data collection (often under the label ethnography) (Hughes, 2017; Smythe & Neufeld, 2010; Tan et al., 2013), I have conducted interviews and classroom observations, paired with pre- and post-tests and analyses of the digital artefacts young people have constructed, in order to assess student learning. What has struck me are the limitations of these models, which diverge from so-called “best practices” in classroom-based, teacher-led assessment.

In what follows, I offer some suggestions for conducting digital literacies research that focuses on authenticity, making a case for what it might look like to engage in ethical assessments of student learning in classrooms. I do not provide a tidy methodological account. Rather, my proposal calls us back into the messiness of the classroom, while advocating for slow, careful methods that respect the uncertainty that awaits researchers at each research site, respond to the needs of specific teachers and schools, and prioritize process-based and student-oriented approaches. As the work necessarily involves teachers, it recognizes the limitations of schooling with all of its tensions and contradictions. Ideal practices cannot always transpire here, and they may not always be appropriate. I end the section with some practical limitations of doing this work.

Ethically Authentic Classroom-Based Research

For digital literacies learning to be both effective and sustainable, it requires extended engagement and know-how from educators who will support student understanding on a daily basis. This means advocating for professional development and training for teachers, as well as providing direct support for educators in classrooms, when possible, to ensure that student learning is bolstered in meaningful ways. Parachute research (Smith, 2018), in which researchers drop into classrooms, deliver digital literacies programming, gather data, and leave, is problematic as it is primarily extractive. It is also merely additive, insofar as teachers are denied the opportunity to develop competencies in pedagogical practices supportive of digital literacies learning. Along the same lines, I argue that using extractive data collection tools that are distinct from classroom assessment positions research as extraneous to classroom practice; assessment of student learning is externally imposed on students by researchers and cannot be used by teachers to inform evaluation of their students’ learning after a period of instruction.

What is necessary instead is a certain methodological ethicality, represented through a commitment to fidelity and reciprocity with regard to assessment. Fidelity “begins with the researcher’s commitment to the teacher(s) and co-investigator(s), who must believe that the collaboration will benefit [their] learning and [their] students” (Schulz et al., 1997, p. 482). For assessment purposes, fidelity might refer to a congruence between research and classroom assessment practices, so that the former serves in support of the latter. Reciprocal research relationships value mutuality, as “each contributes something the other needs or desires” (Trainor & Bouchard, 2013, p. 986). Guba and Lincoln’s (1989) notion of authentic qualitative research is also useful to consider here. Authentic qualitative research is fair, with buy-in from all parties; ontologically authentic, as it raises levels of awareness about the complexity of a particular issue for participants; educatively authentic, insofar as it aims to expand the perspectives of participants; catalytically authentic, driving participants to action; and tactically authentic, empowering participants to act (Shannon & Hambacher, 2014). In an authentic approach to digital literacies research, researchers and teachers are committed to gaining deeper understanding of digital literacies learning with a specific group of students in mind. Both the researcher and educator are reflective, acknowledging challenges with supporting digital literacies learning in the classroom, alongside the limitations of traditional and authentic forms of assessment. Researchers approach research as a collaborative process (van Kraayenoord et al., 2011), respecting the expertise of teachers and the in-depth understanding they have of the needs of the individuals in their class, while scaffolding teachers’ understandings via the researchers’ own digital literacies proficiencies. With a sense of ownership over their own learning process, teachers may deepen their understanding of the types of modifications that need to be applied to classroom practice and have the agency to alter their practices in action, when possible (Schön, 1983, 1987). Importantly, assessment of students’ digital literacies learning must also maintain a sense of fidelity to best practices in assessment, which will be discussed in the next section.

Authentic Assessment to Assess Student Learning: Best Practices as the Ideal

When conducted properly, assessment is an integral part of the learning process and is used as a means of supporting competency development rather than retrospectively evaluating the attainment of said competencies (Gulikers et al., 2004; Lund, 2008). In Canadian educational systems, a push has been made to adopt assessment practices in K-12 classrooms that recognize this framing of assessment as process, that ensure assessments are authentic, and that utilize assessment as and for learning (British Columbia Ministry of Education, 2013; New Brunswick Ministry of Education, 2020; Ontario Ministry of Education, 2010). Authentic assessment falls under the umbrella of authentic learning, which refers to designing opportunities for students to construct knowledge through a process of inquiry that connects to life outside of the classroom (Newmann et al., 1995; Renzulli et al., 2004). Authentic assessment tasks require in-depth examination of an issue at hand and the application of problem-solving skills, likely with the aim of responding to a pressing need or concern (Mayer, 2002; Villarroel et al., 2018). Authentic assessment typically aligns with constructivist and experientially based approaches to learning, in which learners construct their own meanings, often through play and risk-taking (Olusegun, 2015; Riegler, 2011), and with project-based or problem-based learning, where students direct their own inquiry (Bell, 2010; Roach et al., 2018).

Also popularized is assessment as learning, which actively involves students and their peers in assessment as they reflect on learning, monitor understandings, and enhance self- and co-regulation. The aim here is to develop student autonomy and agency over the learning process (Carless & Boud, 2018; Panadero et al., 2016; Topping, 2009). Similarly, assessment for learning establishes an open and ongoing chain of communication between student(s) and teacher about a student’s progress toward particular learning goals, which creates a continual feedback loop (Black et al., 2006; Black & Wiliam, 2009; Carless, 2019). Assessment of this kind, and accompanying summative tasks, can be easily differentiated to target the strengths and interests, readiness, and learning profile of individual students (Hansen & Imse, 2016; Jones Miller, 2013; Subban, 2006; Tomlinson & Moon, 2013). Assessments for learning might be rather informal, unstructured, and unplanned (Baird et al., 2017; Bennett, 2011).

Although digital literacies assessment remains an underexplored area of research (Burke & Rowsell, 2007; Lotherington & Ronda, 2012), authentic assessment practices are, in many ways, supportive of digital literacies learning, especially in contrast to traditional assessments. Traditional assessments are largely print-centric, individualistic, monolingual, and reliant on recall and rote memorization (e.g., tests). These assessments typically undermine multimodal learning, as representation in other modes (e.g., visual, auditory) does not always translate to print-based text. Interactivity in digital contexts, a fundamental element of the learning process, is rarely captured (Egenfeldt-Nielsen et al., 2016; Jewitt, 2003). Here, assessments might become “divergent objects,” detached from the learning task and misaligned with 21st century outcomes (Aagaard & Lund, 2013; Schifter & Stewart, 2010; Silseth & Gilje, 2019).

Importantly, process- and student-oriented approaches to assessment have the potential to provide a depth of insight into student learning in-the-moment and capture learning across a broad spectrum. Using maker education as an example, examining the e-textile a student has produced tells me less about their learning than looking through various sketches from the planning stage, watching them present their prototype to and receive feedback from peers, and listening to a teacher-student conference session as the student makes sense of coding and circuitry. Similarly, comparing two students’ e-textile projects is not worthwhile without knowledge of each student’s strengths and interests, readiness, and learning profile; a novice who can patch together a barely functioning e-textile may have learned much more about electronic components than an expert who constructs a fully functioning piece of smart technology. The work of assessment within and between classrooms, then, is incongruously varied. Best practices in assessment have begun to address the necessity of accounting for these differences, but more work is needed to develop flexible, evidence-informed assessment frameworks on which teachers (and researchers) can rely to support the development of complex digital literacies practices.

Implications

This approach is not without limitations. One apparent obstacle is its time-consuming nature with respect to design, data collection, and analysis. To do this work well, researchers will need to spend considerable time discussing and reviewing assessment procedures with individual classroom teachers, and become rather immersed within the classroom environment. The approach is also deeply context-dependent. The wide variety of assessment techniques adopted in classrooms, and the varying affordances made to support individual student learning, will make comparison across contexts challenging. That said, these limitations are also its strengths; the approach requires a certain slowness with respect to research. Fidelity, reciprocity, and authenticity in research are necessarily relational; relationship-building, sitting and being with others in relation, takes time to do well. And given the vast differences in classroom compositions across Canada, being attuned to context is necessary so as to not heedlessly generalize.

In many ways, what I have offered here is not a revolutionary approach, and it might be described as a sort of loosely ethnographic process. What I have attempted to do is provide a sketch of a digital literacies methodology that is particularly attentive to the extant pedagogical and assessment methods of classroom teachers at each research site. It builds upon what is already happening in the classroom as a means of understanding how particular groups of students will be able to demonstrate their understanding. And it emphasizes assessment for and as learning, and differentiated assessment, over summative assessment. Mirroring ideal classroom practices for assessing students’ digital literacies learning will likely mean employing a whole host of authentic assessment techniques, while acknowledging that, given the gap in research around how best to assess students’ digital literacies learning in the classroom, more work still needs to be done here as classroom practice informs research praxis. Following from Cristyne’s call for methods that capture complexity and that privilege the authentic perspectives of teachers and students in action, Pamela Beach presents two methods that enable researchers to understand the complexities of sense-making with multimodal digital information sources.

Methods for Understanding Online Teacher Professional Learning

From social media sites like Pinterest to professional development websites like Reading Rockets, teachers must decipher, critically evaluate, and synthesize a range of digital texts and media published by a variety of sources (Coiro, 2011; Learning Forward, 2017). Analytical skills are essential for teachers to deliver accurate content to their students and to be productive professionals in a demanding field. In this section, I, Pamela, discuss two methods that can be used to capture teachers’ in-the-moment digital literacies practices, particularly during informal and self-directed online learning. My work centres on the dissemination of research-informed literacy practices, and how and why teachers use online environments for their professional learning (see Beach, Kirby, McDonald, & McConnel, 2019; Beach, Henderson, & McConnel, 2019; Beach & Willows, 2014, 2017). Generating this type of data can provide important feedback to website developers, professional development administrators, and teacher educators and provide insight into real-time adaptations of teaching practice.

Interviews and surveys have often been used to investigate teacher learning. These methods can offer information about how teachers view their learning, including their learning in online environments. Self-reported measures, however, are limited by participants’ recollection of past events and by the social desirability of responses. Additionally, self-reported measures do not necessarily capture moment-to-moment variations in learning (Alemdag & Cagiltay, 2018). In my own work, I have leveraged think aloud and eye tracking methods to document ongoing learning patterns and processes (Beach, Kirby, McDonald, & McConnel, 2019; Desjarlais, 2017).

Background Information

The think aloud stems from introspection analysis, a form of data collection aimed at investigating psychological claims and theories of mind (Boren & Ramey, 2000; Ericsson, 2002). The think aloud became an approach that researchers used to generate data on thinking during cognitive tasks (Ericsson, 2003). Cognitive processes underlying decisions and behaviours are usually “hidden from direct observation” (Gaissmaier et al., 2010, p. 141). The think aloud method, however, can provide direct data about the ongoing cognitive processes and practices that occur during a task performance (Jaspers, 2009).

Over the past several decades, variations of the think aloud have been used in educational research, including the concurrent and retrospective procedures. During the concurrent think aloud, participants complete a task while simultaneously verbalizing their thoughts. For the retrospective think aloud, verbalizations are made after a task has been completed. Although these think aloud methods provide data about the thinking process, studies have found that both the concurrent and retrospective procedures have serious flaws (Beach & Willows, 2017; McDonald et al., 2012; van Gog et al., 2009). The high cognitive demand of the concurrent think aloud often results in surface-level verbalizations (McDonald et al., 2012), as the load on working memory diminishes the quality of what participants can report. And given that the retrospective condition asks participants to verbalize their thoughts without any memory aids, large portions of their decision-making processes are often omitted (Beach & Willows, 2017; Branch, 2006). A third family of think aloud procedures, including the virtual revisit think aloud, has the potential to avoid the limitations of both.

The goal of the virtual revisit think aloud is to allow participants to verbalize their thoughts about their literacies practices and the strategies they use during literacies events by using a screen-capture recording of participants’ navigational experiences (Beach & Willows, 2014). Similar to cued retrospective reporting where participants are given instructions to think back using a record of observations (van Gog et al., 2005), the virtual revisit think aloud combines a retrospective think aloud with screen capture technology. As participants view a recording of their navigational experiences, they think aloud about their literacies practices, strategies, and reasons for their decisions.
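
To make the screen-capture component of this method concrete, the sketch below shows one way a researcher might record a participant’s navigation for later replay. The studies cited here do not specify their recording software, so everything in this sketch is an assumption for illustration: it uses the Python `mss` and `opencv-python` libraries, and the file name is hypothetical.

```python
import time

import cv2   # opencv-python
import mss
import numpy as np


def record_navigation(outfile: str, minutes: float = 20, fps: int = 5) -> None:
    """Record the participant's screen so the recording can be replayed
    to them during the retrospective (virtual revisit) think aloud."""
    with mss.mss() as sct:
        monitor = sct.monitors[1]  # the primary display
        size = (monitor["width"], monitor["height"])
        writer = cv2.VideoWriter(outfile, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
        end_time = time.time() + minutes * 60
        while time.time() < end_time:
            frame = np.array(sct.grab(monitor))              # BGRA screenshot
            writer.write(cv2.cvtColor(frame, cv2.COLOR_BGRA2BGR))
            time.sleep(1 / fps)                              # approximate frame pacing
        writer.release()


# Hypothetical usage during a one-on-one session:
# record_navigation("participant_07_navigation.mp4", minutes=20)
```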

Along with the think aloud, eye tracking methodology can provide insight into how information is processed during learning. Eye tracking is based on the assumption that there is a correlation between how long something is fixated and how long it is processed (Just & Carpenter, 1980). The argument is that visual attention and cognitive processing occur almost simultaneously so that information is perceived and processed at a cognitive level (Scheiter & Eitel, 2017). Scan paths and fixations are most often used to determine learners’ sequences of attention, providing a window into how learners might approach a task and the patterns of their learning.
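
Because fixations and scan paths anchor this method, a brief sketch of how fixations are typically derived from raw gaze samples may help. Commercial eye-tracking software performs this step internally; the dispersion-threshold (I-DT) logic below is a generic, simplified illustration, and the sample format and thresholds are our assumptions, not details of the studies discussed here.

```python
from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp in ms, x in px, y in px)


def detect_fixations(samples: List[Sample],
                     max_dispersion: float = 25.0,    # pixels, assumed threshold
                     min_duration: float = 100.0):    # milliseconds, assumed threshold
    """Group consecutive gaze samples into fixations: windows whose spatial
    dispersion stays under max_dispersion for at least min_duration."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while its dispersion stays within the threshold.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            n = j - i + 1
            fixations.append({
                "start_ms": samples[i][0],
                "dur_ms": duration,
                "x": sum(s[1] for s in samples[i:j + 1]) / n,  # centroid
                "y": sum(s[2] for s in samples[i:j + 1]) / n,
            })
            i = j + 1   # continue after the fixation
        else:
            i += 1      # slide past a saccadic or noisy sample
    # The ordered list of fixations constitutes the scan path.
    return fixations
```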

The number of studies that have used eye tracking to study digital literacies practices and online learning across a range of ages and participants is relatively low (Alemdag & Cagiltay, 2018; Salmerón et al., 2017). Yet, eye tracking technology has long enabled researchers to make inferences about how learners process information in different formats (Alemdag & Cagiltay, 2018). Employing eye tracking to document the moment-to-moment processes that occur during learning experiences can generate information about learning processes that might not otherwise be articulated by participants.

Case Examples

Two studies are shared below, highlighting the virtual revisit think aloud and eye tracking as methods for understanding the complexity of teachers’ in-the-moment digital literacies practices and online learning processes.

          Case 1: Using the Virtual Revisit Think Aloud for Understanding Self-Directed Online Learning. A recent study that used the virtual revisit think aloud as the main data source investigated the types of cognitive processes and strategies used by experienced teachers as they engaged in a self-directed online learning (SDOL) task (Beach, Henderson, & McConnel, 2019). Specifically, the study examined how elementary teachers plan, monitor, and evaluate their learning during SDOL. During one-on-one sessions, thirteen participants were asked to use an online database as they would normally do when seeking out information related to their professional practice. The homepage of the database appeared on the screen at the start of participants’ navigation. For 20 minutes, participants explored the database, its resources, as well as external links and webpages without any prompts or discussion; they perused the sites at their own pace, making selections based on their interest and on what they found meaningful to their teaching practice. At the end of the 20 minutes, participants were given the following instructions:

I would like to invite you to view a screen-recording of your navigation. While you view the screen-recording I would like to ask you to think aloud. What I mean by think aloud is that I want you to tell me everything that you were thinking from the time you began exploring the database until the end of your exploration. I would like you to talk aloud constantly. I don’t want you to try to plan out what you say or try to explain to me what you are saying. Just act as if you are alone in the room speaking to yourself. It is most important that you keep talking. (Beach, Henderson, & McConnel, 2019)

A general inductive approach to analysis resulted in six themes related to the types of cognitive processes and metacognitive strategies the participants used during the SDOL task: connecting to practice, tweaking and adapting, narrowing the focus, skimming through, reading for depth, and source credibility. The complexity of the cognitive and metacognitive processes at play during the instance of SDOL explored in this study indicates that SDOL is far more than a simple, straightforward process. For the teachers in this study, instances of SDOL and digital literacies practices revealed an interconnected, iterative process of overlapping and complementary strategies. The strategies related to planning, monitoring, and evaluating emerged not in a linear pattern, but in an iterative process, with participants shifting their focus between the three navigational orientations as they worked toward learning about materials that they deemed useful, credible, and appropriate for their unique classroom contexts.

All successful readers, whether reading online or off, are metacognitive (Afflerbach et al., 2013). Successful readers actively self-monitor their learning and are aware of their thinking and learning processes. They develop an understanding of what they need in order to reach their learning goals (Afflerbach et al., 2013; Flavell, 1979); they take conscious control of their actions and assume responsibility for their learning. This has direct implications for teachers as they engage in digital literacies practices. Teachers select professional learning opportunities and materials that are usually subject-specific and align with their professional goals. This selection process also tends to shift according to the dynamic nature of the classroom and student needs. Given that in Canada the vast majority of teachers (>90%) engage in various forms of professional learning (Canadian Teachers’ Federation, 2014; Campbell et al., 2016), monitoring the use of particular types of online resources is “challenging, yet essential” (Learning Forward, 2017). The virtual revisit think aloud can provide teachers with an authentic space to voice their learning processes and literacies practices.

          Case 2: Capturing Patterns of Behaviour Using Eye Tracking. Eye tracking was used in a recent exploratory study to investigate the patterns of visual behaviour of experienced elementary teachers and pre-service teachers in an initial teacher education program while they studied a visual model showing key concepts in reading development and instruction called The Reading Pyramid (Beach, Kirby, McDonald, & McConnel, 2019). Similar to The Cognitive Foundations of Learning to Read Framework (Wren, 2000), The Reading Pyramid illustrates the building blocks of reading by organizing these components into two main groups—print-related skills (those that promote the ability to recognize words) and language-related skills (those that support the ability to make meaning of text).

Seven experienced teachers and 11 pre-service teachers participated. Visual attention, prior knowledge, and post-task scores were analyzed using quantitative and qualitative methods. Statistically significant differences between the two groups were found with respect to fixations, scan paths, and pre- and post-task scores (p < .05). The experienced teachers showed more complex eye movement patterns, transitioning between the visual image of the pyramid and the accompanying text more often than the pre-service teachers. Experienced teachers also showed a pattern of visual behaviour in which their eyes moved between information in the pyramid and the directly corresponding keywords in the text. As noted in the corresponding article by Beach, Kirby, McDonald, and McConnel (2019), it is possible that experienced teachers’ prior knowledge contributed to a stronger connection between the information in the pyramid and the corresponding text than it did for the pre-service teachers, and to a more integrative pattern of visual behaviour. Participants with higher levels of prior knowledge about reading development and more teaching experience may have had a greater interest in the presented material and thus showed a deeper level of information processing.
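
For readers curious about the shape of such a group comparison, the sketch below contrasts per-participant fixation counts across the two groups. The original article does not specify its tests at this level of detail, so the choice of a Mann-Whitney U test (a reasonable nonparametric option for groups of 7 and 11) and all of the numbers here are hypothetical.

```python
# Hypothetical comparison of fixation counts between experienced and
# pre-service teachers; the values below are illustrative, not study data.
from scipy.stats import mannwhitneyu

experienced = [212, 198, 247, 230, 205, 221, 239]                      # n = 7
pre_service = [151, 176, 160, 182, 149, 171, 158, 166, 180, 155, 169]  # n = 11

stat, p = mannwhitneyu(experienced, pre_service, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")  # p < .05 would mirror the reported pattern
```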

Implications

Eye tracking and think aloud methods each provide unique datasets related to learners’ digital literacies practices, including their behavioural patterns and cognition: eye tracking reveals automatic processing that may escape conscious awareness, while think alouds offer insight into learners’ decision-making strategies that may not be reflected in eye movements. As teachers, and especially novice teachers, turn to online and digital resources for their professional learning, it is essential to examine the behavioural patterns and thought processes involved, and the feasibility of this learning. Understanding the underlying processes at play has the potential to facilitate the design of effective training and learning platforms that promote more efficient visual search patterns. Perhaps more importantly, teachers who engage in these types of methods may gain deeper understandings of their own strategies for meaning making with multimodal digital texts. In the studies described above and others that I have conducted, participants have often expressed how the experience of the virtual revisit think aloud, in particular, created a self-awareness of their literacies practices and the strategies they privilege during digital literacies events; through these types of reflective methodologies, teachers can become mindful of their own learning, which can inform their pedagogical decision making. Moreover, these methods can inform the design of accessible online resources that are conducive to teacher learning which, in turn, contribute to improvements for students.

In the next section, Megan Cotnam-Kappel and Michelle Hagerman offer an overview of the ways that wearable technology can enable researchers to capture the movements, meaning-making practices, and insights of children as they conduct online inquiries (Coiro et al., 2019) and craft physical objects at school.

Maker Methodologies and “Spy” Methods: Innovations and Puzzles

In today’s digitally networked world, we must design research methods that are theory-driven and as dynamic as the social challenges we aim to study. In a collaborative four-year study focused on understanding children’s literacies practices while making (Halverson & Sheridan, 2014; Wohlwend et al., 2017), we, Megan and Michelle, adopted a design-based research methodology (The Design-Based Research Collective, 2003; Wang & Hannafin, 2005) that enabled our data collection and analyses to evolve with students’ maker projects, and teachers’ pedagogical decision making (see Hagerman & Cotnam-Kappel, 2019; Hagerman, Cotnam-Kappel, Turner, & Hughes, 2019; Hartman, Hagerman, & Leu, 2018). To capture the dynamic complexities of children’s digital literacies practices in action, we invited students to wear Diggro “spy glasses”, also known as “point of view video glasses” (Jaldemark et al., 2019; Metcalfe et al., 2015), as they worked through phases of a digital-physical maker project. Unlike eye-tracking technology that has traditionally required participants to read at a fixed computer station outfitted with specialised software in a laboratory setting (e.g., Lévesque et al., 2014; Mason et al., 2013), spy glasses can be worn by students to capture the lived hullabaloo of activity in classrooms, and at a fraction of the cost of similarly “mobile” eye-tracking glasses (e.g., Tobii.com).

For students, this project involved (a) explorations of ideas and plan-making through cycles of online inquiry; (b) instrument making with recycled materials; and (c) multimodal composition and synthesis of their making processes (cf., Hagerman, Cotnam-Kappel, Turner, & Hughes, 2019). The spy glasses used in our study have a wide-angle video camera located in the bridge of the plastic frames that can capture up to 60 minutes of video and audio at the touch of a small button. A USB cable enables the charging of the device and the downloading of recordings onto a computer. Here, we explore the emerging trend of wearable technology for data collection in education and examine the practical and theoretical implications of using spy glasses in the classroom.

Data Collection with Wearable Technologies

Wearable technologies such as spy glasses are part of a larger innovative “shift from computers as detached tools to technologies as embodied companions that become extensions of self” (Bower & Sturman, 2015, p. 344) and, as such, enable the collection of data on embodied learning within classroom contexts (Hagerman & Cotnam-Kappel, 2019). Yet, the use of spy glasses as wearable data collection tools is an underreported practice (Jaldemark et al., 2019). This novel data collection approach allowed us to capture students’ classroom activities from their perspective as they moved through cycles of online reading and research (Coiro et al., 2016), multimodal writing (Honeyford, 2014; Smith, 2014), and physical making activities (Clapp et al., 2016; Martin, 2015). Importantly, the spy glasses videos recorded students’ gaze during in-the-moment conversations with their teacher, peers, and research team members. These recordings included all of the activities and literacies practices that, when analyzed for trends, frequencies, and themes, can be compared across and within cases over time. For example, we identified clusters of strategic action taken before, during, and after online inquiry, in ways that extend current conceptions of grade five students’ independent and collaborative meaning-making practices during online reading and research (Coiro et al., 2019; Cho & Afflerbach, 2017) and through phases of making (Clapp et al., 2016). Despite these methodological affordances, the use of spy glasses did raise ethical and methodological questions relating to privacy, data management, technological limitations, and data analysis.

Case Example: “Spy” Methods in the Classroom

Collecting rich point-of-view data from students enabled us to capture socially-situated literacies practices in action. In developing cases, our analyses were strengthened by video from the student’s point-of-view (POV) as well as other classmates’ POV. We saw students from different perspectives and angles (albeit at times unstable ones) engaging in different conversations. This combination of perspectives also allowed us to observe students’ sensory, embodied meaning making in ways that we had not anticipated, but that became impossible to ignore (Hagerman & Cotnam-Kappel, 2019). In this way, the spy glasses opened important theoretical considerations that have extended our understanding of the ways that children construct meanings with texts, with tools, using multiple languages, and in collaboration with one another at school.

Data management proved to be a significant challenge, however. Gaining this in-depth view of children’s learning requires the downloading, organizing, and analysing of many hours of video. With 20 students in a classroom, all wearing spy glasses, one hour of learning becomes 20 hours of data to analyse. These videos need to be transferred manually, via USB cable, and subsequently the devices need to be charged for the following recording session, which is an additional workload that needs to be considered before engaging in this method. The data management workload forced us to consider which moments in learning were most important to capture with spy glasses, and which moments could be documented using other methods—in field notes, or with photographs, for example.
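
As a sense of what managing this workload can look like, the sketch below ingests clips from one pair of glasses into a per-student, per-session archive so recordings stay organized and sortable. The mount point, file extension, and naming scheme are hypothetical; in our study, videos were simply transferred manually over USB.

```python
import shutil
from datetime import date
from pathlib import Path


def ingest(device_mount: Path, archive_root: Path,
           student_id: str, session: str) -> None:
    """Copy every clip off one pair of spy glasses into a per-student,
    per-session folder, renaming files so they remain sortable."""
    dest = archive_root / student_id / f"{date.today():%Y-%m-%d}_{session}"
    dest.mkdir(parents=True, exist_ok=True)
    for n, clip in enumerate(sorted(device_mount.glob("*.avi")), start=1):
        target = dest / f"{student_id}_{session}_clip{n:02d}.avi"
        shutil.copy2(clip, target)  # copy2 preserves file timestamps
        print(f"copied {clip.name} -> {target}")


# Hypothetical paths and identifiers:
# ingest(Path("/media/spyglasses01"), Path("data/video"),
#        student_id="S07", session="inquiry2")
```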

Importantly, this wearable data collection method positioned our student participants as agentive actors in their digital-physical making and literacies learning (Christensen & James, 2017), enabling us (the researchers) to collect data from their perspective while enabling them to opt in or out of data gathering: children had the choice to turn their cameras on and off at any time, to ask questions while filming, to take a stroll around the classroom to help collect data, and to share their experiences with us while they recorded. Before formal data collection began, we gave students time to explore the glasses, and we showed examples of clips recorded during this exploration time so that students would have an idea of what the glasses capture while recording. While we consider this exploration time to be essential, it can certainly be seen as a challenge for design-based researchers who already negotiate the scheduling of very precious classroom time with teaching colleagues.

In addition, the exploration time and our subsequent data collection sessions surfaced numerous ethical tensions regarding children’s privacy, most notably because we were not able to ensure that the few students who did not consent were absent from other students’ recordings. While data regarding these students were not analysed, and exchanges that focused on these students were deleted, we could not control this element of the data collection. Indeed, the use of wearables for data collection necessarily meant relinquishing control. Similarly, given that turning the camera on and off was as easy as the touch of a small button on the side of the frames, some children accidentally turned off a camera when they meant to record, so one can imagine that others may have recorded footage when they meant to turn their camera off. Here, we note the importance of asking children to review collected materials before the data are analysed and disseminated, in a manner similar to asking interview participants to review their transcripts; we did not anticipate this need, however, nor did we have the opportunity or time with each student in the course of this data collection to do so. While we believe such review would strengthen our research, we also anticipate that this co-analysing activity could represent a meaningful metacognitive learning opportunity for students to reflect on their own experiences and learning (Bower & Sturman, 2015), and something of an innovation on the virtual revisit think aloud method used by Pamela in her work.

Implications

The use of spy glasses as a method of wearable data collection is an undoubtedly complex and time-consuming practice that, in our experience, leads to rich and complex data on in-the-moment learning from both the learner’s and their peers’ perspectives. Gathering first-person gaze recordings of students’ digital literacies practices, including online reading, writing, and participating (Lankshear & Knobel, 2008; Leu et al., 2019), as well as both digital and physical making (Hagerman, Cotnam-Kappel, Turner, & Hughes, 2019), provided a rich, contextual record of in-the-moment activity that would have been missed by video recordings fixed to an outsider/observer perspective, or constrained to a screen via screen capture. We also wish to emphasize the considerable learning opportunities that POV wearable tools present (Bower & Sturman, 2015; Metcalfe et al., 2015) and recommend that future research explore the intersections of data collection, teaching, learning, and (self-)assessment. The use of spy glasses can empower children within the research process, as researchers must relinquish control of aspects of data collection, which minimizes power differentials and positions children as “co-producing agents of results” (Jaldemark et al., 2019, p. 1302). We also believe that further reflection is required to explore the ethical ramifications of using wearables in research settings, particularly when a study involves children. We advocate for creating space for youth voice and meaningful consultation with participants at all stages of the research process: before, during, and after data are collected and disseminated. This will allow researchers to consider participants’ questions, experiences, and evolving consent or privacy needs.

Discussion

Although our work has been informed by methods, theories, and practices designed, tested, and refined in diverse educational contexts globally, the need for research methods that can generate urgently needed understandings of digital literacies teaching and learning practices across complex contexts of Canadian schooling has never been greater. As we put the finishing touches on this manuscript, COVID-19 has changed the landscape of schooling in this country, perhaps forever. After a period of emergency distance learning, provinces and territories are currently planning for new models of instruction that will ensure continuity of connection and also maintain safe physical distancing in school communities. Inevitably, online learning will continue to factor into these models, and at a scale never before realized in this country. At the heart of these conversations, however, significant questions remain about the ways that teachers and their students make meanings from digital texts, for diverse purposes, using a variety of sense-making practices, and in digital and physical learning environments. More than ever, we need digital literacies research that is situated in these learning spaces and that continues to build on the fundamental insights of previous work.

As a group of digital literacies researchers (The Digital Literacies Collective) grappling with the significant challenges of this moment, we offer three essential methodological considerations that, in our view, are likely to generate much-needed, transformational insights of value for teachers, students, communities, and policy makers in Canada today.

First, whether online or offline, we assert that new understandings of digital literacies teaching and learning practices will come from research designs and research methods that call attention to the complexities of literacies activities in-the-moment. For us, methods that capture processes not only align with current conceptions and practices of assessment in systems of Canadian schooling today, but will also enable us to develop deeper understandings of the phenomena we seek to understand. As teachers critically analyse information and synthesize understandings from multiple sources, or when students search for information, the virtual revisit think aloud using screen capture, eye-tracking, or spy glasses video data could open new perspectives on the social, contextual, technological, and pedagogical predictors of digital literacies practices, and how they develop. Methods that allow us to linger in process, and in moments of sustained focus, will enable us, as a research community, to generate new and fundamental understandings of how, when, and why meanings are constructed with digital texts and tools and in a range of activities.

Second, our research methods must embrace, rather than seek to control for, the messiness of human-centred sense-making with digital texts and tools in contexts of Canadian schooling. Methods of data collection and analysis must allow for what Cristyne calls generative complexity, becoming fluid methodological spaces that not only push back against the constraints of extant methodological frameworks, but also allow for complexity and surprise (Koro-Ljungberg, 2016). In this way too, the practices we observe are never isolated from the places, cultures, languages, and communities in which our teachers and students live and work. Consistent with our focus on process, the literacies products that we analyse and that we take to represent evidence of digital literacies learning or understanding must also be interpreted as place-based assemblages (cf., Nichols & Stornaiuolo, 2019) and as evidence of the lived entanglements of Canadian classrooms.

Third, our research methods must be authentic and ethically reciprocal. This means that the texts, the activities, and the guiding frameworks included in our digital literacies research should be co-determined with partners in ways that reflect the meanings and practices of importance for them. Our research designs and methods must attend to the values, needs, and orientations of the communities in which we work; we must leverage methods that are adaptive, dynamic, and flexible rather than prescriptive or pre-determined. In this way, the field of digital literacies research in Canada can begin to build place-based insights that account for who people are, how they teach, and how they learn to be digitally literate.

At a time when there is an urgent need for recommendations on how to teach particular digital literacies skills, research methods that require a great deal of time and effort may seem misguided. Some might say that quick, extractive methods that help us to identify needs would mobilize a broader foundation of understanding in much less time. And yet, without methods that bring us into critical moments of sense-making, we risk surface-level insights that are disconnected from the contexts in which learning happens. By going slow, we can build a stronger and more flexible foundation for digital literacies teaching and policies in Canadian systems of schooling.

Conclusion

In Canada, the field of digital literacies research is at an important crossroads. To construct deeper and more authentic understandings of the meaning-making processes, practices, and activities that enable teachers and students to become digitally literate, we assert that our research methods must embrace the complexities of Canadian contexts of schooling. In particular, we argue for methods that enable in-the-moment insights, embrace methodological and contextual messiness, and that prioritise authentic, ethical reciprocity in their conceptualisation and use. A shared commitment to such depth will enable us to construct a stronger and more flexible foundation on which to design new digital literacies policies and instructional practices that serve Canadian students.

References

Aagaard, T., & Lund, A. (2013). Mind the gap: Divergent objects of assessment in technology-rich learning environments. Nordic Journal of Digital Literacy, 8(4), 225–243. https://www.idunn.no/dk/2013/04/mind_the_gap_divergent_objects_of_assessment_in_technology

Afflerbach, P., Cho, B.-Y., Kim, J.-Y., Crassas, M. E., & Doyle, B. (2013). Reading: What else matters besides strategies and skills? The Reading Teacher, 66(6), 440–448. https://doi.org/10.1002/TRTR.1146

Alemdag, E., & Cagiltay, K. (2018). A systematic review of eye tracking research on multimedia learning. Computers & Education, 125, 413–428. https://doi.org/10.1016/j.compedu.2018.06.023

Baird, J., Andrich, D., Hopfenbeck, T. N., & Stobart, G. (2017). Assessment and learning: Fields apart? Assessment in Education: Principles, Policy & Practice, 24(3), 317–350. https://doi.org/10.1080/0969594X.2017.1319337

Beach, P., Henderson, G., & McConnel, J. (2019, June 1–5). Examining elementary teachers’ learning experiences as they use the Canadian Financial Literacy Database [Conference presentation]. Canadian Society for the Study of Education (CSSE) Annual Conference, Vancouver, BC. https://csse-scee.ca/conference-2019/

Beach, P., Kirby, J., McDonald, P., & McConnel, J. (2019). How do elementary teachers study and learn from a multimedia model of reading development? An exploratory eye-tracking study. Canadian Journal of Education, 42(4), 1022–1058. https://journals.sfu.ca/cje/index.php/cje-rce/article/view/3919

Beach, P., & Willows, D. (2014). Investigating teachers’ exploration of a professional development website: An innovative approach to understanding the factors that motivate teachers to use Internet-based resources. Canadian Journal of Learning and Technology, 40(3). http://dx.doi.org/10.21432/T2RP47

Beach, P., & Willows, D. (2017). Understanding teachers’ cognitive processes during online professional learning: A methodological comparison. Online Learning, 21(1), 60–84. http://dx.doi.org/10.24059/olj.v21i1.949

Bell, S. (2010). Project-based learning for the 21st century: Skills for the future. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 83(2), 39–43. https://doi.org/10.1080/00098650903505415

Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5–25. https://doi.org/10.1080/0969594X.2010.513678

Bergstrom, K., Jenson, J., Flynn-Jones, E., & Hébert, C. (2018). Videogame walkthroughs in educational settings: Challenges, successes, and suggestions for future use. Proceedings of the 51st Hawaii International Conference on System Sciences. https://doi.org/10.24251/HICSS.2018.237

Bhatt, I., de Roock, R., & Adams, J. (2015). Diving deep into digital literacy: Emerging methods for research. Language and Education, 29(6), 477–492. https://doi.org/10.1080/09500782.2015.1041972

Black, P., McCormick, R., James, M., & Pedder, D. (2006). Learning how to learn and assessment for learning: A theoretical inquiry. Research Papers in Education, 21(2), 119–132. https://doi.org/10.1080/02671520600615612

Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5–31. https://doi.org/10.1007/s11092-008-9068-5

Boren, M. T., & Ramey, J. (2000). Thinking aloud: Reconciling theory and practice. IEEE Transactions on Professional Communication, 43(3), 261–278. https://doi.org/10.1109/47.867942

Bower, M., & Sturman, D. (2015). What are the educational affordances of wearable technologies? Computers & Education, 88, 343–353. https://doi.org/10.1016/j.compedu.2015.07.013

Branch, J. (2006). Using think alouds, think afters, and think togethers to research adolescents’ inquiry experiences. Alberta Journal of Educational Research, 52(3), 148–159. https://journalhosting.ucalgary.ca/index.php/ajer/article/view/5515

British Columbia Ministry of Education. (2013). Core competencies. https://curriculum.gov.bc.ca/competencies

Brownell, C. J., & Wargo, J. M. (2017). (Re)educating the senses to multicultural communities: Prospective teachers using digital media and sonic cartography to listen for culture. Multicultural Education Review, 9(3), 201–214. https://doi.org/10.1080/2005615X.2017.1346559

Burke, A., & Rowsell, J. (2007). Assessing multimodal learning practices. E-Learning, 4(3), 329–342. https://doi.org/10.2304/elea.2007.4.3.329

Campbell, C., Osmond-Johnson, P., Faubert, B., Zeichner, K., Hobbs-Johnson A., Brown, S., DaCosta, P., Hales, A., Kuehn, L., Sohn, J., & Steffensen, K. (2016). The state of educators’ professional learning in Canada. Learning Forward. https://learningforward.org/wp-content/uploads/2017/08/state-of-educators-professional-learning-in-canada-executive-summary.pdf

Canadian Teachers’ Federation. (2014). Highlights of CTF survey on the quest for teacher work-life balance. https://www.ctf-fce.ca/Research-Library/Work-Life-Balance-Survey-DW-CAPTO.pdf

Carless, D. (2019). Feedback loops and the longer-term: Towards feedback spirals. Assessment and Evaluation in Higher Education, 44(5), 705–714. https://doi.org/10.1080/02602938.2018.1531108

Carless, D., & Boud, D. (2018). The development of student feedback literacy: Enabling uptake of feedback. Assessment and Evaluation in Higher Education, 43(8), 1315–1325. https://doi.org/10.1080/02602938.2018.1463354

Cho, B.-Y., & Afflerbach, P. (2017). An evolving perspective of constructively responsive reading comprehension strategies in multilayered digital text environments. In S. E. Israel (Ed.), Handbook of research on reading comprehension (2nd ed., pp. 109–134). Guilford Press.

Christensen, P., & James, A. (2017). Introduction: Researching children and childhood: Cultures of communication. In P. Christensen & A. James (Eds.), Research with children: Perspectives and practices (3rd ed., pp. 1–10). Routledge.

Clapp, E., Ross, J. O., Ryan, J., & Tishman, S. (2016). Maker-centered learning: Empowering young people to shape their worlds. Jossey-Bass.

Coiro, J. (2011). Talking about reading as thinking: Modeling the hidden complexities of online reading comprehension. Theory into Practice, 50(2), 107–115. https://doi.org/10.1080/00405841.2011.558435

Coiro, J. (2020). Toward a multifaceted heuristic of digital reading to inform assessment, research, practice, and policy. Reading Research Quarterly, 1–23. https://doi.org/10.1002/rrq.302

Coiro, J., Castek, J., & Quinn, D. (2016). Personal inquiry and online research: Connecting learners in ways that matter. The Reading Teacher, 69(5), 483–492. https://doi.org/10.1002/trtr.1450

Coiro, J., Dobler, E., & Pelekis, K. (2019). From curiosity to deep learning: Personal digital inquiry in grades K-5. Stenhouse Publishers.

The Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5–8.

Desjarlais, M. (2017). The use of eye-gaze to understand multimedia learning. In C. Was, F. Sansosti, & B. Morris (Eds.), Eye-tracking technology applications in educational research (pp. 122–142). IGI Global.

Duke, N. K., & Mallette, M. H. (2001). Critical issues: Preparation for new literacy researchers in multi-epistemological, multi-methodological times. Journal of Literacy Research, 33(2), 345–360. https://doi.org/10.1080/10862960109548114

Egenfeldt-Nielsen, S., Smith, J. H., & Tosca, S. P. (2016). Understanding video games: The essential introduction (3rd ed.). Routledge.

Ericsson, K. (2002). Towards a procedure for eliciting verbal expression of non-verbal experience without reactivity: Interpreting the verbal overshadowing effect within the theoretical framework for protocol analysis. Applied Cognitive Psychology, 16, 981–987. https://doi.org/10.1002/acp.925

Ericsson, K. (2003). Valid and non-reactive verbalization of thoughts during performance of tasks: Towards a solution to the central problems of introspection as a source of scientific data. Journal of Consciousness Studies, 10(9–10), 1–18.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906–911. https://doi.org/10.1037/0003-066X.34.10.906

Fong, K., Jenson, J., & Hébert, C. (2018). Challenges with measuring learning through digital gameplay in K-12 classrooms. Media and Communication, 6(2). https://doi.org/10.17645/mac.v6i2.1366

Gaissmaier, W., Fifić, M., & Rieskamp, J. (2010). Analyzing response times to understand decision processes. In M. Schulte-Mecklenbeck, A. Kühberger, & R. Ranyard (Eds.), The handbook of process tracing methods for decision research: A critical review and user’s guide (pp. 89–114). Psychology Press.

Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. Sage.

Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86. https://doi.org/10.1007/BF02504676

Hagerman, M. S., & Cotnam-Kappel, M. (2019). Making as embodied learning: Rethinking the importance of movement for learning with digital and physical tools. Education Review, 6(2), 1–3. https://education.uottawa.ca/sites/education.uottawa.ca/files/uo_fefe_re_fall_06_acc_02_0.pdf

Hagerman, M. S., Cotnam-Kappel, M., Turner, J.-A., & Hughes, J. M. (2019, April 8). Layers of online reading, research and multimodal synthesis practices while making: A descriptive study of three fifth-grade students [Roundtable paper presentation]. American Educational Research Association Annual Meeting, Toronto, ON, Canada. http://mschirahagerman.com/wp-content/uploads/2019/04/AERA_2019_fairemaker.pdf

Halverson, E. R., & Sheridan, K. (2014). The maker movement in education. Harvard Educational Review, 84(4), 495–505. https://doi.org/10.17763/haer.84.4.34j1g68140382063

Hansen, D., & Imse, L. A. (2016). Student-centered classrooms. Music Educators Journal, 103(2), 20–26. https://doi.org/10.1177/0027432116671785

Hartman, D. K., Hagerman, M. S., & Leu, D. J. (2018). Towards a new literacies perspective of synthesis. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of multiple source use (pp. 55–78). Routledge.

Hartman, D. K., Morsink, P. M., & Zheng, J. J. (2010). From print to pixels: The evolution of cognitive conceptions of reading comprehension. In E. A. Baker (Ed.), The new literacies: Multiple perspectives and practice (pp. 131–164). Guilford Press.

Hébert, C., & Jenson, J. (2019). Digital game-based pedagogies: Developing teaching strategies for game-based learning. The Journal of Interactive Technology & Pedagogy, 15. https://jitp.commons.gc.cuny.edu/digital-game-based-pedagogies-developing-teaching-strategies-for-game-based-learning/

Hébert, C., & Jenson, J. (2020). Making in schools: Student learning through an e-textiles curriculum. Discourse: Studies in the Cultural Politics of Education, 41(5), 740–761. https://doi.org/10.1080/01596306.2020.1769937

Honeyford, M. A. (2014). From Aquí and Allá: Symbolic convergence in the multimodal literacy practices of adolescent immigrant students. Journal of Literacy Research, 46(2), 194–233. https://doi.org/10.1177/1086296X14534180

Hughes, J. M. (2017). Digital making with “at-risk” youth. International Journal of Information and Learning Technology, 34(2), 102–113. https://doi.org/10.1108/IJILT-08-2016-0037

Jaldemark, J., Bergström-Eriksson, S., von Zeipel, H., & Westman, A. K. (2019). Wearable technologies as a research tool for studying learning. In Y. A. Zhang, & D. Cristol (Eds.), Handbook of mobile teaching and learning (pp. 1291–1305). Springer.

Jaspers, M. W. (2009). A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence. International Journal of Medical Informatics, 78(5), 340–353. https://doi.org/10.1016/j.ijmedinf.2008.10.002

Jewitt, C. (2003). Re-thinking assessment: Multimodality, literacy and computer-mediated learning. Assessment in Education: Principles, Policy and Practice, 10(1), 83–102. https://doi.org/10.1080/09695940301698

Jones Miller, J. (2013). A better grading system: Standards-based, student-centered assessment. English Journal, 103(1), 111–118.

Just, M. A., & Carpenter, P. A. (1980). A theory of reading: From eye fixations to comprehension. Psychological Review, 87(4), 329–354. https://doi.org/10.1037/0033-295X.87.4.329

Knobel, M., & Lankshear, C. (2017). Researching new literacies: Design, theory, and data in sociocultural investigation. Peter Lang.

Koro-Ljungberg, M. (2016). Reconceptualizing qualitative research: Methodologies without methodology. Sage.

Kress, G. R. (2003). Literacy in the new media age. Psychology Press.

Lankshear, C., & Knobel, M. (Eds.). (2008). Digital literacies: Concepts, policies and practices. Peter Lang.

Learning Forward. (2017). Standards for professional learning. https://learningforward.org/standards/resources

Leu, D., Kinzer, C., Coiro, J., Castek, J., & Henry, L. (2019). New literacies: A dual-level theory of the changing nature of literacy, instruction, and assessment. In D. Alvermann, N. Unrau, M. Sailors, & R. Ruddell (Eds.), Theoretical models and processes of literacy (7th ed., pp. 319–346). Routledge.

Lévesque, S., Ng-A-Fook, N., & Corrigan, J. (2014). What does the eye see? Reading online primary source photographs in history. Contemporary Issues in Technology and Teacher Education, 14(2), 101–140. https://citejournal.org/volume-14/issue-2-14/social-studies/what-does-the-eye-see-reading-online-primary-source-photographs-in-history/

Lotherington, H., & Ronda, N. S. (2012). Multimodal literacies and assessment: Uncharted challenges in the English classroom. In C. Leung & B. Street (Eds.), English as a changing medium for education (pp. 104–128). UTP.

Lund, A. (2008). Assessment made visible: Individual and collective practices. Mind, Culture, and Activity, 15(1), 32–51. https://doi.org/10.1080/10749030701798623

Marshall, S., & Marr, J. W. (2018). Teaching multilingual learners in Canadian writing-intensive classrooms: Pedagogy, binaries, and conflicting identities. Journal of Second Language Writing, 40, 32–43. https://doi.org/10.1016/j.jslw.2018.01.002

Martin, L. (2015). The promise of the maker movement for education. Journal of Pre-College Engineering Education Research (J-PEER), 5(1), 30–39. https://doi.org/10.7771/2157-9288.1099

Mason, L., Tornatora, M.C., & Pluchino, P. (2013). Do fourth graders integrate text and picture in processing and learning from an illustrated science text? Evidence from eye-movement patterns. Computers & Education, 60(1), 95–109. https://doi.org/10.1016/j.compedu.2012.07.011

Mayer, R. E. (2002). Rote versus meaningful learning. Theory into Practice, 41(4), 226–232. https://doi.org/10.1207/s15430421tip4104_4

McDonald, S., Edwards, H., & Zhao, T. (2012). Exploring think-alouds in usability testing: An international survey. IEEE Transactions on Professional Communication, 55(1), 2–19. https://doi.org/10.1109/TPC.2011.2182569

McLean, C., & Rowsell, J. (2020). Digital literacies in Canada. In J. Lacina & R. Griffith (Eds.), Preparing globally minded literacy teachers: Knowledge, practices, and case studies (pp. 175–197). Routledge.

Metcalfe, H., Jonas-Dwyer, D., Saunders, R., & Dugmore, H. (2015). Using the technology: Introducing point of view video glasses into the simulated clinical learning environment. Computers, Informatics, Nursing, 33(10), 443–447. https://doi.org/10.1097/CIN.0000000000000168

New Brunswick Ministry of Education. (2020). Provincial assessments. https://www2.gnb.ca/content/dam/gnb/Departments/ed/pdf/K12/eval/AssessmentBrochure.pdf

Newmann, F., Secada, W., & Wehlage, G. (1995). A guide to authentic instruction and assessment: Vision, standards and scoring. Wisconsin Center for Educational Research.

Nichols, T. P., & Stornaiuolo, A. (2019). Assembling “digital literacies”: Contingent pasts, possible futures. Media and Communication, 7(2), 14–24. https://doi.org/10.17645/mac.v7i2.1946

Olusegun, S. (2015). Constructivism learning theory: A paradigm for teaching and learning. IOSR Journal of Research & Method in Education, 5(6), 66–70. https://doi.org/10.9790/7388-05616670

Ontario Ministry of Education. (2010). Growing success: Assessment, evaluation, and reporting in Ontario schools. http://www.edu.gov.on.ca/eng/policyfunding/growSuccess.pdf

Panadero, E., Jonsson, A., & Strijbos, J. (2016). Scaffolding self-regulated learning through self-assessment and peer assessment: Guidelines for classroom implementation. In D. Laveault & L. Allal (Eds.), Assessment for learning: Meeting the challenge of implementation (pp. 311–326). Springer. https://doi.org/10.1007/978-3-319-39211-0

Prinsloo, M., & Krause, L. S. (2019). Translanguaging, place and complexity. Language and Education, 33(2), 159–173. https://doi.org/10.1080/09500782.2018.1516778

Renzulli, J. S., Gentry, M., & Reis, S. M. (2004). A time and a place for authentic high-end learning. Educational Leadership, 62(1), 73–77. http://www.ascd.org/publications/educational-leadership/sept04/vol62/num01/A-Time-and-a-Place-for-Authentic-Learning.aspx

Riegler, A. (2011). Constructivism. In L. L’Abate (Ed.), Paradigms in theory construction (pp. 235–255). Springer.

Roach, K., Tilley, E., & Mitchell, J. (2018). How authentic does authentic learning have to be? Higher Education Pedagogies, 3(1), 495–509. https://doi.org/10.1080/23752696.2018.1462099

Rowsell, J., Saudelli, M. G., Scott, R. M., & Bishop, A. (2013). iPads as placed resources: Forging community in online and offline spaces. Language Arts, 90(5), 351–360. https://library.ncte.org/journals/la/issues/v90-5

Salmerón, L., Naumann, J., García, V., & Fajardo, I. (2017). Scanning and deep processing of information in hypertext: An eye tracking and cued retrospective think‐aloud study. Journal of Computer Assisted Learning, 33(3), 222–233. https://doi.org/10.1111/jcal.12152

Scheiter, K., & Eitel, A. (2017). The use of eye tracking as a research and instructional tool in multimedia learning. In C. Was, F. Sansosti, & B. Morris (Eds.), Eye-tracking technology applications in educational research (pp. 143–165). IGI Global.

Schifter, C. C., & Stewart, C. M. (2010). Technologies and the classroom come of age after a century of growth. In C. M. Stewart, C. C. Schifter, & M. E. Markaridian Selverian (Eds.), Teaching and learning with technology: Beyond constructivism (pp. 3–26). Routledge.

Schön, D. (1983). The reflective practitioner: How professionals think in action. Basic Books.

Schön, D. (1987). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. Jossey-Bass.

Schulz, R., Schroeder, D., & Brody, C. M. (1997). Collaborative narrative inquiry: Fidelity and the ethics of caring in teacher research. International Journal of Qualitative Studies in Education, 10(4), 473–485. https://doi.org/10.1080/095183997237052

Shannon, P., & Hambacher, E. (2014). Authenticity in constructivist inquiry: Assessing an elusive construct. The Qualitative Report, 19(52), 1–13. http://nsuworks.nova.edu/tqr/vol19/iss52/3

Silseth, K., & Gilje, Ø. (2019). Multimodal composition and assessment: A sociocultural perspective. Assessment in Education: Principles, Policy & Practice, 26(1), 26–42. https://doi.org/10.1080/0969594X.2017.1297292

Smith, B. E. (2014). Beyond words: A review of research on adolescents and multimodal composition. In R. Ferdig, & K. Pytash (Eds.), Exploring multimodal composition and digital writing (pp. 1–19). IGI Global. https://doi.org/10.4018/978-1-4666-4345-1.ch001

Smith, J. (2018). Parasitic and parachute research in global health. The Lancet Global Health, 6(8), e838. https://doi.org/10.1016/S2214-109X(18)30315-2

Smythe, S., & Neufeld, P. (2010). “Podcast time”: Negotiating digital literacies and communities of learning in a middle years ELL classroom. Journal of Adolescent & Adult Literacy, 53(7), 565–574. https://doi.org/10.1598/JAAL.53.7.4

Spires, H., Bartlett, M. E., Garry, A., & Quick, A. H. (2012). Digital literacies and learning: Designing a path forward. Friday Institute for Educational Innovation, North Carolina State University. https://www.fi.ncsu.edu/wp-content/uploads/2013/05/digital-literacies-and-learning.pdf

Stornaiuolo, A., Higgs, J., & Hull, G. (2014). Social media as authorship: Methods for studying literacies and communities online. In P. Albers, T. Holbrook, & A. S. Flint (Eds.), New methods of literacy research (pp. 224–237). Routledge.

Subban, P. (2006). Differentiated instruction: A research basis. International Education Journal, 7(7), 935–947. https://files.eric.ed.gov/fulltext/EJ854351.pdf

Tan, E., Calabrese Barton, A., Kang, H., & O’Neill, T. (2013). Desiring a career in STEM-related fields: How middle school girls articulate and negotiate identities-in-practice in science. Journal of Research in Science Teaching, 50(10), 1143–1179.

Tomlinson, C. A., & Moon, T. (2013). Assessment and student success in a differentiated classroom. ASCD.

Topping, K. J. (2009). Peer assessment. Theory into Practice, 48(1), 20–27. https://doi.org/10.1080/00405840802577569

Trainor, A., & Bouchard, K. A. (2013). Exploring and developing reciprocity in research design. International Journal of Qualitative Studies in Education, 26(8), 986–1003. https://doi.org/10.1080/09518398.2012.724467

van Gog, T., Kester, L., Nievelstein, F., Giesbers, B., & Paas, F. (2009). Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction. Computers in Human Behavior, 25(2), 325–331. https://doi.org/10.1016/j.chb.2008.12.021

van Gog, T., Paas, F., van Merriënboer, J., & Witte, P. (2005). Uncovering the problem-solving process: Cued retrospective reporting versus concurrent and retrospective reporting. Journal of Experimental Psychology: Applied, 11(4), 237–244. https://doi.org/10.1037/1076-898X.11.4.237

van Kraayenoord, C. E., Honan, E., & Moni, K. B. (2011). Negotiating knowledge in a researcher and teacher collaborative research partnership. Teacher Development, 15(4), 403–420. https://doi.org/10.1080/13664530.2011.635267

Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., & Herrera-Seda, C. (2018). Authentic assessment: Creating a blueprint for course design. Assessment and Evaluation in Higher Education, 43(5), 840–854. https://doi.org/10.1080/02602938.2017.1412396

Wang, F., & Hannafin, M. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5–23.

Wargo, J. M. (2019). Lights! Cameras! Genders? Interrupting hate through classroom tinkering, digital media production and [q]ulturally sustaining arts-based inquiry. Theory into Practice, 58(1), 18–28. https://doi.org/10.1080/00405841.2018.1536919

Watt, D. (2019). Video production in elementary teacher education as a critical digital literacy practice. Media and Communication, 7(2), 82–99. https://doi.org/10.17645/mac.v7i2.1967

Wohlwend, K. E., Peppler, K. A., Keune, A., & Thompson, N. (2017). Making sense and nonsense: Comparing mediated discourse and agential realist approaches to materiality in a preschool makerspace. Journal of Early Childhood Literacy, 17(3), 444–462. https://doi.org/10.1177/1468798417712066

Wren, S. (2000). The cognitive foundations of learning to read: A framework. Southwest Educational Development Laboratory. https://www.sedl.org/reading/framework/framework.pdf


Authors

Dr. Michelle Schira Hagerman is an Assistant Professor of Educational Technologies in the Faculty of Education at the University of Ottawa. She studies digital literacies teaching and learning practices in K-12 schools, and she is particularly interested in maker education as a pathway to digital equity and empowerment for Canadian youth. Email: M.S.Hagerman@uottawa.ca

Dr. Pamela Beach is an Assistant Professor of Language and Literacy in the Faculty of Education at Queen’s University. Her work centers on teacher cognition and self-directed learning and explores how online and multimedia resources can be used in teacher education and literacy-oriented professional development. Email: pamela.beach@queensu.ca

Dr. Megan Cotnam-Kappel is a Francophone Assistant Professor of Educational Technologies in the Faculty of Education at the University of Ottawa. Her research agenda is centered on youth and teachers’ voices in minority French-language settings and explores questions relating to digital equity, digital literacies, and what it means to be an empowered digital citizen. Email: mcotnamkappel@uottawa.ca

Dr. Cristyne Hébert is an Assistant Professor of Assessment and Evaluation in the Faculty of Education at the University of Regina. Her research focuses on supporting teachers and students in the development of digital literacies, alongside the adoption of multimodal forms of assessment, in K-12 settings. Email: cristyne.hebert@uregina.ca