A Response to the Review of the Community of Inquiry Framework

Zehra Akyol, J. Ben Arbaugh, Marti Cleveland-Innes, D. Randy Garrison, Phil Ice, Jennifer C. Richardson and Karen Swan

Vol. 23, No. 2, 123-136

Abstract

The Community of Inquiry (CoI) framework has become a prominent model of teaching and learning in online and blended learning environments. Considerable research employing the framework has yielded promising results, and the framework is now widely used to inform the practice of online and blended teaching and learning. For the CoI model to continue to grow and evolve, constructive critiques and debates are extremely beneficial, insofar as they identify potential problems and weaknesses in the model or its application and provide direction for further research. In this context, the CoI framework was recently reviewed and critiqued by Rourke and Kanuka in their JDE article entitled “Learning in Communities of Inquiry: A Review of the Literature.”

This paper is a response to that article and focuses on two main issues. The first is the review's focus on learning outcomes in its critique. The second concerns the representation, comprehensiveness, and methodology of the review.

Résumé

The Community of Inquiry (CoI) framework has become a prominent model of teaching and learning in online and blended learning environments. A considerable amount of research has been conducted using the framework, with promising results, leading to its wide use to inform the practice of online and blended teaching and learning. For the CoI model to continue to grow and evolve, constructive critiques and debates are extremely beneficial, in that they identify potential problems and weaknesses in the model or its application and provide direction for future research. In this context, the CoI framework was recently reviewed and critiqued by Rourke and Kanuka in their JDE article entitled “Learning in Communities of Inquiry: A Review of the Literature.”

This paper is a response to that article and focuses on two main issues. The first concerns the focus of the review and critique on learning outcomes. The second concerns the representation, comprehensiveness, and methodology of the review.

Learning Processes vs. Learning Outcomes

A focal point of this response is to clarify what Rourke and Kanuka refer to as the “central claim” (p. 20) of the CoI framework, which they equate with learning outcomes. This is a serious misrepresentation of the CoI model, as it is first and foremost a process model. While the seminal CoI work does not exclude the consideration of intended learning outcomes, the focus has consistently been on the nature of the educational transaction. In this regard, the original article concluded with the expectation that the CoI framework “would be used to assess different educational approaches and strategies in facilitating a community of inquiry …” (Garrison, Anderson & Archer, 2000, p. 103). From a foundational perspective, the CoI framework embraces a constructivist orientation in which the emphasis is on how we construct knowledge, as opposed to an objectivist focus on learning outcomes (Jonassen, 1991). This is reflected in the following quotation from Matthew Lipman's (1991) book, Thinking in Education:

John Dewey was convinced that education had failed because it was guilty of a stupendous category mistake: It confused the refined, finished end products of inquiry with the raw, crude subject matter of inquiry and tried to get students to learn solutions rather than investigate the problems and engage in inquiry for themselves. (p. 15)

The dynamic nature of the CoI framework is reflected both in its description as the “interaction” among the three presences and in the developmental, progressive nature of each presence. In essence, the framework describes a generic educational experience (see the core of the overlap among presences). References to deep and meaningful learning concern primarily how to approach the educational transaction from a practical perspective.

The transactional nature of the CoI framework was also emphasized in the book by Garrison and Anderson (2003), which integrated the seminal papers and expanded upon the framework's theoretical foundation. Teaching presence was described in terms of “influencing the approach to learning” (Garrison & Anderson, 2003, p. 17), where the conditions for facilitating deep understanding are outlined. The discussion is about “deep level processing” and approaches to deep and meaningful learning. Ramsden (1988) was the primary reference to a “deep approach to learning” (Garrison & Anderson, 2003, p. 16), not Ausubel or Marton and Saljo, as suggested by Rourke and Kanuka (2009). Ramsden (1992) states in his discussion of deep and surface approaches to learning, “You must fix clearly in your mind the concept of approach to learning … [and this] is a key concept in teaching and learning” (p. 39). Ramsden's work was explored further in a study by Garrison and Cleveland-Innes (2005) in which approaches to learning were measured using the Study Process Questionnaire (Biggs, 1987). The findings provided important insights with regard to deep approaches to learning and led to the conclusion that a “deep approach to learning must consider all three elements of the community of inquiry” (p. 144).

With specific regard to cognitive presence, we emphasize that cognitive presence, as operationalized through the Practical Inquiry (PI) model, is a process that is consistent with the transactional nature of the CoI framework. Against this background, we find the review's discussion of cognitive presence puzzling and disconcerting (see p. 39 of the review). First, the charge that it is ambiguous whether cognitive presence is a student activity or a prescription misses the point: it can be both. The PI model describes potential learning activities as well as prescriptions for deep and meaningful learning. It includes understanding an issue or problem; searching for relevant information; connecting and integrating information; and actively confirming the understanding in a collaborative and reflective learning process. In a recent study, Akyol and Garrison (in press) hypothesized that the processes described by the PI model are associated with actual higher-order learning outcomes and with proxy measures such as perceived learning. Their research confirmed a strong relationship between processes and outcomes, suggesting that the collaborative development of cognitive presence in online discussions and students' perceptions of cognitive presence are associated with high levels of perceived learning and with actual learning outcomes in terms of grades.
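To make the phase-coding logic concrete, below is a minimal, hypothetical sketch of how discussion messages might be assigned to PI phases. The indicator terms and the keyword-matching rule are illustrative assumptions only; published CoI research relies on trained human coders applying detailed indicator descriptions to complete transcripts.

```python
# A minimal sketch (NOT the published CoI coding scheme) of treating the
# four Practical Inquiry phases as a message-coding rubric. The indicator
# terms below are illustrative placeholders.

PI_PHASES = {
    "triggering_event": ["problem", "puzzle", "question", "dilemma"],
    "exploration": ["perhaps", "what about", "brainstorm", "information"],
    "integration": ["connect", "build on", "combine", "therefore"],
    "resolution": ["apply", "solution", "test", "defend"],
}

def code_message(text: str) -> str:
    """Assign a message to the PI phase whose indicators match most often."""
    text = text.lower()
    scores = {
        phase: sum(text.count(term) for term in terms)
        for phase, terms in PI_PHASES.items()
    }
    phase, hits = max(scores.items(), key=lambda item: item[1])
    return phase if hits > 0 else "other"

# "combine" and "therefore" outnumber "perhaps", so this codes as integration.
print(code_message("Perhaps we could combine both ideas and therefore ..."))
```

Even this toy rubric makes the process orientation visible: a message is coded by the inquiry activity it exhibits, not by whether it states a correct final answer.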

Secondly, and more importantly, it is disconcerting to read that researchers “have not been able to identify clear instances of cognitive presence” (p. 39). It should be noted that only seven studies are listed, and most are not current (Table 2). On this point, several studies have shown students to be “engaged in the constituent processes” (p. 39) (Akyol & Garrison, 2008; Meyer, 2004; Pisutova-Gerber & Malovicova, 2009). Not only do these studies refute the conclusions of the Rourke and Kanuka review, they also support the explanations offered repeatedly in earlier studies concerning why students were not reaching the higher phases of critical thinking and inquiry (Garrison, Anderson & Archer, 2000; Garrison & Arbaugh, 2007; Garrison & Cleveland-Innes, 2005; Garrison & Vaughan, 2008).

Furthermore, it was repeatedly argued in those same studies that the apparent inability to move to the integration and resolution phases very likely had to do with teaching presence issues (design, facilitation and direction). Numerous studies have provided ample evidence to this effect. Several references cited in the Garrison and Arbaugh (2007) article support the contention that it was a design issue (i.e., the nature of the learning tasks) and a facilitation issue. Work by Ice, Akyol and Garrison (2009) indicates that in some instances this may also be a function of the epistemological orientation of the course's instructional design and organization components. Other recent studies also support the contention that failure to reach the advanced phases of cognitive presence is most likely due to issues of teaching presence, and they support the validity of the cognitive presence (i.e., Practical Inquiry) model (Akyol & Garrison, 2008; Bangert, 2008; de Leng et al., 2009; Pisutova-Gerber & Malovicova, 2009; Shea & Bidjerano, 2009). The following statement supports this contention:

In summary, we conclude that Garrison’s ‘Practical Inquiry’ model appears to be a viable instrument for procedural facilitation of online discussions … An e-learning model integrating this ‘Practical Inquiry’ model in concerted facilitation by a human moderator and a program for asynchronous communication appeared to be successful in establishing a dialogue among an expert and a group of students (de Leng et al., 2009).

Finally, in instances where teaching presence was enhanced through the introduction of immediacy-enhancing technologies (Ice, Curtis, Phillips & Wells, 2007; Ice, 2008a) or rich internet applications (RIAs) (Ice, 2008b), students were found to move to integration more frequently than when lower-context means of communication and collaboration were utilized. In the case of RIAs, it is significant that the prescriptions for learning and the activities utilized build upon each other, reinforcing our contention that the development of cognitive presence is not an either/or proposition.

Community of Inquiry Research

At the outset of the review, the authors state that their aim was to focus on “the CoI as a program of research” (p. 22). In terms of representation and comprehensiveness, this fails to distinguish the research from the CoI framework as a theoretical model. To state that it “fails as a model for achieving deep and meaningful learning” (p. 43) not only distorts the essence of the framework as a process model but also fails to recognize its theoretical nature and its role in guiding research. From a theoretical perspective, there is significant evidence attesting to the validity of the framework (Akyol, 2009; Arbaugh, 2007; Arbaugh, Cleveland-Innes, Diaz, Garrison, Ice et al., 2008; Garrison, Cleveland-Innes & Fung, 2004; Shea & Bidjerano, 2009; Swan, Shea, Richardson, Ice, Garrison et al., 2008). Judging by the number of studies that have used the framework as a guide, it should be clear that it has served as a useful theoretical framework.

From a representation perspective, the authors provide a table (Table 1, p. 26) summarizing their review of “empirical” studies of the CoI framework. However, the studies listed in the table are not all empirical (e.g., Redmond & Lock; Garrison), nor do they all “[take] some element of the CoI as their primary focus” (p. 5) (e.g., Rogers & Lea; Conrad; Murphy; Rovai; Richardson & Swan). In fact, recent research that considers the entire framework suggests that it may be predictive of student perceived learning and satisfaction with online learning (Arbaugh, 2008). An apparently selective search for relevant literature and the questionable classification of the documents set the stage for misleading conclusions. Even considering only the articles listed in the review, it would seem premature to draw such conclusions from so limited a number of studies, only five of which focused on cognitive presence.

Another problem with the table concerns the measurement of learning. The authors identify only Bloom's taxonomy, Biggs' SOLO taxonomy and perceived learning as measures of learning, while ignoring the Practical Inquiry model (i.e., cognitive presence). This omission is contradicted by recent studies that have examined the Practical Inquiry (PI) model alongside other models, especially Bloom's taxonomy (Buraphadeja & Dawson, 2008; Cotton & Yorke, 2006; Meyer, 2004; Schrire, 2004, 2006). Schrire (2004), in her comparison of the PI model, Bloom's taxonomy and the SOLO taxonomy, found the PI model “to be the most relevant to the analysis of the cognitive dimension and represents a clear picture of the knowledge-building processes occurring in online discussion” (p. 491). More recently, Buraphadeja and Dawson (2008) indicated that the PI model is suitable for assessing critical thinking and that the CoI framework is being continually developed and has been widely cited in the literature.

From a methodological view, we question discounting the use of self-reports to measure students' learning at this stage of CoI research. Considering the enormous challenge of identifying valid and cost-effective proxy measures of latent higher-order thinking, it is difficult to understand this review's exclusive concern “with learning outcomes” (p. 25). This narrow focus made it possible to judge process and perception variables as deficient in assessing the CoI and to discount deep and meaningful approaches to learning (Ramsden, 1992). Why would we eliminate any source of data in our attempt to understand communities of inquiry in online and blended learning environments? Do we discard recent research that has used actual grades and perceived learning in online learning research (see Benbunan-Fich & Arbaugh, 2006; Lim, Morris & Kupritz, 2007; Roblyer, Freeman, Donaldson & Maddox, 2007), or the work by Ice et al. (2007) in which student perceptions of learning were validated through document analysis?

Self-reports of learning may in fact be warranted to increase the generalizability of CoI research and to build knowledge of the entire process of developing cognitive presence. Taking empirical research on the CoI to the next level will require larger sample sizes than have typically been used in previous research (Garrison & Arbaugh, 2007). This likely means sampling multiple courses at once, and sampling students from different courses, unless researchers have the good fortune of access to multiple sections of the same course at the same time. If self-reports are not used, how are researchers to develop a common measure across different courses taught by different instructors? Self-reports may also be helpful for determining whether cognitive presence has been attained. Since the latter phases of integration and resolution involve the creation of knowledge that can be applied in future settings, such knowledge may not emerge from a single within-course inquiry but rather from the cumulative effect of multiple within-course inquiries or engagements with course content. Therefore, an end-of-course measure of cognitive presence, such as self-reports of student learning, is appropriate and perhaps even necessary (Arbaugh, 2008).
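To illustrate what a common end-of-course measure looks like in practice, the following is a minimal sketch of scoring a CoI-type self-report survey. The subscale sizes (13 teaching presence, 9 social presence and 12 cognitive presence items on a five-point Likert scale) follow the Arbaugh et al. (2008) instrument, but the simple mean-based scoring is an illustrative convention, not a prescribed procedure.

```python
# A sketch of scoring a 34-item CoI-type survey by subscale means.
# Item groupings follow the Arbaugh et al. (2008) instrument; the
# mean-based scoring itself is an illustrative assumption.
from statistics import mean

SUBSCALES = {
    "teaching_presence": range(0, 13),    # items 1-13
    "social_presence": range(13, 22),     # items 14-22
    "cognitive_presence": range(22, 34),  # items 23-34
}

def score_survey(responses: list[int]) -> dict[str, float]:
    """Return the mean Likert rating (1-5) for each presence subscale."""
    assert len(responses) == 34, "expected 34 item responses"
    return {
        name: round(mean(responses[i] for i in items), 2)
        for name, items in SUBSCALES.items()
    }

# One hypothetical student's responses:
print(score_survey([4] * 13 + [3] * 9 + [5] * 12))
# {'teaching_presence': 4.0, 'social_presence': 3.0, 'cognitive_presence': 5.0}
```

Because each subscale reduces to a comparable numeric score, the same instrument can be administered across courses taught by different instructors, which is precisely the cross-course comparability argued for above.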

Following Schrire's (2006) suggestion to use multiple approaches or tools to reflect the complexity of the cognitive dimension and of knowledge-building processes and outcomes, Akyol (2009) applied several strategies to provide a detailed picture of students' learning in online and blended communities of inquiry. She applied the PI model to depict the cognitive activity in asynchronous online discussions and also used perceived learning and satisfaction as well as learning outcomes. Her research demonstrates a high level of inquiry on the discussion board in parallel with high levels of perceived learning and actual learning outcomes. This finding also runs contrary to Rourke and Kanuka's categorization (p. 33) of self-reported learning as belonging to the lower levels of Bloom's taxonomy, a categorization that disregards students' metacognition. An example quote from a student about learning from online discussion in the Swan and Shih (2005) study (which is not listed in the review, although it used the social presence construct of the CoI framework) also challenges the authors' categorization:

When I first read and responded to a discussion question I felt that I had written all that I could on the subject. After reading other people’s comments on the same question, I was able to take in different viewpoints and see if it was something that I agreed with or totally disagreed with. Without class discussions I would have never thought twice about the question that I had just answered. (p. 128)

The Rourke and Kanuka review misrepresents CoI research by excluding recent studies and by including studies that have no relationship to the CoI framework. In this section, we focus on some of the articles that were not accurately represented in their review. To begin with, the Richardson and Swan (2003) article focused on social presence in online courses and its effect on student perceptions (hence the single items related to satisfaction with the instructor, perceived learning, and perceived interactivity); as a result, the reference to Richardson and Swan's study within the “Assessing Learning in CoI” section of the review is misleading at best. Correspondingly, the reference to the Richardson and Swan study within the review's Discussion section is inaccurate. A component of the study looked at social presence in course activities other than discussion and sought to test the research hypothesis, “Course activities perceived by students as having the highest level of social presence also have high levels of students' perceived learning” (p. 71). The point was to explore the relationship between social presence and student perceptions of learning, not student learning outcomes; the variable was not designed to elicit information about deep and meaningful learning, and that was not the intent of the study.

Furthermore, the assertion of Rourke and Kanuka, within the “Weakness of CoI” section of the review, that the study associates learning more with assignments and individual projects than with discussion is not accurate; rather, the finding was that participants perceived social presence even within what are generally deemed more solitary tasks. Finally, and more importantly, that study was not examining the CoI framework but rather extending Gunawardena and Zittle's (1997) previous work on social presence, with only a brief reference to the work of Rourke, Garrison, Anderson and Archer (2001) as evidence of the advantages offered in online environments (Rourke & Kanuka, 2009, p. 69).

Similarly, the Conrad (2005) article was taken out of context. That multi-year study examined learners' perceptions of online learning and the notion of community across the duration of a cohort program. The only mention of the CoI model appears on page 16 of the article, within a discussion of community and theory that looks to existing theories demonstrating the “power of community's application to effective pedagogy.”

Likewise, the focus of the McKlin, Harmon, Evans and Jones (2001) article was not the CoI model but rather the research question: “Can a neural network reliably categorize messages under optimum circumstances, and how can the method be improved to generate greater reliability?” In that investigation, the cognitive presence portion of Garrison, Anderson and Archer's (2000) model was used as a coding schema, and the findings of previous studies served as a benchmark against which the accuracy of the neural network's categorizations was compared.
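For readers unfamiliar with this kind of automated content analysis, here is a minimal sketch, in the spirit of McKlin et al. (2001), of training a small neural network to categorize discussion messages by cognitive presence phase. The two training messages and their labels are fabricated placeholders; an actual study trains on a large corpus of human-coded messages and evaluates the network against the human codes.

```python
# A toy illustration only: real studies use hundreds of human-coded
# messages and report agreement between the network and human coders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical discussion messages, labeled by (imaginary) human coders.
messages = [
    "I do not understand the problem here. Can someone clarify the question?",
    "Building on Ana's point, the two readings converge on a single idea.",
]
labels = ["triggering_event", "integration"]

# TF-IDF text features feed a small feed-forward neural network.
classifier = make_pipeline(
    TfidfVectorizer(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
classifier.fit(messages, labels)

# Categorize a new, unseen message.
print(classifier.predict(["Could someone explain what this question asks?"]))
```

The point of the original study was methodological (whether such a network could replicate human coding reliably), not an evaluation of the CoI framework itself.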

One final issue relates to a Kanuka (2002) study cited on p. 37 of the review but not included in its reference list, which apparently reports that deep and meaningful learning does not occur in a community of inquiry. First, it is difficult to judge this contention since no full reference is given. Secondly, if we examine the reported results of another Kanuka study (Kanuka et al., 2007), 20% of the contributions were at the highest phases of cognitive presence. We suggest that this is a promising result, as one would expect the integration and resolution phases of practical inquiry to elicit fewer responses than the exploration phase. By definition, the exploration phase involves investigating all ideas, whereas integration seeks to combine promising ideas and resolution focuses on a single solution or just a few solutions. Thus, one would expect a greater number of postings categorized as exploration, and far fewer categorized as integration or resolution.

Conclusion

Certainly we have no disagreement with the following statement in the conclusion of the CoI review (we note that the phrase “studies of learning” could refer to both approaches and outcomes):

Our main suggestion applies to subsequent research studies. Briefly, we encourage researchers to conduct more substantial studies of learning in CoI. If we can identify situations in which students are and are not engaged in [emphasis added] deep and meaningful learning, we can make evidence-based suggestions about the types and quantities of teaching presence, social presence, and cognitive presence that are related to learning. (p. 44)

One interpretation of this statement is that we should be studying and linking both approaches to learning and learning outcomes, using a variety of measures and tools. As such, it is a reasonable statement, consistent with the stated purpose “to investigate learning in communities of inquiry …” (p. 19). What we do view as unreasonable, however, is judging the CoI framework a failure based upon an absence of studies reporting deep and meaningful learning outcomes. We say this in the context that the CoI framework is a relatively new theoretical and transactional (i.e., process) model. It seems particularly premature to declare the framework a failure considering its theoretical nature, the studies that have validated it, and the considerable number of studies that have found it useful as a framework. A number of recent studies noted here challenge the core claims of this review and have provided increased insight into learning in communities of inquiry. That the CoI model has had considerable success as a theoretical framework is evidenced by the simple fact that it has been a catalyst and guide for important research in online and blended learning. To suggest that the CoI is a failure as a program of research (p. 34) is misguided and unfair at best.

Not only has the CoI model sparked a large amount of research, it also remedies a lack of theory development in online distance education (Moore, 2007; Rourke & Szabo, 2002). A plethora of atheoretical research has left the field of distance education without adequate theoretical grounding (Garrison, 2000a, 2000b; Moore, 2007; Perraton, 1988). Research done without reference to theory runs the risk of collecting questionable data based on capricious thinking. Researchers using the CoI framework adopt a theory-driven and coherent research process, with explicated and confirmed definitions and relationships open to hypothesis formation and testing. From sound theoretical orientations, the steps of explanation and prediction can emerge.

Finally, it is important to put into perspective the issue of learning processes versus learning outcomes. Notwithstanding that measuring latent higher-order learning outcomes is a challenging and time-consuming task in itself, it also does little to inform the teaching and learning process. Cerbin (2009) makes the point that:

… we need assessment that reveals how students learn—how they interpret and make sense of the subject, where they stumble, what they do when they do not understand the material, how they respond to different instructional practices, and so on. Understanding the basis of student performance can help us identify appropriate teaching practices or approaches. (¶ 3)

The point is that understanding the educational transaction and the processes of learning is not only the focus of the CoI framework but may well be of much greater value in understanding, shaping and improving the educational experience.

References

Akyol, Z. (2009). Examining teaching presence, social presence, cognitive presence, satisfaction and learning in online and blended course contexts. (Doctoral dissertation, Middle East Technical University, 2009).

Akyol, Z., & Garrison, D. R. (in press). Understanding cognitive presence in an online and blended Community of Inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology.

Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3), 3-22.

Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11(1), 73-84.

Arbaugh, J. B. (2008). Does the community of inquiry framework predict outcomes in online MBA courses? International Review of Research in Open and Distance Learning, 9, 1-21.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S.R., Garrison, D. R., Ice, P., Richardson, J.C. & Swan, K.P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133-136.

Bangert, A. (2008). The influence of social presence and teaching presence on the quality of online critical inquiry. Journal of Computing in Higher Education, 20(1), 34-61.

Benbunan-Fich, R. & Arbaugh, J.B. (2006). Separating the effects of knowledge construction and group collaboration in learning outcomes of web-based courses. Information & Management, 43(6), 778-793.

Biggs, J.B. (1987). Student approaches to learning and studying. Melbourne: Australian Council for Educational Research.

Buraphadeja, V. & Dawson, K. (2008). Content analysis in computer-mediated communication: Analyzing models for assessing critical thinking through the lens of social constructivism. American Journal of Distance Education, 22(3), 130-145.

Cerbin, B. (2009). Assessing how students learn. Carnegie Perspectives. Retrieved March 2, 2009, from: http://www.carnegiefoundation.org/perspectives/sub.asp?key=245&subkey=2882

Conrad, D. (2005). Building and maintaining community in cohort-based online learning. Journal of Distance Education, 20(1), 1-20.

Cotton, D. & Yorke, J. (2006). Analyzing online discussions: What are the students learning? In Proceedings of the 23rd Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education: “Who's learning? Whose technology?” December, 2006, Sydney, Australia.

de Leng, B. A., Dolmans, D. H. J. M., Jobsis, R., Muijtjens, A. M. M., & van der Vleuten, C. P. M. (2009). Exploration of an e-learning model to foster critical thinking on basic science concepts during work placements. Computers & Education, 53(1), 1-13.

Garrison, D.R. (2000a). Theoretical challenges for distance education in the 21st Century: A shift from structural to transactional issues. International Review of Research in Open and Distance Learning, 1(1), 1-17.

Garrison, D.R. (2000b). Computer conferencing: The post-industrial age of distance education. Open Learning, 12, 3-11.

Garrison, D. R., & Anderson, T. (2003). E-Learning in the 21st century: A framework for research and practice. London: Routledge/Falmer.

Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133-148.

Garrison, D. R., & Vaughan, N. (2008). Blended learning in higher education. San Francisco: Jossey-Bass.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23.

Garrison, D. R., Cleveland-Innes, M., & Fung, T. (2004). Student role adjustment in online communities of inquiry: Model and instrument validation. Journal of Asynchronous Learning Networks, 8(2), 61-74.

Garrison, D.R. & Arbaugh, J.B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157-172.

Gunawardena, C. N., & Zittle, F. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. American Journal of Distance Education, 11(3), 8-25.

Ice, P. (2008a, April). The impact of asynchronous audio feedback on teaching, social and cognitive presence. Paper presented at the First International Conference of the Canadian Network for Innovation in Education, Banff, Alberta.

Ice, P. (2008b, August). Using online collaborative document editors to enhance student satisfaction and cognitive presence outcomes. Sloan-C Effective Practice. Retrieved March 10, 2009 from: http://www.sloan-c.org/node/1243

Ice, P., Akyol, Z. & Garrison, R. (2009, January). The relationship between instructor socio-epistemological orientations and student satisfaction with indicators of the Community of Inquiry Framework. Paper presented at the 7th Annual Hawaii International Conference on Education, Honolulu, HI.

Ice, P., Curtis, R., Phillips, P. & Wells, J. (2007). Using asynchronous audio feedback to enhance teaching presence and students' sense of community. Journal of Asynchronous Learning Networks, 11(2), 3-25.

Jonassen, D.H. (1991). Objectivism versus constructivism: Do we need a new philosophical paradigm? Educational Technology Research & Development, 39(3), 5-14.

Lim, D. H., Morris, M.L. & Kupritz, V.W. (2007). Online vs. blended learning: Differences in instructional outcomes and learner satisfaction. Journal of Asynchronous Learning Networks, 11(3), 27-42.

Lipman, M. (1991). Thinking in education. Cambridge: Cambridge University Press.

McKlin, T., Harmon, S.W., Evans, W., & Jones, M.G. (2001). Cognitive presence in web-based learning: A content analysis of students' online discussions. American Journal of Distance Education, 15(1), 7-23.

Meyer, K. (2004). Evaluating online discussions: Four different frames of analysis. Journal of Asynchronous Learning Networks, 8(2), 101-114.

Moore, M.G. (2007). Research in distance education: Then, now and in the future. Keynote address at Athabasca University Staff Orientation, September, 2007, Athabasca, Alberta.

Perraton, H. (1988). A theory for distance education. In D. Sewart, D. Keegan, & B. Holmberg, (Eds.), Distance education: International perspectives (pp. 34-45). New York: Routledge.

Pisutova-Gerber, K. & Malovicova, J. (2009). Critical and higher order thinking in online threaded discussion in the Slovak context. International Review of Research in Open and Distance Learning, 10(1). Retrieved March 1, 2009 from: http://www.irrodl.org/index.php/irrodl/article/view/589/1175

Ramsden, P. (1988). Context and strategy: Situational influences on learning. In R. R. Schmeck (Ed.), Learning strategies and learning styles (pp. 159-184). New York: Plenum.

Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.

Richardson, J. C. & Swan, K. (2003). Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.

Roblyer, M.D., Freeman, J., Donaldson, M.B. & Maddox, M. (2007). A comparison of outcomes of virtual school courses offered in synchronous and asynchronous formats. The Internet and Higher Education, 10(4), 261-268.

Rourke, L. & Kanuka, H. (2009). Learning in communities of inquiry: A review of the literature. Journal of Distance Education, 23(1), 19-48.

Rourke, L. & Szabo, M. (2002). A content analysis of the Journal of Distance Education 1986-2001. Journal of Distance Education, 17(1), 63-74.

Schrire, S. (2004). Interaction and cognition in asynchronous computer conferencing. Instructional Science, 32, 475-502.

Schrire, S. (2006). Knowledge building in asynchronous discussion groups: Going beyond quantitative analysis. Computers & Education, 46(1), 49-70.

Shea, P. & Bidjerano, T. (2009). Community of inquiry as a theoretical framework to foster “epistemic engagement” and “cognitive presence” in online education. Computers & Education, 52(3), 543-553.

Swan, K. & Shih, L.F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks, 9(3), 115-136.

Swan, K., Shea, P., Richardson, J., Ice, P., Garrison, D. R., Cleveland-Innes, M., & Arbaugh, J. B. (2008). Validating a measurement tool of presence in online communities of inquiry. E-Mentor, 2(24), 1-12. Retrieved from: http://www.e-mentor.edu.pl/e_index.php?numer=24&all=1

Zehra Akyol is a research assistant at Middle East Technical University. She recently received her PhD in the field of Instructional Technology. She conducted her doctoral research at the University of Calgary. Her research interests include community development in online and blended learning environments and factors affecting the development of communities of inquiry in these learning environments. E-mail: zehraakyol@gmail.com

J. B. (Ben) Arbaugh is a Professor of Strategy and Project Management at the University of Wisconsin, Oshkosh. Ben’s online teaching research has won best article awards from the Journal of Management Education and the Decision Sciences Journal of Innovative Education. His other research interests are in graduate management education and the intersection between spirituality and strategic management research. E-mail: arbaugh@uwosh.edu

M. Cleveland-Innes is a faculty member and program director in the Center for Distance Education at Athabasca University in Alberta, Canada. She teaches Research Methods and Leadership in the graduate programs of this department. Martha has received awards for her work on the student experience in online environments. In 2009 she received the President’s Award for Research and Scholarly Excellence from Athabasca University. Her work is well published in academic journals in North America and Europe. Current research interests are in the area of leadership in open and distance higher education, teaching across the disciplines in online higher education and emotional presence in online communities of inquiry. E-mail: martic@athabascau.ca

Randy Garrison is the Director of the Teaching & Learning Centre and a professor in the Faculty of Education at the University of Calgary. Dr. Garrison has published extensively on teaching and learning in higher, adult and distance education contexts. His most recent books are: E-Learning in the 21st century (2003) and Blended learning in higher education (2008). E-mail: garrison@ucalgary.ca

Phil Ice is the Director of Course Design, Research and Development at American Public University System. His research is focused on the impact of new and emerging technologies on cognition in online learning environments. Phil is also involved with seven other researchers in the United States and Canada in numerous other research initiatives related to the Community of Inquiry Framework. This research has resulted in the development of a validated instrument that captures the intersection of Teaching, Social and Cognitive presence in online environments. E-mail: pice@apus.edu

Jennifer C. Richardson is an Associate Professor in the College of Education at Purdue University. Jennifer’s research focuses on distance education, in particular online learning environments. Specific areas of research include measuring learning in online environments and the impacts of social presence and interactions on students' perceptions and learning. E-mail: jennrich@purdue.edu

Karen Swan is the Stukel Distinguished Professor of Educational Leadership at the University of Illinois Springfield. Her research is in the area of media and learning on which she has published and presented extensively. Her current interests include online learning, ubiquitous computing and data literacy. Dr. Swan received the 2006 Sloan-C award for Outstanding Achievement in Online Learning by an Individual. E-mail: kswan4@uis.edu