Vol. 38 No. 2 (2023)
Research Articles

Beauty is Truth, Truth Beauty: Students’ Assessment of Credibility in Online Materials

Ralf St Clair
University of Victoria, Canada
Maryam Shirdel Pour
University of Victoria, Canada
James Nahachewsky
University of Victoria, Canada

Published 2023-12-21

How to Cite

St Clair, R., Shirdel Pour, M., & Nahachewsky, J. (2023). Beauty is Truth, Truth Beauty: Students’ Assessment of Credibility in Online Materials. International Journal of E-Learning & Distance Education / Revue internationale du e-learning et la formation à distance, 38(2). https://doi.org/10.55667/ijede.2023.v38.i2.1289


This study reports the findings of a survey designed to capture how students allocate credibility to online materials resembling social media posts. The respondents were 1,019 undergraduate students at a medium-sized Canadian university, drawn from a range of programs and years of study within those programs. The survey instrument presented varying stimuli to students to see how their credibility scores changed, and then asked students to explain their scoring. Several significant dynamics emerged, such as the students’ tendency to give lower credibility scores to poorly presented information, even when the information was factual, and to explain information by referring to prior knowledge. These dynamics varied little by area or year of study, which suggests that presentation should be recognized as a powerful heuristic in online credibility assessment.

Keywords: credibility, social media, undergraduate, survey research



