CONCLUSIONS & RECOMMENDATIONS
While by no means a perfect fit, this learning experience was moderately successful: as an adaptive expert, Colin was able to assess how aspects of what he was learning applied to his current role. Most of the module content was too basic for him, but he did leave the experience with a few new trusted sources and tools, as well as ideas for adapting the course content to make it more accessible to colleagues outside the field of content moderation. Observing Colin complete the module Types of Misinformation and interviewing him afterwards revealed that, while the Poynter course is polished and professional-looking, there is room for improvement.
My top three recommendations to improve the learning experience of this course are to:
1. Build in observation and reflection
2. Further refine the use of multimedia
3. Include assessments
Throughout the module there was little insight into the fact-checking process itself. Techniques such as reverse image searches were mentioned, but it was never explained what they involve. Building observation into the module would help address this: a short video of a fact checker walking through her process, explaining the steps she takes to verify whether a social media artifact is real, would go a long way toward demonstrating how fact-checking skills can be used to verify sources.
Reflection is the key to learning (Rosenheck, 2010), and it can be encouraged by asking simple questions throughout the learning process. Posing two questions at the end of every module, “What did you learn?” and “How can you use what you’ve learned?”, would encourage learners to pause and process the information they have encountered, and to assess how it will be useful to them. According to Rosenheck (2010), it is this process of observing and reflecting that turns “experience into learning that can be applied to new situations.”
The media in this course are high-quality and professional, but the experience of using them is often clunky. Colin found the platform switching, “where you must open 15 tabs to check other sources,” annoying (V., personal communication, May 30, 2022). The modules would be more user-friendly if the designers determined which sources, visuals, videos, graphics, and tools were most important, and then embedded those directly in the site, so that learners never have to leave it to view or use them. As Miller (2014) states, “we need to look at multimedia with a critical eye, asking what it will add to any given learning activity.” Extraneous multimedia that does not add to the learning experience should be cut, and deeper-dive materials could be moved to an additional resources section that links to outside sources. There were so many sources and links within each module that Colin spent hours going through them, even though the estimated completion time was only 25 minutes. This gap between estimated and actual completion time would not exist if the course content were more carefully curated.
Once the most important course content has been identified, the videos and on-site text could be built out to provide a more detailed understanding. For example, the key term “pink slime” could be defined and introduced in text, an example of a pink slime website could be opened directly in the course site, and a video could then explain why we should be wary of pink slime websites and how to identify them.
After completing the module, Colin said he would have liked a quiz to help him process the information he had learned (V., personal communication, May 30, 2022). Quizzes are most acceptable to learners when they are predictable and low stakes (Brown et al., 2014). A five-question quiz at the end of each module, testing understanding of key terms and of how to use fact-checking tools, would be an easy way to help learners judge whether they are meeting the learning outcomes and to help them recall important information.