On Algorithms, Emotion, and Learning: A Call to Action

The Urgency

It’s difficult to overstate the scale and speed at which algorithmic systems (TikTok, Instagram Reels, YouTube, search and news feeds) shape what learners see, how they feel, and what they believe. These systems are optimized for engagement, not for truth, depth, or balance. As a result, students encounter information environments that reward emotional arousal and novelty, amplify confirmation bias, and make opposing views feel threatening rather than instructive.

Schools and other social and cultural institutions need a renewed emphasis on algorithmic awareness, emotional self-regulation in media environments, and verification practices. This page outlines the research basis and invites a shift in thinking across curriculum, instruction, and assessment.

What the Research Shows

1. Emotional content spreads faster and farther

Studies of large social networks show that moral-emotional language increases virality within groups, and that false news spreads faster and more broadly than true news, primarily due to human sharing behavior—novelty and arousal drive diffusion, not bots alone. See Brady et al. (PNAS, 2017) and Vosoughi, Roy, & Aral (Science, 2018).
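
To see how small differences in sharing behavior compound, consider a toy branching-process simulation of a cascade. Every number below (follower counts, share probabilities) is an illustrative assumption, not an estimate from Brady et al. or Vosoughi et al.:

```python
import random

def cascade_size(share_prob, avg_followers=20, max_steps=10, cap=100_000):
    """Simulate one sharing cascade as a simple branching process.

    Each sharer exposes avg_followers people, and each exposed person
    reshares with probability share_prob. All parameters are illustrative.
    """
    sharers, total_reached = 1, 1
    for _ in range(max_steps):
        exposed = sharers * avg_followers
        new_sharers = sum(1 for _ in range(exposed) if random.random() < share_prob)
        total_reached += exposed
        sharers = new_sharers
        if sharers == 0 or total_reached > cap:
            break
    return total_reached

random.seed(1)
# A "neutral" post reshared 4% of the time vs. an "emotional" post at 6%.
neutral = sum(cascade_size(0.04) for _ in range(500)) / 500
emotional = sum(cascade_size(0.06) for _ in range(500)) / 500
print(f"average reach -- neutral: {neutral:.0f}, emotional: {emotional:.0f}")
```

At 4%, each sharer produces fewer than one new sharer on average and cascades fizzle; at 6%, each produces more than one and cascades keep growing. A two-point difference in per-exposure appeal yields a several-fold difference in reach, which is why modest emotional advantages matter at network scale.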

2. ‘Echo chambers’ persist for psychological reasons

Recommendation systems learn our preferences and supply more of the same. Meanwhile, humans prefer cognitive ease and avoidance of dissonance. Experimental evidence indicates that forced exposure to opposing views on social media can increase polarization for some users (Bail et al., PNAS, 2018), helping explain why echo chambers ‘work.’
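
A toy feedback loop makes the mechanism concrete: a recommender that reinforces whatever gets clicked will narrow the feed around a user's strongest initial preference. The topics, click probabilities, and update rule below are hypothetical, not any real platform's algorithm:

```python
import random

TOPICS = ["politics", "science", "sports", "arts"]

def simulate_feed(rounds=200, explore=0.1, learning_rate=0.1):
    """Toy engagement loop: mostly show the topic with the highest learned
    weight, occasionally explore, and reinforce whatever gets clicked."""
    weights = {t: 1.0 for t in TOPICS}                # recommender's learned scores
    interest = {"politics": 0.5, "science": 0.4,      # hypothetical click probabilities
                "sports": 0.35, "arts": 0.3}
    shown = []
    for _ in range(rounds):
        if random.random() < explore:
            topic = random.choice(TOPICS)             # rare exploration
        else:
            topic = max(weights, key=weights.get)     # exploit current best guess
        if random.random() < interest[topic]:
            weights[topic] += learning_rate           # reinforce engagement
        shown.append(topic)
    return shown

random.seed(0)
shown = simulate_feed()
for t in TOPICS:
    print(f"{t}: shown {shown.count(t)} of {len(shown)} times")
```

Even though the simulated user is only slightly more interested in one topic than the others, that topic quickly crowds out the rest of the feed. Dissonance avoidance then does the remainder of the work: the narrowed feed feels comfortable, so there is little pressure to leave it.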

3. Engagement metrics shape attention and belief

Industrial-scale recommenders, such as YouTube’s two-stage deep learning system, are tuned to maximize watch time and related engagement (Covington, Adams, & Sargin, RecSys, 2016). That optimization can privilege content that is captivating rather than careful, nudging learners toward quick takes over sustained reasoning.
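
Covington, Adams, and Sargin describe a two-stage pipeline: a candidate-generation network narrows millions of videos to a few hundred, and a ranking network orders that shortlist with expected watch time as a central signal. The sketch below mirrors only that shape; the placeholder scores stand in for the paper’s neural models:

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    relevance: float            # placeholder for a candidate-generation score
    expected_watch_time: float  # placeholder for a ranking-model prediction

def candidate_generation(corpus, k=100):
    """Stage 1: cheaply narrow a huge corpus to a shortlist. The paper uses
    a neural nearest-neighbor lookup; a sort on a stand-in score suffices here."""
    return sorted(corpus, key=lambda v: v.relevance, reverse=True)[:k]

def rank(candidates, n=10):
    """Stage 2: order the shortlist by predicted engagement. Weighting by
    expected watch time is the optimization target the paper describes."""
    return sorted(candidates, key=lambda v: v.expected_watch_time, reverse=True)[:n]

# Fabricated corpus purely for demonstration.
corpus = [Video(f"v{i}", relevance=(i * 7) % 13, expected_watch_time=(i * 13) % 11)
          for i in range(1_000)]
feed = rank(candidate_generation(corpus))
print([v.video_id for v in feed])
```

Note what is absent from both stages: nothing scores accuracy, nuance, or instructional value. Whatever predicts watch time rises, which is the structural reason engagement-optimized feeds and learning goals diverge.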

4. Attention is finite and easily taxed

Reviews of smartphone and media-multitasking research link heavy, fragmented media use to challenges in attention and interference management, with mixed but concerning evidence across executive-function domains (Wilmer, Sherman, & Chein, Frontiers in Psychology, 2017; Uncapher & Wagner, PNAS, 2018).

5. Verification is teachable—but underdeveloped

Curricula in lateral reading and civic online reasoning improve students’ ability to evaluate sources and resist misinformation, though baseline performance is low and gains require deliberate practice (Stanford History Education Group / Digital Inquiry Group; McGrew, 2023).

Implications for Schools

These dynamics are conditions of learning. A thinking-centered approach should:

  • Name platform incentives explicitly and contrast them with learning goals.
  • Treat emotion as data: help students notice outrage, fear, or delight and understand how these states influence judgment.
  • Teach verification routines for AI-generated and edited media (image, audio, video) and normalize lateral reading.
  • Design performances of thinking: media audits, evidence trails, counterclaim memos, and reflections on how conclusions change.

Note: Implementation guidance (units, routines, rubrics) appears on the standards pages and in the accompanying resources.

Suggestions

  • Adopt a shared vocabulary for algorithms, engagement, bias, and verification across departments.
  • Build recurring classroom routines that slow thinking before sharing or reacting.
  • Assess how students use evidence and revise claims—not only what facts they recall.

Works Cited

  • Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318.
  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.
  • Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. F., Lee, J., Mann, M., Merhout, F., & Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216–9221.
  • Covington, P., Adams, J., & Sargin, E. (2016). Deep neural networks for YouTube recommendations. In Proceedings of the 10th ACM Conference on Recommender Systems (RecSys ’16), 191–198.
  • Wilmer, H. H., Sherman, L. E., & Chein, J. M. (2017). Smartphones and cognition: A review of research exploring the links between mobile technology habits and cognitive functioning. Frontiers in Psychology, 8, 605.
  • Uncapher, M. R., & Wagner, A. D. (2018). Minds and brains of media multitaskers: Current findings and future directions. Proceedings of the National Academy of Sciences, 115(40), 9889–9896.
  • McGrew, S. (2023). Civic online reasoning across the curriculum: A systematic review. AERA Open, 9, 1–17. See also the Stanford History Education Group Civic Online Reasoning curriculum.