AI in Education Dictionary

AI in Education

Core terms related to artificial intelligence and its role in teaching, learning, and educational technology

1. Artificial Intelligence (AI)

Definition: A field of computer science focused on creating systems capable of performing tasks that typically require human intelligence, such as decision-making, pattern recognition, and language understanding.

Classroom Example: An AI-powered writing assistant gives students real-time suggestions to improve grammar and clarity in their essays.

Source: IBM. (n.d.). What is artificial intelligence (AI)? Retrieved from IBM

2. Machine Learning (ML)

Definition: A subfield of AI where algorithms learn from data to improve performance on a specific task without being explicitly programmed for every scenario.

Classroom Example: A learning platform uses ML to adjust reading levels for students based on their past performance and growth.

Source: Google AI. (n.d.). What is machine learning? Retrieved from Google AI

3. Deep Learning (DL)

Definition: A specialized subfield of machine learning that uses artificial neural networks with multiple layers to learn from vast amounts of data, particularly effective for complex pattern recognition.

Classroom Example: A deep learning model could analyze student engagement patterns in online videos to identify sections where attention drops significantly, helping teachers refine content.

Source: NVIDIA. (n.d.). What is deep learning? Retrieved from NVIDIA

4. Natural Language Processing (NLP)

Definition: A branch of AI focused on enabling computers to understand, interpret, and generate human language. It’s crucial for applications like language translation, sentiment analysis, and chatbots.

Classroom Example: An NLP-driven tool analyzes student essays to identify common grammatical errors or areas where they struggle with sentence structure.

Source: Stanford University. (n.d.). Natural Language Processing Group. Retrieved from Stanford NLP Group

5. Large Language Model (LLM)

Definition: A type of AI model trained on massive text datasets to understand and generate human-like language, often used in chatbots and writing tools.

Classroom Example: Students use an LLM to rephrase research findings in simpler language for a middle school audience.

Source: OpenAI. (n.d.). Language models. Retrieved from OpenAI Research

6. Generative AI

Definition: A type of artificial intelligence that can create new, original content (text, images, audio, video) based on patterns learned from existing data, rather than just analyzing or classifying it.

Classroom Example: A teacher uses generative AI to create varied prompts for creative writing assignments or to quickly generate multiple versions of a quiz question.

Source: Gartner. (n.d.). Generative AI. Retrieved from Gartner Glossary

7. Prompt Engineering

Definition: The skill of designing and refining the input (prompts) given to an AI model, especially large language models, to achieve a desired and effective output.

Classroom Example: Teachers instruct students on how to write clear, specific prompts for AI tools to ensure they get relevant and useful information for their projects, rather than generic responses.

Source: Brown, T. B., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., … & Amodei, D. (2020). Language Models are Few-Shot Learners. Advances in Neural Information Processing Systems, 33, 1877-1901. Retrieved from NeurIPS Proceedings
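A minimal sketch of the idea, contrasting a vague prompt with a refined one. Here ask_model is a hypothetical stand-in for whatever AI tool a class actually uses, not a real API:

```python
# Hypothetical helper standing in for any chat-style AI tool; it only
# echoes the prompt so the sketch runs without any external service.
def ask_model(prompt: str) -> str:
    return f"[model response to: {prompt!r}]"

# A vague prompt tends to produce generic output.
vague = "Tell me about volcanoes."

# A refined prompt states audience, format, and scope explicitly.
refined = (
    "Explain how shield volcanoes form for a 6th-grade reader, "
    "in three short paragraphs, and end with two review questions."
)

print(ask_model(vague))
print(ask_model(refined))
```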

8. Prompt

Definition: The input or instruction given to an AI model to guide its generation of content or response. It can be a question, a statement, or a set of parameters.

Classroom Example: A student provides the prompt “Write a short story about a talking cat who solves mysteries” to a generative AI tool.

Source: OpenAI. (n.d.). Prompt engineering. Retrieved from OpenAI Documentation

9. Hallucination

Definition: When an AI model, particularly a large language model, generates information that is factually incorrect, nonsensical, or unsupported by the provided source, while presenting it as fact.

Classroom Example: A student’s AI-generated report includes a fabricated historical event, demonstrating the need for critical fact-checking even with AI tools.

Source: Ji, Z., Lee, N., Frieske, R., Yu, T., Su, D., Xu, Y., … & Fung, P. (2023). Survey of hallucination in natural language generation. ACM Computing Surveys, 55(12), 1-38.

10. Algorithm

Definition: A set of well-defined, step-by-step instructions or rules that a computer follows to solve a problem or perform a task.

Classroom Example: Explaining how a search engine uses an algorithm to rank websites based on relevance to a student’s query.

Source: Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2009). Introduction to algorithms (3rd ed.). MIT Press.
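As a minimal sketch of the search-engine example above, the toy algorithm below follows fixed steps to rank pages by how many query words appear in each page; real search engines combine far more signals:

```python
def rank_pages(query: str, pages: dict[str, str]) -> list[str]:
    """Rank page titles by how many of the query's words appear in their text."""
    query_words = query.lower().split()
    scores = {
        title: sum(text.lower().split().count(word) for word in query_words)
        for title, text in pages.items()
    }
    # Final step of the algorithm: sort titles from highest score to lowest.
    return sorted(scores, key=scores.get, reverse=True)

pages = {
    "Volcanoes": "A volcano forms where magma reaches the surface...",
    "Earthquakes": "An earthquake is a sudden shaking of the ground...",
}
print(rank_pages("how does a volcano form", pages))   # ['Volcanoes', 'Earthquakes']
```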

11. AI Bias

Definition: Systematic and unfair prejudice in AI system outputs, often stemming from biases present in the data used to train the model, leading to discriminatory or inaccurate results.

Classroom Example: Discussing with students how an image recognition AI might misidentify certain groups of people if its training data was not diverse, highlighting the importance of critical evaluation of AI outputs.

Source: Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 77-91.

12. Algorithmic Bias

Definition: A specific type of AI bias where the unfair outcomes are a direct result of flaws or unintended consequences within the algorithm’s design or its training process, rather than solely the data.

Classroom Example: Analyzing how a student placement algorithm might inadvertently disadvantage certain demographics if its design prioritizes metrics that are biased.

Source: O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.

13. AI Ethics

Definition: The practice of developing and deploying AI systems responsibly, considering principles such as fairness, transparency, accountability, and privacy to ensure positive societal impact and prevent harm.

Classroom Example: A class debates the ethical implications of using AI for student grading, considering issues of fairness, transparency, and potential over-reliance on technology.

Source: UNESCO. (2021). Recommendation on the Ethics of Artificial Intelligence. Retrieved from UNESCO

14. Responsible AI

Definition: A framework and set of practices for designing, developing, and deploying AI systems in a way that is fair, accountable, transparent, and safe, minimizing risks and maximizing benefits for society.

Classroom Example: Implementing school policies for AI use that align with principles of responsible AI, ensuring student data privacy and equitable access to AI tools.

Source: OECD. (2019). Recommendation of the Council on Artificial Intelligence. Retrieved from OECD Legal Instruments

15. Transparency and Explainability

Definition: The ability to understand how and why an AI model makes certain decisions or predictions (transparency), and the capacity to articulate these reasons in a human-understandable way (explainability).

Classroom Example: When using an AI-powered math tutor, the system not only provides the correct answer but also explains the steps and reasoning behind it, helping students learn.

Source: Gunning, D., & Aha, D. W. (2019). DARPA’s explainable artificial intelligence (XAI) program. AI Magazine, 40(2), 44-58.

16. Explainable AI (XAI)

Definition: A set of techniques and methods that make the decisions and predictions of AI systems comprehensible to humans, often by providing insights into the model’s internal workings.

Classroom Example: Using an XAI tool to show students which features (e.g., specific words, sentence structures) an AI writing grader prioritized when evaluating an essay.

Source: Adadi, A., & Berrada, M. (2018). Peeking inside the black-box: A survey on Explainable Artificial Intelligence (XAI). IEEE Access, 6, 52138-52160.

17. Model Transparency

Definition: The degree to which an AI model’s internal workings, decision-making processes, and underlying logic are clear and understandable to humans.

Classroom Example: Teachers and students being able to access information about how an AI-driven educational game adapts difficulty levels, rather than it being a “black box.”

Source: Doshi-Velez, F., & Kim, B. (2017). Towards a rigorous science of interpretable machine learning. arXiv preprint arXiv:1702.08608.

18. AI Literacy

Definition: The ability to understand fundamental AI concepts, its capabilities and limitations, and to use AI tools effectively, responsibly, and ethically in various contexts.

Classroom Example: Integrating lessons on how AI works, its real-world applications, and the importance of digital citizenship when interacting with AI-powered tools, preparing students for an AI-driven world.

Source: International Society for Technology in Education (ISTE). (n.d.). ISTE standards for educators. Retrieved from ISTE

19. Media Literacy in the Age of AI

Definition: The ability to critically analyze and evaluate media content, particularly in the context of AI-generated or AI-influenced information, understanding its origins, biases, and potential manipulations.

Classroom Example: Teaching students to identify “deepfakes” or AI-generated news articles, and to verify information from multiple credible sources.

Source: National Association for Media Literacy Education (NAMLE). (n.d.). What is media literacy? Retrieved from NAMLE

20. Digital Citizenship

Definition: The responsible and ethical use of technology, encompassing online safety, privacy, digital etiquette, and understanding the impact of digital actions, including interactions with AI.

Classroom Example: Educating students on the importance of not sharing personal information with AI chatbots and understanding data privacy implications of educational apps.

Source: Common Sense Education. (n.d.). Digital citizenship curriculum. Retrieved from Common Sense Education

21. Data Privacy

Definition: The right of individuals to control their personal information and how it is collected, stored, used, and shared, especially in the context of AI systems that process vast amounts of data.

Classroom Example: Discussing with students and parents how educational AI platforms handle student data and the importance of understanding privacy policies.

Source: European Parliament. (n.d.). What is GDPR? Retrieved from European Parliament

22. Analytics

Definition: The systematic computational analysis of data or statistics, often used in AI to identify patterns, trends, and insights to inform decision-making.

Classroom Example: A school administrator uses analytics from a student information system to identify trends in attendance or academic performance.

Source: Davenport, T. H., & Harris, J. G. (2007). Competing on analytics: The new science of winning. Harvard Business School Press.

23. Learning Analytics

Definition: The measurement, collection, analysis, and reporting of data about learners and their contexts for purposes of understanding and optimizing learning and the environments in which it occurs.

Classroom Example: A teacher reviews learning analytics from an online course to see which topics students struggled with most, or which resources were most frequently accessed.

Source: Society for Learning Analytics Research (SoLAR). (n.d.). What is Learning Analytics? Retrieved from SoLAR
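A minimal sketch of the classroom example above, assuming quiz results have been exported as a simple table; pandas then shows which topics had the lowest success rates:

```python
import pandas as pd

# Hypothetical export from an online course: one row per question attempt.
attempts = pd.DataFrame({
    "student": ["ana", "ana", "ben", "ben", "cho", "cho"],
    "topic":   ["fractions", "decimals", "fractions", "decimals", "fractions", "decimals"],
    "correct": [0, 1, 0, 1, 1, 1],
})

# Average correctness per topic reveals where the class struggled most.
success_by_topic = attempts.groupby("topic")["correct"].mean().sort_values()
print(success_by_topic)   # fractions shows the lowest success rate in this toy data
```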

24. Predictive Analytics

Definition: A branch of advanced analytics that uses historical data and statistical algorithms to forecast future outcomes or behaviors.

Classroom Example: An AI system uses predictive analytics to identify students who are at risk of falling behind academically, allowing for early intervention.

Source: IBM. (n.d.). What is predictive analytics? Retrieved from IBM
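A minimal sketch of the classroom example above, using invented attendance and grade data with known outcomes; logistic regression is one common modeling choice here, not the only one:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [attendance rate, current grade average]
X_history = [[0.95, 88], [0.60, 55], [0.80, 72], [0.50, 40], [0.90, 91], [0.65, 60]]
y_history = [0, 1, 0, 1, 0, 1]   # 1 = fell behind later, 0 = stayed on track

model = LogisticRegression().fit(X_history, y_history)

# Estimated risk for two current students, so staff can intervene early.
new_students = [[0.70, 58], [0.92, 85]]
print(model.predict_proba(new_students)[:, 1])   # probability of falling behind
```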

25. Educational Data Mining

Definition: An interdisciplinary field that applies data mining techniques to educational data to discover patterns and insights that can improve teaching and learning.

Classroom Example: Researchers use educational data mining to understand how different teaching methods impact student engagement and learning outcomes.

Source: Baker, R. S. J. D., & Siemens, G. (2014). Educational data mining and learning analytics. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 253-274). Cambridge University Press.

26. Adaptive Learning

Definition: An educational approach that uses AI and machine learning to adjust the learning experience in real-time based on an individual student’s needs, pace, and performance.

Classroom Example: An adaptive math program provides more practice problems on fractions to a student who is struggling, while moving another student ahead to decimals.

Source: Pane, J. F., Steiner, E. D., Baird, M. D., Hamilton, L. S., & Pane, J. D. (2015). The promise of personalized learning: An overview of what works in K-12 education. RAND Corporation. Retrieved from RAND Corporation

27. Personalized Learning with AI

Definition: Tailoring the learning experience to individual students’ needs, preferences, and goals, often facilitated by AI systems that can recommend content, adjust pace, and provide customized feedback.

Classroom Example: An AI-powered platform creates a unique learning path for each student, suggesting specific articles, videos, and exercises based on their interests and learning style.

Source: Means, B., Bakia, M., & Murphy, R. (2013). Learning online: What research tells us about whether, when and how. Routledge.

28. AI-Enhanced Pedagogy

Definition: The integration of AI tools and methodologies into teaching practices to enhance instructional strategies, improve student engagement, and optimize learning outcomes.

Classroom Example: A teacher uses an AI tool to quickly analyze student responses to an open-ended question, allowing them to adapt their lesson plan for the next day based on common misconceptions.

Source: Chen, X., Xie, H., & Hwang, G. J. (2020). A review of artificial intelligence in education: Current trends and future directions. Journal of Educational Technology & Society, 23(3), 1-17.

29. Intelligent Tutoring Systems

Definition: AI-powered software designed to provide personalized instruction and feedback to students, mimicking the role of a human tutor by adapting to individual learning needs.

Classroom Example: A student uses an intelligent tutoring system for physics that provides step-by-step guidance and hints, and identifies specific areas where the student needs more help.

Source: Woolf, B. P. (2009). Building intelligent interactive tutors: Student-centered strategies for revolutionizing e-learning. Morgan Kaufmann.

30. Chatbots / Virtual Assistants

Definition: AI programs designed to simulate human conversation through text or voice, capable of answering questions, providing information, and performing tasks.

Classroom Example: A school website uses a chatbot to answer frequently asked questions from parents about enrollment or school events.

Source: Maedche, A., & Stohr, T. (2018). The future of conversational AI: Chatbots, virtual assistants and beyond. Business & Information Systems Engineering, 60(4), 365-370.

31. ChatGPT

Definition: A specific large language model developed by OpenAI, known for its ability to generate human-like text, engage in conversational dialogue, and perform various language-based tasks.

Classroom Example: Students use ChatGPT to brainstorm ideas for a historical essay or to get different perspectives on a complex topic, always with teacher guidance and critical evaluation.

Source: OpenAI. (2022, November 30). ChatGPT: Optimizing language models for dialogue. Retrieved from OpenAI Blog

32. OpenAI

Definition: An AI research and deployment company whose stated mission is to ensure that artificial general intelligence benefits all of humanity. It is known for developing models such as GPT-3, GPT-4, and DALL-E.

Classroom Example: Teachers might refer to OpenAI when discussing the developers behind popular AI tools like ChatGPT or when exploring the broader landscape of AI research.

Source: OpenAI. (n.d.). About OpenAI. Retrieved from OpenAI

33. Content Generation

Definition: The process by which AI systems create new text, images, audio, video, or other forms of media, often based on specific prompts or learned patterns.

Classroom Example: A teacher uses an AI content generation tool to quickly create diverse reading passages for students at different proficiency levels.

Source: Van der Lee, A., van der Hilst, E., van der Kaa, H., & van der Meer, P. (2020). AI in creative industries: A review. Journal of Cultural Economics, 44(4), 589-618.

34. Synthetically Generated Media

Definition: Media (images, audio, video) that has been created or significantly altered by AI, often to appear realistic, including “deepfakes” and AI-generated art.

Classroom Example: Analyzing examples of synthetically generated media in a media literacy class to teach students how to identify and critically evaluate AI-altered content.

Source: Westerlund, M. (2019). The emergence of deepfake technology: A review. Journal of Information Technology Case and Application Research, 21(4), 253-262.

35. Computer Vision

Definition: A field of AI that enables computers to “see” and interpret visual information from images and videos, allowing them to understand and process the visual world.

Classroom Example: An AI-powered app uses computer vision to identify different plant species from a photo taken by a student during a science field trip.

Source: Szeliski, R. (2010). Computer vision: Algorithms and applications. Springer Science & Business Media.

36. Neural Networks

Definition: A type of machine learning model inspired by the structure and function of the human brain, consisting of interconnected nodes (neurons) that process and transmit information.

Classroom Example: Explaining how a simple neural network can be trained to recognize handwritten digits, demonstrating a basic concept of how AI “learns.”

Source: Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press.
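A minimal sketch of the handwritten-digit example above, assuming scikit-learn is installed; it trains a small neural network on the library's built-in 8x8 digit images:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Small grayscale images of handwritten digits, flattened to 64 features each.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

# One hidden layer of 32 interconnected "neurons" is enough for this dataset.
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)
print("Test accuracy:", net.score(X_test, y_test))
```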

37. Supervised Learning

Definition: A type of machine learning where an algorithm learns from a labeled dataset (input-output pairs) to make predictions or classifications on new, unseen data.

Classroom Example: Training an AI to identify spam emails by feeding it a dataset of emails already labeled as “spam” or “not spam.”

Source: Alpaydin, E. (2020). Introduction to machine learning (4th ed.). MIT Press.
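A minimal sketch of the spam example above, assuming scikit-learn: the classifier learns from messages that already carry human-provided labels, then predicts labels for new messages:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Labeled training data: each input message is paired with its known label.
messages = [
    "Win a free prize now", "Claim your reward today",
    "Homework is due Friday", "Field trip forms attached",
]
labels = ["spam", "spam", "not spam", "not spam"]

# Convert words to counts, then learn which words signal each label.
classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(messages, labels)

print(classifier.predict(["Free prize if you reply today"]))   # ['spam']
```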

38. Unsupervised Learning

Definition: A type of machine learning where an algorithm learns from unlabeled data, identifying patterns, structures, or relationships within the data without explicit guidance.

Classroom Example: An AI system groups students into different learning styles or preferences based on their online activity, without being pre-told what those styles are.

Source: Hastie, T., Tibshirani, R., & Friedman, J. (2009). The elements of statistical learning: Data mining, inference, and prediction (2nd ed.). Springer.
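A minimal sketch of the classroom example above, assuming scikit-learn: k-means groups students by invented activity features, with no labels supplied in advance:

```python
from sklearn.cluster import KMeans

# Hypothetical per-student features: [videos watched, practice problems attempted]
activity = [[12, 5], [10, 4], [2, 30], [3, 28], [11, 6], [1, 25]]

# Ask for two groups; the algorithm discovers the grouping itself.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(activity)
print(kmeans.labels_)   # e.g. video-heavy vs. practice-heavy learners
```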

39. Reinforcement Learning

Definition: A type of machine learning where an agent learns to make decisions by performing actions in an environment and receiving rewards or penalties based on the outcomes.

Classroom Example: A gamified learning app uses reinforcement learning to adapt its challenges, rewarding students for correct answers and adjusting difficulty when they struggle.

Source: Sutton, R. S., & Barto, A. G. (2018). Reinforcement learning: An introduction (2nd ed.). MIT Press.
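A minimal, single-state (bandit-style) sketch of the idea rather than any real product: the agent nudges its value estimate for each action toward the rewards it actually receives:

```python
import random

actions = ["easy question", "hard question"]
q_values = {a: 0.0 for a in actions}      # the agent's current value estimates
learning_rate = 0.1

def reward_for(action: str) -> float:
    # Hypothetical environment: harder questions pay off more, but not always.
    return 1.0 if action == "hard question" and random.random() < 0.6 else 0.5

for _ in range(200):
    # Mostly pick the best-known action, but sometimes explore at random.
    explore = random.random() < 0.2
    action = random.choice(actions) if explore else max(q_values, key=q_values.get)
    reward = reward_for(action)
    # Q-learning-style update: move the estimate toward the observed reward.
    q_values[action] += learning_rate * (reward - q_values[action])

print(q_values)   # "hard question" ends up valued higher on average
```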

40. Transfer Learning

Definition: A machine learning technique where a model trained on one task is re-purposed or fine-tuned for a second, related task, often with less data required for the second task.

Classroom Example: An AI model initially trained to recognize animals in general is then fine-tuned with a smaller dataset to specifically identify local wildlife for a biology project.

Source: Pan, S. J., & Yang, Q. (2009). A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10), 1345-1359.

41. Model Fine-Tuning

Definition: The process of taking a pre-trained AI model and further training it on a smaller, specific dataset to adapt it to a new, more specialized task or domain.

Classroom Example: A teacher takes a general LLM and fine-tunes it with school-specific documents and curriculum materials to create a more relevant AI assistant for their students.

Source: Hugging Face. (n.d.). Fine-tuning a pretrained model. Retrieved from Hugging Face Documentation

42. Retrieval-Augmented Generation (RAG)

Definition: An AI architecture that combines the strengths of information retrieval with generative models. It first retrieves relevant information from a knowledge base and then uses that information to generate a response.

Classroom Example: An AI research assistant for students uses RAG to find relevant passages from a digital library and then synthesizes that information into a coherent summary, citing its sources.

Source: Lewis, P., Perez, E., Piktus, A., Petroni, F., Karpukhin, V., Goyal, N., … & Kiela, D. (2020). Retrieval-augmented generation for knowledge-intensive NLP tasks. Advances in Neural Information Processing Systems, 33, 9459-9474.
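A minimal sketch of the two-step pattern, with a toy word-overlap retriever and a placeholder generator; generate_answer stands in for whatever language model a real system would call:

```python
documents = {
    "photosynthesis.txt": "Photosynthesis converts sunlight, water, and CO2 into glucose.",
    "volcanoes.txt": "Volcanoes form where magma escapes through cracks in the Earth's crust.",
}

def retrieve(question: str, docs: dict[str, str], top_k: int = 1) -> list[str]:
    """Step 1: pick the passages that share the most words with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(docs, reverse=True,
                    key=lambda name: len(q_words & set(docs[name].lower().split())))
    return [docs[name] for name in ranked[:top_k]]

def generate_answer(question: str, passages: list[str]) -> str:
    """Step 2: placeholder for the generative model, which is given the retrieved text."""
    return f"Answer to {question!r}, grounded in: {passages}"

context = retrieve("How do volcanoes form?", documents)
print(generate_answer("How do volcanoes form?", context))
```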

43. Human-in-the-Loop (HITL)

Definition: An approach to AI development where human intelligence is incorporated into the machine learning process, often to provide feedback, validate data, or refine AI outputs.

Classroom Example: An AI grading tool flags essays that might contain AI-generated content, but a human teacher makes the final judgment after reviewing the flagged essays.

Source: Dell Technologies. (n.d.). What is Human-in-the-Loop (HITL)? Retrieved from Dell Technologies

44. Fact vs. AI-Generated Content

Definition: The critical distinction between verifiable factual information and content produced by AI models, which may include inaccuracies, fabrications (hallucinations), or biases.

Classroom Example: A lesson dedicated to teaching students strategies for discerning whether information found online or generated by an AI is factual and reliable, emphasizing cross-referencing and source evaluation.

Source: Center for Media Literacy. (n.d.). Five core concepts of media literacy. Retrieved from Center for Media Literacy

45. AI Readiness

Definition: The state of preparedness of individuals, organizations, or systems to effectively understand, adopt, and leverage AI technologies, including having the necessary skills, infrastructure, and policies.

Classroom Example: A school district assesses its AI readiness by evaluating teacher training, technology infrastructure, and curriculum integration plans for AI.

Source: World Economic Forum. (2020). The future of jobs report 2020. Retrieved from World Economic Forum

46. AI Curriculum Integration

Definition: The process of embedding AI concepts, tools, and ethical considerations directly into existing subject curricula across various disciplines.

Classroom Example: A social studies class explores how AI is used in urban planning, or a science class uses AI simulations to model ecosystems.

Source: UNESCO. (2019). Artificial intelligence in education: Compendium of promising initiatives. Retrieved from UNESCO Digital Library

47. AI Implementation Framework

Definition: A structured plan or set of guidelines that outlines the steps, resources, and considerations for successfully integrating AI technologies into an organization or educational setting.

Classroom Example: A school develops an AI implementation framework that includes pilot programs, teacher professional development, and student guidelines for AI use.

Source: European Commission. (2020). White paper on artificial intelligence: A European approach to excellence and trust. Retrieved from European Commission

48. Automated Grading

Definition: The use of AI or other software to automatically assess student work, such as multiple-choice questions, short answers, or even essays, often providing instant feedback.

Classroom Example: An online quiz platform uses automated grading to instantly score student responses and provide immediate feedback on correct and incorrect answers.

Source: Shermis, M. D., & Burstein, J. C. (Eds.). (2013). Handbook of automated essay evaluation: Current applications and future directions. Routledge.

49. Recommendation Engines

Definition: AI systems that predict user preferences and suggest items (e.g., books, movies, learning resources) that are likely to be of interest, based on past behavior or similar users.

Classroom Example: A digital library platform uses a recommendation engine to suggest books or articles to students based on their reading history and expressed interests.

Source: Ricci, F., Rokach, L., & Shapira, B. (Eds.). (2015). Recommender systems handbook (2nd ed.). Springer.
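A minimal sketch of one common approach, using cosine similarity between readers' rating patterns to suggest a book a similar student rated highly; all names and ratings are invented:

```python
import math

# Hypothetical ratings: student -> {book title: rating out of 5}
ratings = {
    "ana": {"Hatchet": 5, "Holes": 4, "Wonder": 5},
    "ben": {"Hatchet": 4, "Holes": 5},
    "cho": {"Wonder": 5, "Holes": 2},
}

def cosine(u: dict, v: dict) -> float:
    """Similarity between two students' rating patterns."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(student: str) -> list[str]:
    """Suggest books the most similar reader liked that this student hasn't rated."""
    others = {s: r for s, r in ratings.items() if s != student}
    nearest = max(others, key=lambda s: cosine(ratings[student], others[s]))
    return [book for book, score in others[nearest].items()
            if book not in ratings[student] and score >= 4]

print(recommend("ben"))   # ['Wonder'] -- liked by the reader most similar to Ben
```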

50. Independent and Self-Directed Learning Platforms

Definition: Educational platforms, often AI-enhanced, that empower students to take ownership of their learning by providing resources, pathways, and tools for self-paced and self-guided study.

Classroom Example: A student uses an AI-powered platform to explore a topic of personal interest, accessing curated resources and receiving personalized feedback without direct teacher intervention at every step.

Source: Garrison, D. R. (1997). Self-directed learning: Toward a comprehensive model. Adult Education Quarterly, 48(1), 18-33.

51. Self-Regulated Learning and AI

Definition: The concept of students actively monitoring and managing their own learning processes, with AI tools potentially assisting by providing feedback, goal-setting support, or progress tracking.

Classroom Example: An AI dashboard helps students track their study habits, identify areas where they lose focus, and suggest strategies for improving their self-regulation skills.

Source: Panadero, E. (2017). A review of self-regulated learning: Six models and four components. Educational Psychologist, 52(1), 1-21.

52. Sentiment Analysis

Definition: The use of natural language processing (NLP) to determine the emotional tone or sentiment (positive, negative, neutral) expressed in a piece of text.

Classroom Example: An AI tool performs sentiment analysis on student feedback surveys to quickly gauge overall satisfaction or identify common areas of concern.

Source: Liu, B. (2012). Sentiment analysis and opinion mining. Synthesis Lectures on Human Language Technologies, 5(1), 1-167. Morgan & Claypool Publishers.
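A deliberately tiny sketch of the idea using a fixed word list; real sentiment analysis relies on trained NLP models rather than hand-written lists:

```python
POSITIVE = {"great", "helpful", "enjoyed", "clear"}
NEGATIVE = {"confusing", "boring", "difficult", "rushed"}

def sentiment(text: str) -> str:
    """Label a comment positive, negative, or neutral by counting cue words."""
    words = text.lower().replace(".", "").replace(",", "").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

feedback = [
    "The labs were great and the examples were clear.",
    "Lectures felt rushed and a bit confusing.",
]
for comment in feedback:
    print(sentiment(comment), "-", comment)
```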

53. Digital Learning Ecosystems

Definition: An interconnected network of digital tools, platforms, content, and services that support and enhance the learning process, often leveraging AI for personalization and efficiency.

Classroom Example: A school’s digital learning ecosystem includes a learning management system, various AI-powered educational apps, and online collaboration tools, all working together to support student learning.

Source: Bates, A. W. (2019). Teaching in a digital age: Guidelines for designing teaching and learning (2nd ed.). Tony Bates Associates Ltd.

54. Podcasting (AI-Enhanced)

Definition: The creation or enhancement of podcasts using AI tools for tasks such as script generation, voice synthesis, audio editing, transcription, or content summarization.

Classroom Example: Students use AI tools to transcribe their podcast interviews, making it easier to edit and add captions, or to generate background music.

Source: General field literature on AI-assisted audio and podcast production (no single standard citation available).

55. Video Streaming (AI-Supported)

Definition: The delivery of video content over the internet, with AI enhancing aspects like content recommendations, adaptive bitrate streaming, automated captioning, or content moderation.

Classroom Example: An educational video platform uses AI to automatically generate accurate captions for lectures or to recommend related videos based on a student’s viewing history.

Source: Khan, S. (2020). AI in video streaming: The next frontier. IEEE Consumer Electronics Magazine, 9(2), 20-25.

56. Metadata

Definition: Data that provides information about other data. In the context of AI, it can describe datasets, models, or generated content, aiding in organization, search, and understanding.

Classroom Example: When uploading student projects to a digital portfolio, adding metadata like “subject,” “grade level,” and “learning objective” makes them easier to categorize and search later.

Source: NISO. (2004). Understanding metadata. National Information Standards Organization.
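A minimal sketch of the classroom example above: a small JSON-style metadata record stored alongside a student project (the field names are illustrative, not a formal standard):

```python
import json

# Metadata describes the project file; it is not the project content itself.
project_metadata = {
    "title": "Water Cycle Explainer Video",
    "subject": "Science",
    "grade_level": 6,
    "learning_objective": "Model the stages of the water cycle",
    "created": "2025-03-14",
    "file": "water_cycle.mp4",
}

print(json.dumps(project_metadata, indent=2))
```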

57. Multimodal AI

Definition: AI systems that can process and integrate information from multiple types of data (modalities), such as text, images, audio, and video, to understand and generate more comprehensive responses.

Classroom Example: An AI tutor that can understand a student’s spoken question, analyze a diagram they’ve drawn, and then provide a textual explanation combined with a relevant image.

Source: Baltrušaitis, T., Ahuja, C., & Morency, L. P. (2017). Multimodal machine learning: A survey and taxonomy. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(3), 429-446.

© 2025 AI in Education Dictionary. All rights reserved.

Designed to help educators build a working understanding of AI in the classroom.