...
## Introduction to AI in Education

AI is transforming education, reshaping how we teach, learn, and prepare students for the workforce. But what does that mean for faculty? How does AI work, and how can it be integrated responsibly into the classroom? In this post, we’ll explore:
## What is AI?

When people hear "AI," they often think of ChatGPT, but artificial intelligence extends far beyond chatbots. AI is already embedded in everyday technology, from facial recognition systems to voice assistants like Siri and Alexa. Some of the most common types of AI include:
## Large Language Models (LLMs) and Chatbots

The type of AI we focus on in this discussion is the Large Language Model (LLM), the technology behind chatbots like ChatGPT and Microsoft’s Copilot. These models generate human-like responses based on vast amounts of training data. While ChatGPT is the most widely recognized, several other AI chatbots are making an impact:
## How Do Large Language Models Work?

Many people think of AI chatbots as a "smarter Google," but they function very differently. Search engines retrieve existing content, whereas LLMs predict the next word in a sequence based on probability. Think of it like predictive text on a smartphone, only at a much larger scale. These models generate responses by drawing from massive datasets, which means:
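As a toy illustration of this "predict the next word" idea, the sketch below builds a tiny bigram model from a made-up corpus. The corpus and helper function are hypothetical examples for intuition only; real LLMs use neural networks trained on billions of parameters, not simple word counts.

```python
import random
from collections import Counter, defaultdict

# A tiny made-up corpus standing in for an LLM's training data.
corpus = "the cat sat on the mat the cat ate the fish".split()

# For each word, count which words follow it and how often.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most probable next word, like predictive text on a phone."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

The key point the sketch captures: the model never "looks up" an answer the way a search engine does; it continues a sequence based on patterns seen during training.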
## Memory and Context Windows

One key limitation of LLMs is their context window: the amount of recent input they can consider when generating a response. Once that window is exceeded, the AI no longer "remembers" previous interactions. For long or complex discussions, users may need to reintroduce context to keep conversations on track.

## What Happens to Data Entered in an AI Chat?

One of the biggest concerns about AI is data privacy. What happens to the information users input into a chatbot?
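The context-window limitation described above can be sketched in a few lines: when the conversation exceeds the window, the oldest messages are the ones that fall out of "memory." The word-count "tokenizer" and sample chat here are illustrative assumptions; real chatbots use subword tokenizers and much larger windows.

```python
def truncate_to_window(messages, max_tokens):
    """Keep only the most recent messages that fit in the context window.
    Token cost is approximated by word count (a stand-in for a real tokenizer)."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk backward from the newest message
        cost = len(msg.split())
        if used + cost > max_tokens:
            break  # everything older than this point is "forgotten"
        kept.append(msg)
        used += cost
    return list(reversed(kept))

chat = [
    "Hello, can you help with my syllabus?",
    "Sure, what course is it for?",
    "Introduction to Biology, first year.",
    "Great, let's outline the weekly topics.",
]
# With a small window, the earliest messages fall outside the model's "memory".
print(truncate_to_window(chat, max_tokens=12))
```

This is why, in a long session, a chatbot may suddenly lose track of details you gave it early on, and why reintroducing that context restores the conversation.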
## Institutional vs. Public AI Use

At Lambton College, faculty and staff handling sensitive student data must use Microsoft’s Copilot and be logged in with their College credentials. Entering student information into ChatGPT or other public AI tools violates privacy policies. For example, using ChatGPT to check a student’s essay for grammar would be a privacy violation, whereas using AI to generate discussion prompts would not. Faculty should consult IT if unsure about compliance.
...