As of May 2026, AI in education has shifted from an experimental "novelty" phase into a regulated, core component of global school systems. The focus has moved from merely using chatbots to establishing long-term governance, AI literacy, and skills-based credentials.

🎓 1. The Move to "Governance & Trust"

In 2026, the primary conversation is no longer whether AI should be used, but how it is governed.

• Interoperability Standards: Institutions now demand that edtech tools follow strict interoperability rules (such as those from 1EdTech). This ensures that AI tools from different providers can "talk" to one another, making student data portable and preventing schools from being locked into a single vendor's ecosystem.
• Privacy Guardrails: New rubrics, such as the TrustEd Generative AI Data Privacy Rubric, are being implemented to help schools decide which student data is safe to share with AI models and which must remain strictly private.

📜 2. Policy & Legislation Trends

Legislators are moving quickly to keep up with classroom realities:

• The "Human-in-the-Loop" Mandate: Several US states (including Oklahoma and Maryland) now require human oversight for high-stakes decisions. AI is legally prohibited from being the sole basis for grading, disciplinary actions, or student placement.
• AI Literacy as a Graduation Requirement: AI literacy is becoming a core competency. States like New Jersey and California are incorporating AI ethics and prompt engineering into the K-12 curriculum. Notably, the 2029 PISA exam (the global benchmark for 15-year-olds) is set to assess AI literacy for the first time.

🛠️ 3. Emerging Tools & Student Use

The "Big Three" (ChatGPT, Claude, and Gemini) remain dominant, but specialized tools are gaining ground:

• Personalized Tutoring: Platforms like NotebookLM and Kyron are being used to create personalized learning "sandboxes" in which students interact with their specific course materials rather than the open internet.
• The Feedback Gap: A 2026 HEPI report found that while 95% of students use AI in their studies, only 38% feel their institution provides adequate tools or training. Closing this "shadow AI" gap is now a major focus for universities.