AI Governance in EdTech - Spring 2026
A PDF covering the three jurisdictions that matter most, what's in force now, and what's coming before the end of the year.
AI regulation has arrived in your classroom. Most schools and EdTech founders haven’t noticed.
Everything an EdTech founder or school leader actually needs to know about AI regulation right now, across the EU, the US, and the UK. It’s out today as a PDF, attached to this post.
No jargon. No padding.
The picture across those three jurisdictions is genuinely complicated, and some of it has already arrived without much fanfare. The EU banned emotion recognition in educational settings in February 2025. Most EdTech products affected by that ban are still running. Updated COPPA rules hit in April 2026. The UK changed the legal basis for automated student decisions in February. And the EU’s hard deadline for high-risk AI compliance, which explicitly covers admissions, assessment, and exam proctoring, is August 2, 2026.
Who is this report for?
The briefing covers what each of those developments means in practice, for two audiences:
Founders building AI-enabled EdTech products, and
Schools, colleges, and universities buying and deploying them.
Both have obligations, and both are covered separately.
There’s also a 90-day action plan on the final pages, split by audience. If you do nothing else, map your AI features against the Annex III high-risk categories this week, and ask every EdTech vendor you’re currently using whether they have removed emotion recognition functionality.
Read this and...
You’ll know exactly which AI features in EdTech products trigger high-risk classification under the EU AI Act, and what full compliance requires before August 2, 2026.
You’ll understand why schools are not just customers in this regulatory picture. Under EU law, institutions deploying non-compliant AI tools carry their own liability alongside the vendor.
You’ll get a plain-English map of where the EU, US, and UK actually stand right now on AI in education, including what is already in force and what is still coming.
You’ll find out whether any of the AI tools you’re currently running contain features that have been prohibited in EU educational settings since February 2025.
You’ll understand what the COPPA 2026 changes actually require, why the LLM API call is the most common undetected FERPA exposure in EdTech, and what to do about both.
You’ll know what Ofsted inspectors are now asking schools about AI governance, what a ratified AI policy needs to cover, and what the DfE’s January 2026 product safety standards require from vendors.
You’ll know what to watch as the Education Select Committee’s current inquiry works toward a report, and why its recommendations are likely to become statute.
The PDF is attached below. If you are a free subscriber, you can upgrade to access the report.
Note that this briefing is for informational purposes only and doesn’t constitute legal advice. For questions specific to your work, talk to a qualified lawyer in your jurisdiction.
Here is the link ….