Government Launches £2.4 Million AI Tutoring Programme (Testing starts this summer)
On 16 April 2026, the Department for Education (DfE) and the Department for Science, Innovation and Technology announced a major initiative inviting EdTech companies and AI labs to develop safe, personalised AI tutoring tools designed to improve learning outcomes, particularly for disadvantaged pupils.
Successful tools will be available nationally from 2027, and the programme targets pupils in Years 9 and 10 across English, Maths, Science and Modern Foreign Languages.
Minister for Digital Government Ian Murray said, “The best educational support outside school has too often been the privilege of those who can afford it. AI gives us a genuine opportunity to change that – to put the kind of personalised, one-to-one tutoring into the hands of all pupils, regardless of their background, and give teachers the best technology to complement their work.”
Education Minister Olivia Bailey emphasised, “Every tool must be built with teachers, tested rigorously, and held to the highest safety standards before it reaches the country’s classrooms.”
–
What this means for schools
The AI tutoring tools will adapt to individual pupils’ needs, providing extra help when they get stuck and identifying where they need more practice. Tools will be co-designed with teachers to ensure they support classroom practice, providing clear, actionable insights into how children are progressing and where they may be struggling.
Private tutoring can cost hundreds or even thousands of pounds a year, and evidence shows it can accelerate learning by up to five months. These tools will help level the playing field and could benefit up to 450,000 disadvantaged pupils a year.
–
Safety at the forefront
Child safety is at the heart of the programme. All tools must meet rigorous UK safety standards, align with the national curriculum, and be co-designed with educators. The government confirmed:
- No identifiable pupil data will be shared publicly
- Pupil work will not be used to train AI models without parental permission
- All tools must meet DfE’s Generative AI Product Safety Standards (no anthropomorphisation, no manipulative or persuasive design, no dark patterns, robust mental health and safeguarding protocols)
- New national benchmarks are being developed to check that AI tools are accurate, age-appropriate and safe
Nav Sanghara, CEO of Woodland Academy Trust, welcomed the approach: “This is a welcome step towards a more thoughtful and evidence-informed approach to AI in education. Co-designing tools with teachers is critical to ensuring they are safe, curriculum-aligned, and genuinely improve outcomes for pupils.”
–
Curriculum changes: AI literacy is now compulsory
All of this follows recent announcements from the DfE on reforms to the national curriculum, confirming that Computing should remain the primary subject responsible for digital literacy, critical thinking and foundational AI concepts. These are not “optional extras” – they are essential outcomes that all pupils should gain through Computing.
The Government announced plans to replace the existing GCSE Computer Science with a broader Computing GCSE that brings together AI and data skills. It will also explore a new post-16 Level 3 qualification in Data Science and AI.
The current curriculum does not mention AI at all. The reformed version introduces age-appropriate AI concepts from KS2 onwards, helping children to understand what AI is, how it makes decisions, where it’s already present in their lives, and how to use AI tools responsibly.
–
Growing up in an online world
The government’s March 2026 “Growing up in the online world” national consultation, launched by the Department for Science, Innovation and Technology, specifically addresses AI chatbots and children’s digital skills.
The consultation seeks views on which AI chatbot features are most risky for children, including how they mimic relationships, use flattering language, recall interactions across sessions, and generate mature content. Internet Matters found 64% of 9-17-year-olds use AI chatbots, and of those, 23% have turned to chatbots for advice, with 12% saying they use them because they have no one else to talk to.
The consultation runs until June 2026, and the government has committed to taking swift action based on its findings.
–
The Student Perspective
Oxford University Press research with UK secondary students found that over half of pupils are unable to spot misinformation generated by AI, and 48% want support from teachers to help them judge which AI content is trustworthy.
Among their top concerns, 60% worry that AI tools encourage copying rather than original work, 51% worry that AI resources may be biased or reinforce untrue stereotypes, and 51% are concerned about AI accuracy. But students are not anti-AI.
Nine in ten students say AI has helped them develop a skill related to schoolwork, including problem solving (18%), generating new ideas (15%), and revision and exam preparation (13%). However, 62% felt AI had negatively impacted their skill development, with 26% saying it was too easy to let AI do the work for them.
–
The 2026 Policy Framework for Schools
Under emerging guidance, all schools must adopt an AI use policy by the end of 2026, aligned with DfE/Ofsted guidance covering data privacy, bias, and intellectual property.
By late 2026, the DfE is expected to update the National Curriculum with AI literacy standards defining what students should know at each Key Stage.
–
Let’s get you started: The inevitable foundation
We can’t solve your pedagogical challenge around AI literacy; that’s work you’ll do with your teaching teams. But we can help you build the infrastructure foundation your school needs.
What does that look like?
- Network capacity assessment for AI applications (including AI tutoring tools)
- Filtering configuration for appropriate AI access
- Data privacy compliance check for AI tool usage
- Cyber security audits to ensure your systems, staff and students are protected
We can ensure your infrastructure supports AI literacy now and into the future.
Get in touch using the form below, and we’ll help you get started.
SOURCES
- Edtech and AI companies invited to help build safe AI tutoring tools for disadvantaged pupils
- Curriculum and Assessment Review Final Report (November 2025)
- Growing up in the online world: a national consultation (March 2026)
- Reformed school curriculum announcement (November 2025)
- Student research on AI in the classroom
- Oxford Internet Institute – Professor Rebecca Eynon commentary
- Government curriculum announcement response
- Computing Curriculum Review 2026
- National Centre for Computing Education
- AI Literacy Training for Schools: Strategic Framework
- Internet Matters – Children’s AI chatbot usage data