AI and the Digital Divide: Ensuring Every Child Benefits, Not Just the Privileged Few
Executive Summary

The accelerating adoption of artificial intelligence (AI) in schools and homes promises to transform teaching and learning. Yet it also risks amplifying existing inequalities if access, skills and protections are not shared fairly. Nearly half of UK households with children (45 %) fall below the minimum digital living standard[1], meaning they lack adequate devices, connectivity or skills. Globally, two‑thirds of school‑age children—around 1.3 billion—have no internet connection at home[2]. These gaps mean pupils from disadvantaged backgrounds fall behind just as AI is becoming embedded in homework, assessments and personalised tutoring tools. The moral and educational duty of trusts and policy‑makers is clear: ensure that every child, not only the privileged few, can benefit from AI.

This article frames equity as a systemic and ethical imperative. It synthesises research from UNESCO, OECD, UNICEF, Ofcom, the Education Endowment Foundation (EEF) and the Good Things Foundation, alongside UK policy developments such as the Online Safety Act and the Department for Education’s (DfE) generative‑AI guidance. We propose a five‑part Equity‑by‑Design framework to close the divide—Access, Ability, Assurance, Application and Accountability—and offer concrete strategies, metrics and case studies. Our goal is to equip multi‑academy trusts, school leaders and policy‑makers with a blueprint to ensure AI strengthens, rather than erodes, educational equity.

1. The New Digital Divide: From Devices to Dignity

1.1 A Global Perspective

UNESCO’s first global guidance on generative AI in education warns that AI systems can exacerbate inequalities unless they are designed with inclusivity at their core. The guidance calls on governments to ensure universal internet access, eliminate bias in AI systems, monitor and validate AI outputs, build teacher capacity and promote plural opinions[3]. These recommendations recognise that technology can harm as well as help. Evidence from the OECD supports this caution: excessive use of devices for leisure undermines attention and learning, with students distracted by peers’ phones scoring significantly lower in mathematics[4]. These insights underscore that simply placing AI in classrooms without addressing access, pedagogy and safeguards can deepen rather than reduce the divide.

The digital divide is deepest in low‑income countries. UNICEF and the International Telecommunication Union (ITU) report that two‑thirds of school‑age children—1.3 billion—lack an internet connection at home[2]. The digital divide mirrors economic divides: only 16 % of children from poor households have home internet, compared with 58 % in rich households[2]. Geographic disparities are stark; in sub‑Saharan Africa, 95 % of children remain unconnected[2]. Lack of connectivity isolates children during school closures and prevents them from competing in a digital economy[2]. Addressing these inequalities requires infrastructure investment and innovative financing, such as ITU/UNICEF’s Giga initiative, which has mapped more than 800 000 schools and is developing business models to connect them[5].

1.2 The UK Picture

Britain is often described as a digital nation, yet its divide is glaring. Thirty‑four per cent of parents with school‑age children say their child lacks continuous access to an appropriate device at home for online schoolwork, and 13 % cannot resolve the problem[6]. A University of Liverpool, Loughborough University and Good Things Foundation study found that 45 % of households with children fall below the minimum digital living standard (MDLS)[1]. Families from low socio‑economic groups, minority ethnic communities and parents with disabilities are most likely to be excluded[1]. Nearly one in five households lack necessary equipment or services[1], while 38 % of households lack essential online skills[1]. Digital deprivation amplifies other exclusions: children cannot access homework portals, parents cannot apply for benefits online and families miss out on social tariffs.

The Good Things Foundation’s Digital Nation 2025 offers a recent snapshot. It reports that 3.7 million families are below the minimum digital living standard and 7.9 million adults lack basic digital skills[7]. Some 1.9 million households struggle to afford mobile contracts and 1.6 million adults have no smartphone, tablet or laptop[8]. Of those offline, 33 % find it hard to use council services, and 29 % of older people feel left behind as services move online[9]. Only 10 % of eligible households have taken up social tariffs[10]. The foundation also reports that its more than 7 000 community access points (digital inclusion hubs) have distributed 64 000 devices and saved carbon equivalent to planting 537 000 trees[11]. These statistics highlight both the scale of exclusion and the potential of community action.

1.3 Affordability & Skills

Affordability is a major barrier. Ofcom’s 2024–25 communications affordability tracker found that 26 % of UK households—around six million—struggled to afford communication services, with 9 % of mobile customers and 8 % of broadband subscribers unable to pay[12]. Only one‑third of eligible decision‑makers were aware of social tariffs[12]. This low awareness suggests that signposting through schools and trusts could dramatically increase uptake. Meanwhile, the Lloyds Consumer Digital Index 2023 reports that 16 % of adults (about 8.5 million people) lack foundation‑level digital skills; among those without these skills, 59 % do not own a device and 62 % lack home internet[13]. Since parents and teachers provide much of children’s digital guidance, gaps in adult skills hinder children’s progress.

1.4 AI Adoption and Emerging Risks

Generative AI is penetrating homes and classrooms at unprecedented speed. Internet Matters’ 2025 study found that 44 % of children actively engage with generative‑AI tools and 54 % of child AI users employ them for homework or schoolwork[14]. Yet 60 % of parents said their child’s school had not informed them about plans to use generative AI, and the same proportion of schools had not spoken to pupils about AI[15]. Vulnerable children are particularly at risk; 41 % of vulnerable children used ChatGPT to complete homework[16]. The research advises government to ensure digital inclusion and equitable access as part of AI guidance[17], and warns that children on free school meals have less access to technology and data[18].

Ofcom’s 2024 media use report (not reproduced here for brevity) echoes this picture, finding that nearly half of children use AI tools, often with little oversight. Without deliberate strategies, AI will become another domain where privileged students gain support while others fall further behind.

2. Moral and Policy Duty of Trusts

2.1 Equity as a Moral Imperative

Education trusts and governing bodies have a legal duty to provide equal opportunities. In the context of AI, this duty extends beyond physical access to safeguarding, curriculum design and long‑term outcomes. UNESCO and UNICEF emphasise that AI must respect human rights, promote inclusion and protect children’s dignity[3][2]. They urge policymakers to address affordability, safety and skills alongside connectivity[2]. Without these measures, AI will reinforce the digital divide.

2.2 Regulatory Landscape: Online Safety Act and Ofcom Codes

The UK’s Online Safety Act (2023) imposes a statutory duty of care on online platforms to protect children. Ofcom’s Protection of Children Codes of Practice, finalised in April 2025, translate this duty into concrete measures. Drawing on consultations with 27 000 children and 13 000 parents, the codes demand a safety‑first approach[19]. Key measures include:

1. Safer feeds: Services using recommendation algorithms must filter harmful content so that children’s feeds are not seeded with suicide, self‑harm, eating disorders, pornography or extremist material[20].

2. Effective age checks: High‑risk services must implement robust age assurance to prevent children from accessing inappropriate content[21].

3. Fast action: Platforms must quickly review and remove harmful content and have named accountability structures[22].

4. Child control & support: Children should have tools to block, mute or decline group chats and receive support after encountering harmful content[23].

5. Strong governance: Services must appoint a named person responsible for children’s safety and periodically review risks[24].

These codes complement existing requirements for pornographic sites and create a new era of child‑safety regulation[25]. For schools and trusts, they signal that AI platforms integrated into learning must meet safety standards. They also underline the importance of robust filtering and monitoring systems in schools.

2.3 Government Support and SEND Pilot

The DfE recognises AI’s potential to support learning for all pupils. In its 2025 policy paper on generative AI, it argues that safe AI adoption, accompanied by the right infrastructure, can help every child achieve regardless of background[26]. To address barriers, the government launched a £1.7 million pilot of assistive‑technology lending libraries in June 2025, aimed at special educational needs and disabilities (SEND) pupils. Up to 4 000 schools will be able to borrow tools such as reading pens and dictation devices to support dyslexia, autism and ADHD[27]. The pilot uses a “try before you buy” model, enabling schools to test devices before investing[28]. Early results are promising: 86 % of staff reported improved behaviour and 89 % saw increased confidence among SEND pupils after introducing assistive technology[29]. This pilot illustrates how shared device libraries and targeted interventions can overcome access barriers and support inclusion.

2.4 Pedagogy First: Lessons from the EEF

Technology alone does not improve learning; pedagogy does. The EEF’s Using Digital Technology to Improve Learning guidance stresses that schools must consider how technology will improve teaching and learning before introducing it[30]. The report notes that buying a tablet for every pupil is unlikely to boost attainment unless devices are used purposefully to increase practice, feedback and precise assessment[31]. In other words, digital tools succeed when they are integrated into evidence‑based teaching strategies. The EEF identifies four dimensions where technology can have impact: (1) improving the quality of explanations and modelling; (2) enhancing the quantity and quality of pupil practice; (3) enabling better assessment and feedback; and (4) facilitating data‑driven decision‑making[32]. Importantly, the EEF emphasises that implementation and teacher training are crucial[33]. This aligns with OECD findings that unsupervised device use can harm learning[34].

3. Equity‑by‑Design Framework

To close the digital divide while embracing AI, we propose a five‑part Equity‑by‑Design (EBD‑AI) model. Adapted from global guidance and the research synthesised above, this framework helps trusts systematically address access, skills, safety, pedagogy and accountability.

3.1 Access: Devices, Data and Connectivity

Devices and data are the entry ticket to AI. Across the UK, an estimated 600 000 young people lack home internet or a suitable device (a Good Things Foundation figure cited in multiple local‑authority reports). To address this, trusts should:

· Establish device libraries. Draw inspiration from the SEND assistive‑technology pilot and the Rochdale Digitech Library, which lets residents borrow laptops and tablets for up to nine weeks and provides free SIM cards via the National Databank[35]. Trusts can create similar schemes, prioritising pupil‑premium and SEND learners. A “try before you buy” model reduces waste and ensures devices meet pupils’ needs[28].

· Partner with refurbishers and councils. Good Things Foundation’s digital inclusion network has collected 64 000 devices for redistribution[11]. Schools can work with local authorities and corporate donors to refurbish and distribute laptops or tablets, reducing e‑waste and bridging gaps.

· Tackle data poverty. The National Databank offers free mobile SIM cards and data packages for people without internet access. Since its launch, over 250 000 data packages have been distributed and 89 % of recipients feel more digitally able[36]. Trusts and community hubs should join the Databank and proactively enrol eligible families. Additionally, schools can promote social tariffs; only 10 % of eligible households currently use them[10].

· Set connectivity standards. Following ITU/UNICEF’s Meaningful School Connectivity metrics, trusts should aim for at least 50 Mbps per 30 concurrent learners and latency below 50 ms. Where broadband is insufficient, schools can provide vouchers for mobile data or use satellite broadband pilots.
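The bandwidth target above can be expressed as a simple check. The sketch below is purely illustrative: the function name and the encoding of the thresholds are our own framing, not an official ITU/UNICEF tool.

```python
# Illustrative check against the connectivity targets cited above:
# at least 50 Mbps per 30 concurrent learners, latency below 50 ms.
# (Hypothetical helper for planning purposes, not a standard audit tool.)

def meets_connectivity_standard(down_mbps: float, latency_ms: float,
                                concurrent_learners: int) -> bool:
    """Return True if measured bandwidth and latency meet the target."""
    required_mbps = 50 * (concurrent_learners / 30)  # scale the 50 Mbps / 30-learner baseline
    return down_mbps >= required_mbps and latency_ms < 50

# Example: a site with 90 concurrent learners on a 120 Mbps line (35 ms latency)
print(meets_connectivity_standard(120, 35, 90))  # → False: 90 learners need 150 Mbps
```

A trust could run such a check per school during the digital equity audit to flag sites needing upgrades.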

3.2 Ability: AI Literacy and Teacher Capacity

Teachers, parents and pupils need new skills to navigate AI. The Lloyds Consumer Digital Index shows that 16 % of adults lack foundation‑level digital skills[13]. Without adult expertise, children cannot receive safe guidance. Internet Matters research reveals that while 44 % of children actively use generative AI, 60 % of parents and 60 % of schools have not discussed AI use with children[15]. To build ability:

· Provide AI literacy modules for pupils, covering how generative models work, prompt engineering, fact‑checking, bias, privacy, mental‑health impacts and ethical use. UNESCO’s guidance emphasises developing AI competencies for learners and avoiding dependency on proprietary systems[3].

· Invest in teacher CPD. Professional development should include designing AI‑infused lessons aligned with the EEF’s pedagogical recommendations, understanding algorithmic bias, and safeguarding. Teachers should practise with low‑stakes examples before integrating AI into high‑stakes assessments.

· Engage parents. Offer workshops using resources from organisations like Internet Matters, which advise parents to talk with children about AI and explore tools together[37]. Parental involvement increases awareness and ensures consistent messaging across school and home.

· Target support to priority groups. Vulnerable children on free school meals have less access to AI tools[18]. Trusts should allocate more devices, data and training to these pupils and monitor usage gaps.

3.3 Assurance: Safety, Privacy and Rights

As AI systems become ubiquitous, children’s safety and rights must be central. The Online Safety Act and Ofcom codes require platforms to filter harmful content, implement age checks and provide reporting mechanisms[38]. Trusts must ensure that AI tools used in schools comply with these regulations. Key actions include:

· Implement robust filtering and monitoring. Ensure AI chatbots and search functions used by pupils cannot generate harmful or explicit content. Configure recommender algorithms to block hateful or dangerous outputs[39].

· Strengthen data privacy. Protect students’ data under GDPR and the Data Protection Act. Limit data collection, anonymise records and demand transparency from AI vendors about data usage.

· Develop risk‑assessment protocols. Conduct regular audits of AI tools to assess biases and hallucination risks. Involve safeguarding leads and IT security specialists.

· Align policies with children’s rights. UNICEF emphasises that AI must ensure children’s participation, protection and provision rights[2]. Trust policies should include explicit commitments to equity, accessibility and wellbeing.

3.4 Application: Pedagogy‑First Use of AI

AI has enormous potential to personalise learning, provide feedback and free teacher time. However, as the EEF warns, technology must serve pedagogy, not the other way around[30]. Trusts should:

· Align AI tools with evidence‑based pedagogy. Use generative AI to enhance explanations (e.g., summarising complex concepts), modelling (e.g., scaffolding examples), retrieval practice (e.g., low‑stakes quizzes) and formative assessment[32].

· Avoid distraction and misuse. OECD data show that unsupervised device use leads to distraction and lower performance[4]. Schools should set clear rules for device use, integrate AI into structured learning and turn off notifications during lessons.

· Use AI to support differentiation. Adaptive tutoring systems can help struggling learners catch up and challenge advanced students. For SEND pupils, tools like reading pens and dictation software, as piloted by the DfE, can transform access to the curriculum[27].

· Collaborate with AI ethically. Encourage students to treat AI as a co‑pilot rather than a solution. Teach them to verify outputs, cite sources and understand limitations.

3.5 Accountability: KPIs and Public Dashboards

Equity requires transparency and measurement. Trusts should adopt Key Performance Indicators (KPIs) and publish dashboards to track progress. Suggested metrics include:

· Access metrics: Device‑to‑pupil ratio; number of devices loaned via libraries; proportion of pupils accessing the National Databank; average bandwidth per learner; social‑tariff uptake rates.

· Ability metrics: Percentage of staff completing AI CPD; percentage of pupils completing AI literacy modules; number of parent workshop attendees; skills improvement (measured via digital‑skills assessments).

· Assurance metrics: Number of safeguarding incidents related to AI; compliance with Ofcom filtering standards; number of data privacy breaches.

· Application metrics: AI tool usage rates by pupil‑premium status; improvement in reading and maths attainment (using EEF’s months‑of‑progress framework); reduction in teacher workload through automation.

· Programme ROI metrics: Cost per additional device distributed; cost per gigabyte provided via Databank; improvements in attendance and attainment among participants (based on Digital Poverty Alliance or local evaluations).

These KPIs can be incorporated into public dashboards to build accountability and demonstrate return on investment. Trusts should aim to conduct randomised evaluations where feasible, following EEF evaluation protocols.
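The access-dimension KPIs lend themselves to a simple published data structure. A minimal sketch in Python follows; all field names and the sample figures are assumptions for illustration, not a standard reporting schema.

```python
# Illustrative sketch of access KPIs a trust dashboard might publish.
# Field names and sample values are hypothetical.

from dataclasses import dataclass

@dataclass
class AccessKPIs:
    pupils: int
    devices: int
    devices_loaned: int
    eligible_for_social_tariff: int
    on_social_tariff: int

    @property
    def device_to_pupil_ratio(self) -> float:
        return self.devices / self.pupils

    @property
    def social_tariff_uptake(self) -> float:
        return self.on_social_tariff / self.eligible_for_social_tariff

kpis = AccessKPIs(pupils=1200, devices=900, devices_loaned=150,
                  eligible_for_social_tariff=300, on_social_tariff=30)
print(f"{kpis.device_to_pupil_ratio:.2f}")  # → 0.75
print(f"{kpis.social_tariff_uptake:.0%}")   # → 10%, mirroring the national uptake figure
```

Publishing such figures termly, broken down by pupil‑premium status, would make gaps visible and trackable.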

4. What Works: Case Studies and ROI

4.1 Device Lending and Libraries

Rochdale Digitech Library provides a model of community‑based lending. Residents can borrow laptops or tablets for up to nine weeks and renew every three weeks[35]. The library also provides free SIM cards and data for up to six months through the National Databank[35]. Such programmes reduce digital exclusion and build trust between schools and communities. Impact data from the Digital Poverty Alliance’s Tech4Families programme indicate that providing devices and connectivity improves attendance, homework completion and parental engagement, though a full evaluation lies beyond this paper’s scope.

4.2 National Databank and Data Poverty Relief

The National Databank is a flagship digital inclusion scheme. It has distributed over 250 000 data packages to people who cannot afford internet access[36], and 89 % of recipients feel more digitally able or safe[36]. Trusts can apply to the Databank to secure SIM cards for pupils on free school meals. Partnerships with telecom operators ensure the scheme is scalable.

4.3 Assistive‑Technology Lending Pilot

The DfE’s assistive‑technology pilot has shown how targeted interventions yield high impact. Lending libraries will be set up in up to 32 local authorities, allowing up to 4 000 schools to borrow devices such as reading pens and dictation tools[27]. In early trials, 86 % of staff saw improved behaviour and 89 % reported increased confidence among SEND pupils[29]. These results not only support inclusion but also free teacher time to focus on instruction[40].

4.4 Community Hubs and Libraries

Community centres and libraries play an increasing role in digital inclusion. Good Things Foundation’s National Digital Inclusion Network spans over 3 500 digital inclusion hubs[36]. These hubs provide device loans, data packages and digital‑skills training, and often host parent workshops. Local examples such as the Rochdale library show how hubs can be integrated into trust strategies.

4.5 ROI and Sustainability

Calculating return on investment (ROI) helps justify spending. Suppose a trust invests £100 000 to purchase 200 laptops (£500 each) and budgets a further £10 per pupil per month for connectivity for 200 pupils where free Databank SIMs are unavailable. If the programme leads to a 3 percentage‑point increase in attendance, two months’ additional progress in reading (valued at roughly £1 000 per pupil in lifetime earnings) and reduced teacher workload (freeing 0.2 FTE per class), the cost per positive outcome is modest. Moreover, refurbished devices and community partnerships can halve hardware costs. While this illustrative example simplifies complex factors, it demonstrates that digital inclusion programmes can deliver high social returns.
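The worked example above reduces to back‑of‑envelope arithmetic. The sketch below uses only the article's illustrative figures; the twelve‑month data horizon is an added assumption, and none of the numbers are real programme data.

```python
# Back-of-envelope cost model for the illustrative programme above.
# All figures are the article's illustrative assumptions, not real data.

pupils = 200
laptop_cost = 500        # £ per new device
data_cost_monthly = 10   # £ per pupil per month where free Databank data is unavailable
months = 12              # assumed programme horizon

hardware = pupils * laptop_cost             # £100,000
data = pupils * data_cost_monthly * months  # £24,000
cost_per_pupil = (hardware + data) / pupils

print(f"Total: £{hardware + data:,} (£{cost_per_pupil:,.0f} per pupil per year)")

# Refurbished devices roughly halve hardware costs:
refurb_total = hardware / 2 + data
print(f"With refurbished devices: £{refurb_total:,.0f}")
```

At roughly £620 per pupil per year under these assumptions, even modest attendance and attainment gains compare favourably with the cost.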

5. Roadmap and KPIs for Trusts

Achieving equity is a multi‑year process. Below is a suggested roadmap aligned with the Equity‑by‑Design framework. Trusts should adapt timelines based on resources and context.

Stage 1 (Quarter 1–2): Audit and Plan

1. Conduct a digital equity audit: survey pupils, parents and staff to assess device access, connectivity, digital skills and AI usage.

2. Map existing resources: identify community hubs, local libraries, corporate partners and funding streams (e.g., pupil premium, corporate donations, local authority grants). Engage with the National Databank and refurbishers.

3. Establish governance: appoint a digital equity lead and form a steering group including IT, safeguarding, SEND and curriculum leads.

4. Set targets: define baseline metrics and ambitious but realistic KPIs for each EBD‑AI dimension.

Stage 2 (Quarter 3–4): Pilot and Build Capacity

1. Launch device‑lending pilots: start with one or two schools; prioritise pupils without devices and SEND pupils. Integrate evaluation from the outset.

2. Deliver AI literacy programmes: run teacher CPD and pupil modules, using UNESCO guidance and Internet Matters resources.

3. Implement safety protocols: deploy filtering and monitoring tools; ensure AI platforms comply with Ofcom codes. Update policies and consent forms.

4. Engage parents: host workshops on AI safety and digital skills. Provide information about social tariffs and the National Databank.

Stage 3 (Year 2): Scale and Integrate

1. Expand lending schemes across the trust, standardising device management and retrieval processes.

2. Integrate AI into the curriculum in line with the EEF’s pedagogical recommendations, focusing on retrieval practice, modelling and feedback.

3. Enhance connectivity: upgrade infrastructure to meet meaningful school connectivity standards and provide remote access solutions for pupils without home broadband.

4. Publish a public dashboard: report progress against KPIs. Use data to identify persistent gaps and reallocate resources.

Stage 4 (Year 3 and Beyond): Evaluate and Advocate

1. Conduct impact evaluations: partner with external researchers or use EEF‑style randomised trials to measure effects on attainment, engagement and wellbeing. Compare outcomes between participants and control groups.

2. Refine and innovate: update AI tools and training to reflect technological advances. Explore emerging models such as open‑source AI or federated learning to reduce vendor lock‑in.

3. Advocate for systemic change: share lessons with policy‑makers; campaign for broadband universal service, device recycling incentives and digital‑skills funding. Collaborate with other trusts and national organisations to influence policy.

6. Conclusion: AI as a Test of Collective Responsibility

AI is not just a technological innovation; it is a moral stress test for educational systems. It exposes the consequences of neglecting digital inclusion and magnifies existing inequities. As UNESCO, OECD and UNICEF warn, ignoring access, safety and pedagogy risks widening the gap[3][4][2]. However, when designed for equity and implemented thoughtfully, AI can amplify human agency, personalise learning and enable all children to thrive.

Trusts have the power—and duty—to ensure that AI benefits every child, not just those born into privilege. By adopting an Equity‑by‑Design model, investing in devices and data, building AI literacy, safeguarding children and measuring impact, we can make AI a tool for social mobility. Nearly half of UK families with children currently fall below the minimum digital living standard[1]; this statistic should galvanise us. The road ahead requires collaboration among educators, policy‑makers, tech companies, parents and communities. Together, we can bridge the digital divide and build an inclusive digital future.


Appendix: Digital Divide Metrics

Below is a horizontal bar chart summarising key digital divide metrics drawn from the Good Things Foundation and Ofcom. Families below the Minimum Digital Living Standard (MDLS), adults lacking basic digital skills, and households struggling to afford mobile contracts or devices illustrate the scale of exclusion. The chart highlights the urgency of interventions in devices, skills and affordability.


[1] Nearly half of UK families excluded from modern digital society, study finds | Digital Britain | The Guardian

https://www.theguardian.com/technology/2024/mar/17/half-uk-families-excluded-modern-digital-society-study

[2] [5] Two thirds of the world’s school-age children have no internet access at home, new UNICEF-ITU report says

https://www.unicef.org/press-releases/two-thirds-worlds-school-age-children-have-no-internet-access-home-new-unicef-itu

[3] Generative AI has disrupted education. Here’s how it can be used for good – UNESCO | World Economic Forum

https://www.weforum.org/stories/2023/09/generative-ai-education-unesco/

[4] [34] Students, digital devices and success (EN)

https://www.oecd.org/content/dam/oecd/en/publications/reports/2024/05/students-digital-devices-and-success_621829ff/9e4c0624-en.pdf

[6] Children's Media Literacy Report 2024 - Ofcom

https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-literacy-report-2024.pdf

[7] [8] [9] [10] [11] Digital Nation | The UK's Digital Divide | Good Things Foundation

https://www.goodthingsfoundation.org/policy-and-research/research-and-evidence/research-2024/digital-nation

[12] Communications Affordability Tracker - Ofcom

https://www.ofcom.org.uk/phones-and-broadband/saving-money/affordability-tracker

[13] lloyds-consumer-digital-index-2023-report.pdf

https://www.ipsos.com/sites/default/files/ct/publication/documents/2023-11/lloyds-consumer-digital-index-2023-report.pdf

[14] [15] [16] [17] [18] [37] Generative AI in education: Kids & parents views | Internet Matters

https://www.internetmatters.org/hub/research/generative-ai-in-education-report/

[19] [20] [21] [22] [23] [24] [25] [38] [39] New rules for a safer generation of children online - Ofcom

https://www.ofcom.org.uk/online-safety/protecting-children/new-rules-for-a-safer-generation-of-children-online

[26] Generative artificial intelligence (AI) in education - GOV.UK

https://www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education

[27] [28] [29] [40] Thousands of children with SEND to benefit from assistive tech - GOV.UK

https://www.gov.uk/government/news/thousands-of-children-with-send-to-benefit-from-assistive-tech

[30] [31] [32] [33] EEF_Digital_Technology_Guidance_Report.pdf

https://d2tic4wvo1iusb.cloudfront.net/production/eef-guidance-reports/digital/EEF_Digital_Technology_Guidance_Report.pdf

[35] Digital Tech (Digitech) Library | Rochdale Borough Council

https://www.rochdale.gov.uk/libraries/digitech-digital-tech-library

[36] What Is The National Databank | Free Mobile Data For Digital Inclusion | Good Things Foundation

https://www.goodthingsfoundation.org/our-services/national-databank

Kostakis Bouzoukas

London, UK