AI in Education: An ally for inclusion, not a driver of division

“The lesson is the same, but the learners find it difficult.”

This single phrase from UNESCO’s AI in Education course has stayed with me ever since. It captures a quiet truth that many educators in low-income or fragile contexts know too well: access to learning isn’t only about technology; it’s about designing for inclusion. True inclusion means designing for support, not uniformity.

As I went through UNESCO’s Digital Empowerment for Educators in the Age of AI course, several phrases resonated deeply with me. Each one sparked a moment of reflection on my work in education, particularly in low-income and fragile contexts. I decided to build this blog around those phrases: the ones that made me pause, question, and imagine what an inclusive, AI-integrated future for education could look like.

From Digital Divide to Digital Empowerment

For educators working in resource-limited settings, the digital divide isn’t just about connectivity or devices. It’s also about confidence, context, and control. A digitally literate educator can use tools. A digitally empowered educator can question, adapt, and design meaningful learning experiences with those tools. This shift matters. In classrooms where technology arrives faster than training, empowerment is what turns tools into transformation. AI can bridge divides, but only if we use it to support local realities, multilingual contexts, and teacher autonomy rather than to impose pre-designed systems.

Learning from the “History of AI”

The course took us through the evolution of AI from early rule-based systems to today’s generative and adaptive models. Every leap in technology has redefined what counts as “knowledge.” The key question isn’t “When will AI replace teachers?” but “When will AI challenge us to rethink what it means to teach and to learn?”

In low-income contexts, teachers are often more than content deliverers: they are counselors, mediators, peacebuilders, and community connectors. AI, when thoughtfully integrated, can work alongside educators to deepen empathy and understanding, rather than merely making learning processes faster.

Bridging the Design Gap: Designing for Support

“The lesson is the same, but the learners find it difficult.”

This line, again, summarizes what many of us witness daily: a design gap.
Curricula and platforms often assume uniform learners, but the lived realities of students, especially those in under-resourced or multilingual classrooms, are anything but uniform.

True inclusion begins when we design for support. For instance:

  • Adaptive pacing for different attention spans.
  • Voice-based content for low-literacy learners.
  • Multilingual feedback that respects local languages.
  • Emotionally intelligent prompts that build socio-emotional skills.

In Aflatoun programmes, these design principles could translate into using AI-driven storytelling to simulate real-life social and emotional situations, or reflective journaling tools that help learners build empathy and self-awareness. When AI becomes part of this design ecosystem, it doesn’t standardize learning; it personalizes it with care.

From Deficit to Asset-Based Models

Education is gradually moving away from deficit-based perspectives, those that focus on what learners lack, toward asset-based approaches that build on what learners already bring.

Frameworks like asset-based needs assessment and strength-based pedagogy remind us that learners are not empty vessels but holders of community knowledge, resilience, and creativity.

AI, when designed responsibly, can amplify this shift. Learning analytics can move beyond tracking completion rates to identifying patterns of strength, curiosity, and collaboration. They can highlight how learners support one another, what contexts motivate them, and where creativity naturally emerges.

The real question is whether we use analytics to support learning, control learning, or co-create responsible learning designs. As educators, our choice must remain with the first and third: empowering learners to see their own progress, challenge bias, and appreciate the value of their contributions.

From Learning Management Systems to Learning Experience Platforms

Another powerful distinction the course made was between Learning Management Systems (LMS) and Learning Experience Platforms (LXP).

An LMS, like a classroom, focuses on organizing content and tracking progress. An LXP, like a mentor, curates learning journeys that are personalized, dynamic, and co-owned by the learner.

For educators, this reframe is crucial. Instead of simply “rolling out” content, we can design learning ecosystems that blend AI with community learning spaces where learners explore peace, financial literacy, or entrepreneurship through reflection, storytelling, and collaboration.

Imagine youth in Palestine or Honduras connecting through an AI-curated platform that pairs them with peers from other regions working on similar social impact projects. That’s the potential of moving from management to meaningful experience.

Responsible AI: Asking the Hard Questions

The course constantly reminded us to ask:

“How am I using AI, and who is excluded from it?”

In many of our partner schools, internet access is intermittent, devices are shared, and digital confidence varies. This reality calls for low-tech AI solutions: voice-based chatbots in local languages, offline tools for teachers, or micro-learning modules accessible via SMS.

Responsible AI is not just about data privacy; it’s about ethical inclusion. It’s about ensuring that technological innovation does not deepen inequities but rather expands participation.

AI as an ally, not a driver

AI should never replace the teacher’s human touch, empathy, or contextual understanding. When AI drives learning, it risks erasing nuance and lived experience. When AI allies with educators, it enhances reflection, dialogue, and creativity.

In my work with Aflatoun, this allyship translates into exploring how AI can enrich programmes across peace, social, and financial education, from generating reflective exercises to curating context-sensitive learning journeys. The goal isn’t automation; it’s the amplification of humanity through technology. So instead of asking “How can AI do this for us?” we ask “How can AI help us do this better, together, and more meaningfully?”

AI as a catalyst for reflection and growth

For me, this course wasn’t just about tools; it was about rethinking our relationship with learning itself. AI can serve as a mirror that helps educators see their own biases, experiment with new pedagogies, and reflect on learner diversity. It can help us document growth, not just in test scores, but in empathy, collaboration, and agency. As I integrate these insights into Aflatoun’s programmes, I see exciting possibilities:

  • AI-driven reflection prompts for young people exploring peace and identity.
  • Digital storytelling tools that let youth recreate local peace narratives.
  • Learning analytics dashboards that balance academic and socio-emotional growth.
  • AI-enhanced capacity-building pathways that prepare learners for employability, entrepreneurship, and lifelong learning in an evolving world.

AI, when guided by human values, becomes a catalyst for meaningful reflection and transformation for learners, educators, and systems alike. AI will never teach empathy, but it can help us create spaces where empathy grows. As educators working across diverse and low-income contexts, our task is not to resist AI but to redefine it. As I continue to explore AI-integrated pedagogy in my work, I hold myself accountable to a few guiding principles: inclusivity and equity, cultural sensitivity and non-bias, and transparency that keeps learner needs at the center. By grounding our use of AI in these principles, we can shape an educational future that is not only intelligent but also ethical, compassionate, and guided by human values.

Written by P.R Sreelakshmi – Education Specialist at Aflatoun International