Artificial Intelligence (AI) is now embedded in decision-making across education, healthcare, agriculture, and finance. While much of the innovation comes from high-income countries, these systems are being exported—through products, platforms, or policies—to the Global South. This raises a critical question: What does it mean to design AI systems that are truly inclusive in contexts with limited data, uneven digital access, and different social norms?
This article explores the principles of inclusive AI design in emerging markets, highlights common pitfalls, and presents practical approaches that show promise.
Principles of Inclusive AI Design

1. Contextual Relevance
AI systems must reflect the environments in which they operate. What works in New York or London cannot be assumed to work in Nairobi or Dhaka. Contextual relevance means designing tools grounded in local languages, infrastructure, and user behaviour.
For example, Google introduced a voice assistant in Hindi and regional languages for its payment app Tez (now Google Pay) to reach non-English speakers (Google India Blog, 2018). Similarly, Digital Green uses videos in local dialects and cultural contexts to make agricultural advice more engaging for rural farmers (Digital Green, 2021).
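To illustrate, the sketch below shows one simple way a product might fall back through a user's preferred languages before defaulting to English. The language codes and the supported set are illustrative assumptions, not a description of Google's implementation.

```python
# A minimal sketch of a locale fallback chain for a voice or chat interface,
# so users are served in a supported local language before defaulting to English.
# The language codes and the SUPPORTED set are illustrative assumptions.
SUPPORTED = {"hi", "bn", "sw", "en"}   # Hindi, Bengali, Swahili, English

def pick_language(user_preferences: list[str], default: str = "en") -> str:
    """Return the first language the user prefers that the product supports."""
    for lang in user_preferences:
        if lang in SUPPORTED:
            return lang
    return default

# Example: a phone whose language list is Bhojpuri, then Hindi, then English.
print(pick_language(["bho", "hi", "en"]))   # -> "hi"
```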
2. Participatory Development
Too often, AI systems are built in silos by technical experts who lack insight into end-users’ lives. Participatory development means co-designing solutions with communities, not just for them. This approach builds trust, improves relevance, and often uncovers needs that outsiders might miss.
Organisations like IDinsight and Nesta have tested participatory machine learning, where communities help shape data, model objectives, and even interpret outputs (IDinsight, 2022; Nesta, 2020). This is especially important in public services, where power imbalances can easily deepen marginalisation.
3. Equity by Design
Inclusion should be embedded from the start, not added later. This means setting equity goals early, such as ensuring fair outcomes across groups or building fairness constraints into models.
A case in point is Kenya’s digital lending platforms. Many credit scoring models penalised users who shared devices—a common practice in low-income households—because the system misinterpreted usage patterns (CGAP, 2020). A fairer design would factor in such realities at the training stage.
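As a rough illustration, the sketch below shows what setting such an equity goal up front might look like in practice: a credit-scoring model is checked for gaps in predicted approval rates across groups before deployment. The dataset, column names (shared_device, household_group, approved), and the 5-percentage-point threshold are hypothetical, not drawn from any particular lender.

```python
# A minimal pre-deployment fairness check for a credit-scoring model.
# Column names (shared_device, household_group, approved) are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("loan_applications.csv")          # hypothetical dataset
features = ["income", "repayment_history", "shared_device"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["approved"], test_size=0.3, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

audit = X_test.copy()
audit["predicted_approval"] = model.predict(X_test)
audit["group"] = df.loc[X_test.index, "household_group"]

# Equity goal set up front: predicted approval rates across groups should
# not differ by more than 5 percentage points.
rates = audit.groupby("group")["predicted_approval"].mean()
gap = rates.max() - rates.min()
print(rates)
if gap > 0.05:
    print(f"Approval-rate gap of {gap:.1%} exceeds target; revisit features "
          "such as shared_device before deployment.")
```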
Common Pitfalls

1. Biased Datasets
AI models often rely on datasets from the Global North, which fail to capture social, economic, and linguistic diversity in other regions. This bakes bias into the model itself.
Facial recognition is a clear example. The Gender Shades study by Buolamwini and Gebru (2018) found that commercial systems had error rates of up to 34% for dark-skinned women, compared with less than 1% for light-skinned men.
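Disaggregated evaluation of the kind Gender Shades performed is straightforward to build into a routine model audit. The sketch below reports error rates for each subgroup rather than a single aggregate figure; the file and column names are hypothetical, not the Gender Shades data itself.

```python
# Disaggregated evaluation: report error rates per subgroup rather than a
# single aggregate figure. Column names (gender, skin_tone, y_true, y_pred)
# are illustrative.
import pandas as pd

results = pd.read_csv("classifier_predictions.csv")   # hypothetical audit file
results["error"] = (results["y_true"] != results["y_pred"]).astype(int)

# Aggregate accuracy can hide large gaps between subgroups.
print("Overall error rate:", results["error"].mean())

# Error rate for each intersection of gender and skin tone.
by_group = results.groupby(["gender", "skin_tone"])["error"].agg(["mean", "count"])
print(by_group.sort_values("mean", ascending=False))
```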
2. Opaque Black-Box Models
Complex, opaque models in healthcare, finance, or criminal justice pose ethical risks. In the Global South, weak regulation and low digital literacy magnify these risks. Without transparency, people harmed by automated decisions often have no way to seek recourse.
Tools such as Model Cards and Explainable AI (XAI) can improve accountability, but they are still underused in emerging markets (Mitchell et al., 2019).
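As a rough sketch, the example below records the kind of information a model card captures, loosely following Mitchell et al. (2019). The fields and values are simplified illustrations and are not the official Model Card Toolkit schema.

```python
# A simplified sketch of the information a model card records, loosely
# following Mitchell et al. (2019). Field names and values are illustrative.
from dataclasses import asdict, dataclass, field
import json

@dataclass
class ModelCard:
    model_name: str
    intended_use: str
    out_of_scope_uses: list = field(default_factory=list)
    training_data: str = ""
    evaluation_data: str = ""
    disaggregated_metrics: dict = field(default_factory=dict)  # per-group results
    limitations: str = ""

card = ModelCard(
    model_name="crop-advice-ranker-v2",
    intended_use="Ranking agricultural advisory videos for smallholder farmers",
    out_of_scope_uses=["credit decisions", "land valuation"],
    training_data="Advisory interactions from three districts, 2022-2023",
    evaluation_data="Held-out interactions from two additional districts",
    disaggregated_metrics={"female farmers": 0.81, "male farmers": 0.84},
    limitations="Not evaluated on dialects outside the training districts.",
)

print(json.dumps(asdict(card), indent=2, ensure_ascii=False))
```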
3. Technology ≠ Opportunity
Technology alone does not create opportunity. Access to a device does not guarantee meaningful outcomes if barriers like literacy, affordability, or gender norms remain unaddressed.
In rural Bangladesh, women were found to be 33% less likely than men to own a mobile phone, limiting their access to digital health or financial services (GSMA, 2021). Availability does not always equal agency.
For AI to truly serve the Global South, it must be purpose-built rather than retrofitted from Western contexts. Every step—from data collection to evaluation—should reflect local realities, needs, and rights.
Inclusive AI is not only a moral imperative but also a practical one. Systems that ignore context or exacerbate inequality are bound to fail. But with care, humility, and collaboration, AI can drive inclusive growth—not just technological progress.
This article was authored by Parisa Omar, Business Consultant. For further clarification, contact: [email protected].