By Ania Mendrek FIEP | Managing Director | Crafting Civic Change
Digital transformation is reshaping public employment services worldwide. Automated tools and AI-driven platforms promise greater reach and efficiency, yet most are built for a “generic” user, ignoring the differentiated needs of real communities.
As Microsoft’s Inclusive101 Guidebook (2020) warns, designing for the average risks missing those on the margins. Soares Guedes & Szlavi (2023) highlight that a lack of diversity in tech design and decision making often results in unintentional bias, blind spots and the exclusion of different perspectives, increasing the likelihood that digital services will fail to serve everyone equally.
Digital exclusion is now recognised as a defining divide of the twenty-first century. Despite the ubiquity of digital tools, both access and ability to benefit remain deeply unequal, shaped by gender, migration status, education, income, disability, and geography.
As the Equitable AI Framework (Studio INTO, 2024) makes clear, “digital transformation is never neutral.” Without deliberate inclusion, new technologies can reinforce and even amplify existing forms of bias and exclusion, especially when those most at risk are not involved in design.
Closing the gap requires rethinking how digital tools are designed, governed and evaluated for their real-world impact, making digital inclusion central to achieving gender equality (UN SDG 5).
Figure 1: Who we design for (if we use “self-as-user”) and who we exclude
Despite promises of universal access, digital employment services can reinforce or widen inequalities, especially for women, single parents, migrants, and carers.
Recent research from the Nordic region finds that immigrant women are especially at risk of “double jeopardy,” experiencing both digital and social exclusion (Nordregio, 2024).
Barriers range from lack of devices and connectivity to low digital literacy, but also include trust, confidence, and cultural adaptation.
CASE STUDY: DELIBERATE OUTREACH AND DIGITAL INCLUSION – LESSONS FROM THE NORDICS AND LONDON
Promoting digital inclusion for migrant women
Across the Nordics, digital strategies increasingly target the exclusion risks faced by immigrant women.
Countries like Norway and Sweden now make immigrant women a priority in digital policy, moving from rhetoric to action: funding targeted services, mapping support, and involving users in design.
What’s working
- Tailored support: Childcare and multilingual digital skills courses lower participation barriers.
- User involvement: Women and frontline workers shape programmes from the start.
- NGO leadership: NGOs play a pivotal role by co-designing digital literacy courses with immigrant women, fostering a sense of ownership and relevance. Denmark’s “digital ambassador” model goes further, training local women to become digital guides within their own communities, creating a ripple effect that empowers women and extends the reach of inclusion efforts.
- Safe and supportive environment: Gender-specific support groups foster a sense of security and openness, enabling women to share their experiences, overcome challenges, and learn IT skills in a setting where they feel understood and supported, and can progress at their own pace.
There’s ongoing debate over targeted versus universal approaches, but the consensus is that truly inclusive services must go beyond “digital by default” by prioritising co-creation, multilingual access, and multiple channels.
Belina Grow: empowering women through digital and human-centred support
Many economically inactive women face overlapping barriers to work, from childcare and low confidence to digital exclusion. For those under Universal Credit work requirements, pressure to secure a job often comes without the tailored support needed to make it sustainable.
Belina Grow blends trusted human outreach with accessible digital tools to build skills, confidence, and connection:
- Seamless onboarding: A third-party app (Earlybird.AI) guides women through registration, form-filling, and note-taking during meetings, with staff support at every step.
- The Grow App: A private, Facebook-style community exclusive to participants, where they can view training schedules, join forums, message peers, and connect with the Belina Grow team.
- Flexible access: With smartphones and internet access, participants can attend webinars, courses, and coaching sessions from home, fitting learning around childcare and other commitments.
- Peer learning & support: The Grow App fosters community, reducing isolation and enabling women to learn from one another.
- AI for employability: Practical training on tools like ChatGPT helps participants tailor CVs, prepare for interviews, and explore job options – alongside guidance on safe and ethical use.
- Human-led engagement: Using AI tools and digital delivery frees time for trust-building, tailored guidance, and outreach.
Belina Grow’s model reflects global best practice: it supports often-marginalised and excluded women to access opportunities and overcome barriers to work; applies AI “for humans, not instead of humans” to enhance rather than replace personal support; removes access barriers in ways that align with modern workforce expectations; and maintains the agility needed to adapt quickly in a constantly shifting labour market.
Digitalisation of public employment services is far more than a technical upgrade; it fundamentally reshapes how support, unemployment, and activation policies are delivered. Unless these digital services are intentionally designed with gender and intersectionality in mind, they risk reinforcing or deepening existing inequalities. As the World Bank puts it, the question is not whether digitalisation creates new risks, but whether we are willing to design out exclusion through intentional, user-centred approaches (World Bank, 2022).
THE DIGITAL GENDER DIVIDE
Women and girls face persistent, layered barriers to digital participation worldwide, including in high-income countries. Globally, women are 20% less likely than men to use mobile internet (UNICEF, 2023; ITU, 2022). In the EU, women lag behind men in advanced digital skills (European Parliament, 2018). Barriers extend beyond poverty or connectivity: social roles, care work, migration status, language, and safety all matter. Many women share devices, have privacy concerns, or encounter poorly designed interfaces. Migrant and minority women and people with disabilities are at particular risk if their needs are not addressed (Soares Guedes & Szlavi, 2023; Microsoft, 2020). Studies show digital literacy training is essential; without it, full digitalisation reforms can worsen, not narrow, inequalities.
The stakes are high: as digital literacy becomes a core employability skill, in a world where more than 90% of jobs have a digital component (UNICEF, 2023), women risk being excluded from opportunities in work, learning, and participation, reinforcing old patterns of disadvantage in new, digital forms. Safety and trust are also critical. Women are disproportionately affected by online harassment, abuse, and threats – deterring many from digital public services. Safeguarding, privacy, and safety-by-design must be core requirements, not afterthoughts. As often noted, “digital transformation moves at the speed of trust.”
Leaving these gaps unaddressed risks reinforcing, rather than redressing, longstanding inequalities, resulting in higher drop-out rates, lower engagement, and missed opportunities for those most in need. At a societal level, it risks hard-wiring bias into the very systems meant to expand opportunity. As the World Bank (2024) puts it: “Gender equality for all people is a matter of fairness and justice. It is also essential for development. Growing evidence shows how removing gender barriers unlocks economic productivity, reduces poverty, deepens social cohesion, and enhances well-being and prosperity for current and future generations.”
WHEN SERVICES GO DIGITAL: WHAT GETS MISSED
Much attention in policy and media has focused on the digital skills gap, often framing it as a matter of getting more women into tech roles or improving digital confidence. While important, such framing risks missing the more complex, intersecting barriers that millions of women face, such as caring responsibilities, single parenthood, poverty, or the language challenges migrants encounter. These realities shape not only access but also how women can engage with digital platforms. Many digital services unintentionally penalise women through inflexible forms, appointment systems that don’t accommodate care duties, or AI tools that assume linear, uninterrupted careers and overlook skills gained in non-traditional settings.
Blind Spots in Sector Reports and Practice
Recent major reports on AI in employment support, such as Employment Support Services in the AI Era: Insights, Lessons and Advice from Leaders in the Sector (Earlybird, 2024) and AI in Employability: Opportunities, Challenges and the Road Ahead (ERSA & Hudson & Hayes, 2025), express significant optimism about the potential of AI to make employment services more efficient, personalised, and scalable. They highlight gains such as freeing staff for more meaningful engagement and using digital tools to reduce friction for jobseekers: important advances in efficiency, productivity, and reach.
However, reflecting a broader pattern in the sector-wide literature, both reports are largely silent on gender and intersectionality, with little substantive analysis of how AI-powered employment services might systematically disadvantage women or other marginalised groups. This is not necessarily because these issues are dismissed; it may reflect that gender-specific considerations are not yet systematically embedded in commissioning briefs, programme mandates, or evaluation frameworks, which means they are less likely to surface during evidence gathering or stakeholder consultation. As a result, critical issues such as safeguarding, online harassment, and privacy, which are particularly relevant for women, rarely feature in sector-level discussions, programme design, and analysis.
Neither report, nor much of the wider literature, goes into depth on how AI in employment services might inadvertently amplify gender bias, or on measures such as collecting gender-disaggregated data, user testing with diverse women, or evaluating gendered outcomes. Inclusion does feature in the reports, but it is often framed in terms of general technical solutions, such as language translation or basic accessibility, rather than a deeper engagement with the overlapping realities of gender, migration, care responsibilities, class, and digital confidence. Without explicitly embedding these dimensions in the design and evaluation of AI-driven tools across the sector, there is a risk of obscuring the nuanced ways in which innovation can unintentionally reinforce existing exclusions, especially for those already at the margins, whose barriers are shaped not only by digital access and skills but also by life-course patterns and social context.

Exclusion in digital employment services isn’t always about lack of access or skills alone. It also results from a lack of user voice in design, “one-size-fits-all” digital tools that miss non-standard life courses, and systems built around what is easy to measure rather than what is actually needed, like wraparound support, affordable childcare, or tailored reskilling for women returners and other vulnerable groups.
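The gender-disaggregated monitoring described above need not be complicated. A minimal sketch of what it might look like in practice (the field names, records, and threshold for flagging a gap are invented for illustration, not drawn from any real programme):

```python
# Illustrative sketch of gender-disaggregated outcome monitoring.
# Field names and records are invented; a real system would draw on
# programme management data and a richer set of outcome measures.
from collections import defaultdict

def disaggregated_outcomes(records):
    """Compute job-entry rates per gender group from participant records."""
    totals = defaultdict(int)
    successes = defaultdict(int)
    for r in records:
        totals[r["gender"]] += 1
        successes[r["gender"]] += r["entered_employment"]
    return {g: round(successes[g] / totals[g], 2) for g in totals}

participants = [
    {"gender": "female", "entered_employment": 1},
    {"gender": "female", "entered_employment": 0},
    {"gender": "female", "entered_employment": 0},
    {"gender": "male", "entered_employment": 1},
    {"gender": "male", "entered_employment": 1},
]

rates = disaggregated_outcomes(participants)
print(rates)  # {'female': 0.33, 'male': 1.0}
```

The point is not the arithmetic but the practice: once outcomes are routinely broken down by group, a persistent gap becomes visible and can prompt questions about the design choices producing it.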
CASE STUDY: DOUBLE EXCLUSION IN IRELAND’S DIGITAL PUBLIC EMPLOYMENT SERVICES
Drawing on the work of Antoinette Kelly and Zach Roche (in Digital Public Employment Services in Action), this case study illustrates how digital transformation can unintentionally deepen exclusion for those facing the highest barriers to employment, often women, migrants, carers, and people living with disabilities.
The research shows how Ireland’s shift to “digital by default” public employment services, intended to boost efficiency, made support less accessible for long-term unemployed jobseekers. Many struggled with digital tasks, lacked internet, and faced impersonal systems. Vulnerable groups were most likely to be excluded, with caseworkers unable to provide tailored support. Screen-level bureaucracy replaced street-level discretion.
While digitalisation improved efficiency, it did so at the expense of equity, leaving skilled caseworkers underused and many clients unsupported. Kelly and Roche conclude that to avoid such outcomes, hybrid approaches, targeted in-person help, and systems designed for complex needs are critical.
CASE STUDY: AUSTRIA – WHEN DIGITAL INNOVATION REINFORCES GENDER BIAS
Austria’s public employment service (AMS) has faced repeated criticism for how digital tools reinforce, rather than reduce, gender inequality.
Paola Lopez’s research (How Fair is the AMS Algorithm?, ITA DOSSIER NO. 52EN, 2021) and a subsequent fairness audit found that the AMS algorithm (introduced in 2018) systematically penalised jobseekers with fragmented or interrupted work histories, assigning nearly 29% of these candidates in Vienna to the lowest support group. Being female could trigger an automatic point deduction, and factors like age and nationality further compounded exclusion risks. The system misclassified around 30% of jobseekers, sometimes placing women in the “high employability” group, which paradoxically reduced their support. Attempts to “de-bias” the algorithm achieved only minor improvements, often at the expense of overall performance.
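The mechanism Lopez describes can be sketched as a deliberately simplified, hypothetical points-based classifier. The weights, thresholds, and feature names below are invented for illustration; they are not the actual AMS coefficients, which were never fully published:

```python
# Hypothetical sketch of a points-based employability classifier.
# All weights and cut-offs are invented, NOT the real AMS model.

def employability_score(profile: dict) -> float:
    """Sum weighted features into a single employability score."""
    weights = {
        "years_employed": 0.10,     # continuous work history raises the score
        "career_gap_years": -0.15,  # interruptions (e.g. care work) lower it
        "is_female": -0.14,         # a flat deduction like the one criticised
        "age_over_50": -0.20,
    }
    return sum(weights[k] * profile.get(k, 0) for k in weights)

def support_group(score: float) -> str:
    """Bucket jobseekers into support tiers using fixed cut-offs."""
    if score >= 0.5:
        return "high"    # judged most employable; may receive less support
    if score >= 0.0:
        return "medium"
    return "low"         # the lowest support group

# Two identical work histories, differing only in recorded sex:
man = {"years_employed": 6, "career_gap_years": 4, "is_female": 0}
woman = {"years_employed": 6, "career_gap_years": 4, "is_female": 1}
print(support_group(employability_score(man)))    # medium
print(support_group(employability_score(woman)))  # low
```

Under these invented weights, two otherwise identical histories fall into different support tiers because a flat deduction pushes one just below a cut-off. That structural pattern, not these particular numbers, is what the fairness critique targets: small encoded penalties on protected attributes compound with career-gap penalties that already fall disproportionately on women.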
In 2024, AMS’s ChatGPT-based AI career chatbot, “Berufsinfomat,” became infamous for gender bias, suggesting IT jobs to men and Gender Studies to women, despite attempts to instruct it otherwise. The chatbot also hallucinated answers and shifted privacy risks onto users. The rushed launch of the AMS chatbot reinforced gender stereotypes, lacked transparency, and highlighted the dangers of deploying advanced digital tools in public services without robust oversight and safeguards. Experts noted the opacity of language models and emphasised public agencies’ accountability for the systems they deploy. (netzpolitik.org 2024).
Both cases highlight the risks of deploying digital tools without robust oversight, user involvement, or safeguards.
THE REAL-WORLD CONSEQUENCES OF DIGITAL OVERSIGHT
Despite their promise, digital and AI-driven employment services frequently overlook the realities of women and marginalised groups. Too often, these systems are designed for the “average user,” missing the value of care work, portfolio careers, and non-traditional employment histories that are especially common among women, migrants, and carers.
Barriers are intensified for those facing multiple disadvantages, such as poverty, migration, disability, and low digital confidence, that simple data points cannot capture.
Algorithmic bias is a persistent risk: screening tools and eligibility systems penalise non-linear careers, part-time work, or care-related gaps, while opaque decision-making leaves individuals unable to contest outcomes. Safety and privacy concerns, especially acute for women and other vulnerable groups, are rarely central in system design. Instead, inclusion is too often reduced to technical fixes, rather than genuine engagement with intersectional realities.
The cost is substantial. Digital transformation, if not deliberately inclusive, risks embedding and amplifying the very inequalities it aims to solve.
As the World Bank warns, “technological innovation tends to amplify rather than reduce existing forms of exclusion and bias” unless it is proactively designed and evaluated for equity impacts. If we fail to ask who gets missed, we risk hard-wiring exclusion into our future public services.
CONCLUSION AND RECOMMENDATIONS
Digital transformation in public employment services will only fulfil its potential if inclusion is intentional and continuous at every stage. This requires:
- Embedding gender and intersectional analysis in all stages of system design, testing, and evaluation.
- Co-creating solutions with diverse users, ensuring that the voices of women, migrants, carers, and those with disabilities inform development.
- Prioritising user-centred design and multiple access channels, not just digital-by-default pathways.
- Collecting and using gender-disaggregated data to monitor outcomes and guide improvements.
- Building in privacy, safeguarding, and anti-harassment features from the outset, not as optional extras, but as requirements for equity and trust.
- Maintaining opportunities for in-person, hybrid, or tailored support for those unable to access or benefit from digital systems alone.
Only by committing to these principles can digital public employment services expand opportunity, rather than entrench inequality, for all.
REFERENCES
Earlybird. (2024). Employment Support Services in the AI Era: Insights, Lessons and Advice from Leaders in the Sector. [Report].
ERSA, Hudson, & Hayes. (2025). AI in Employability: Opportunities, Challenges and the Road Ahead.
European Parliament. (2018). Women in the Digital Age. Directorate-General for Internal Policies.
GLMLIC (Gender, Labour Markets and Poverty in LICs programme). (2023). Smartphone Access and Women’s Digital Inclusion in India: Results from a Large-Scale Field Study.
ILO (International Labour Organization). (2025). Technical Guidance Note on Digital Public Employment Services.
ITU (International Telecommunication Union). (2022). Facts and Figures 2022: Measuring Digital Development.
Kelly, A., & Roche, Z. (2023). “Digitising Exclusion: The Challenges of Modern Unemployment and Public Employment Service Delivery.” In: Digital Public Employment Services in Action.
Lopez, P. (2021). How Fair is the AMS Algorithm? ITA DOSSIER NO. 52EN, January 2021. Vienna: Institute of Technology Assessment.
Microsoft. (2020). Inclusive101 Guidebook.
Nordregio (Nordic Welfare Centre & Nordregio). (2024). Digital Inclusion of Immigrant Women in the Nordic Countries.
Soares Guedes, A. & Szlavi, A. (2023). “Gender Inclusive Design in Technology.” In: Proceedings of [conference or publication details—insert if available].
Studio INTO. (2024). Equitable AI Framework.
UNICEF. (2023). Bridging the Digital Divide: Girls’ and Young Women’s Digital Inclusion.
World Bank. (2022). Are You Willing to Design Out Exclusion? Digital Inclusion and Public Service Delivery. [Blog or report—clarify source if needed].
World Bank. (2024). Gender Strategy 2024–2030: Accelerate Gender Equality for a Sustainable, Resilient, and Inclusive Future.
ABOUT THE AUTHOR
ANIA MENDREK FIEP | Managing Director | Crafting Civic Change
Ania Mendrek is an international labour market and policy development expert with over 19 years’ experience advising governments, international institutions (ILO, ADB, European Commission, World Bank) and public, private, and third sector organisations across Europe, MENA, and Asia.
She specialises in designing inclusive, gender-responsive employment policies and services, amplifying the voices of vulnerable jobseekers, and aligning programmes with policy priorities and emerging technologies.
Her work bridges strategy and practice, from large-scale reforms to hands-on service design, with a particular interest in the intersection of gender, technology, and employability.