By Ania Mendrek FIEP, Managing Director at Crafting Civic Change

Originally published by Ania Mendrek on LinkedIn

Over the past few years, we’ve seen an explosion of AI-powered tools designed to improve employment services and job coaching. While these innovations sound exciting, many end up feeling like a square peg in a round hole. Why? Because too often the technology comes first, and the service delivery context is considered second. Having had a fair amount of exposure to those various tools, I can’t help but offer my 5 cents…

In reality, the starting point should be the problem you’re trying to solve. Mapping the user journey, pinpointing gaps, and genuinely understanding how frontline staff work are all essential. If those pieces aren’t in place, an AI tool—no matter how sophisticated—won’t integrate smoothly into day-to-day operations. Below are a few approaches to ensure AI complements rather than disrupts your service design.

Embed the tool in the service design

A great AI application isn’t something you bolt on at the end. It should be woven seamlessly into the service journey. Start by identifying pain points (for example, time-consuming admin tasks) and see how an AI solution might solve these specific issues. If there’s no clear place where the tool adds value, it won’t stick.

Involve frontline staff

New technology can intimidate the people who deliver the service. They might worry about workload changes, bias in the tool, or even job security. By involving them early in the design and testing stages, you’ll gain practical insights and ease fears. When staff see the tool making their jobs easier—freeing them up for more meaningful face-to-face support—they’re far more likely to embrace it.

Demonstrate clear value

It’s not enough to say an AI tool is cutting-edge or more efficient; organisations need proof that it delivers results. Maybe it can reduce admin time by 50% or boost user satisfaction through better personalisation. If you can back up these claims with data, it’s easier to build a business case, justify costs, or secure a new budget line.

Address bias and inclusivity

Frontline staff rightly worry about bias, especially when you consider diverse groups such as neurodiverse clients or those who face additional barriers. Be prepared to test and audit the tool rigorously. Don’t just rely on the vendor’s claims—check how the tool performs with different demographics. Ensure human oversight remains part of the process and be transparent about how you’re addressing any inequalities that arise.

Scale gradually and evaluate ⚖️

Rather than rolling out AI across the entire service overnight, pilot it with a smaller group of staff and service users. Gather feedback, measure impact, and refine the tool as you go. Not only does this reduce risk, but it also generates valuable insights that can help shape wider implementation.

Align funding structures

Tie cost to measurable outcomes, such as efficiency gains, more quality time for frontline staff to support their job seekers, better user engagement, or improved quality of job placements, so it’s easier to justify the investment.

Engage commissioners: ensuring alignment and reassurance

It’s also important to bring commissioners—those who hold the purse strings and set strategic priorities—into the conversation early on. They need confidence that any AI tool will enhance, rather than replace, the human support that underpins successful employment outcomes. In many cases, commissioners want evidence that the tech delivers clear benefits to citizens while still meeting requirements around inclusivity and safeguarding. Engaging them from the start also means you can align the technology’s capabilities with funding criteria and performance targets, ensuring a smoother path to adoption.

Final thoughts

AI has real potential in employment services. It can streamline repetitive tasks, provide personalised recommendations, and free up staff to spend quality time with job seekers. But it should never be dropped in without proper context. Start with the service design, involve the people on the ground, address concerns about bias, and make sure the financials stack up. AI is best seen as an enabler, not a standalone fix for disconnected processes. When introduced thoughtfully, it can genuinely enhance how we support job seekers and help more people into meaningful employment.
