
Reviewing the STEP UP Evaluation Report: Learning from Failure

10 April 2019
Posted by: Heather Ette


The Learning and Work Institute has produced an evaluation of the Step Up project, which trialled ‘new approaches supporting low-paid workers to progress their careers’. https://www.learningandwork.org.uk/resource/step-up-trialling-new-approaches-supporting-low-paid-workers-to-progress-their-careers/

Unusually, the evaluation is thorough and honest about the failures of the project, reporting that it ‘was not able to show a significant positive impact…compared to what would have happened in its absence’. It also reports that ‘the costs outweighed the fiscal benefits’. Using a cost-benefit analysis (one that does not take account of substitution in the labour market), the evaluators report that a return on investment of 2.48 could be achieved, but only if weekly earnings rose by £4.66 per person for 40% of participants. The actual recorded average increase was £1.01 per week.
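
To put those figures side by side, here is a minimal illustrative sketch (not taken from the report) comparing the average weekly earnings gain the cost-benefit model would require with the gain actually recorded. The reading that the £1.01 average applies across all participants is an assumption made only to keep the arithmetic simple.

```python
# Illustrative arithmetic only, using the figures quoted from the Step Up
# evaluation above. The reading that the £1.01 average applies across all
# participants is an assumption, not something stated explicitly in the report.

required_weekly_gain = 4.66    # £/week rise the model requires per gaining participant
required_share = 0.40          # share of participants the model requires to achieve that rise
actual_avg_weekly_gain = 1.01  # £/week average increase actually recorded

# Average weekly gain across the whole cohort implied by the model's requirement
required_avg_across_cohort = required_weekly_gain * required_share   # ~ £1.86

shortfall = actual_avg_weekly_gain / required_avg_across_cohort       # ~ 0.54

print(f"Required average gain across cohort: £{required_avg_across_cohort:.2f}/week")
print(f"Recorded average gain:               £{actual_avg_weekly_gain:.2f}/week")
print(f"Recorded gain as share of required:  {shortfall:.0%}")
```

On that reading, the recorded gain is roughly half of what the model would require, which underlines the report’s conclusion that the costs outweighed the fiscal benefits.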

In 2006, the evaluation of the Job Retention and Rehabilitation Pilots likewise reported that ‘…the interventions did not have an impact on any of the three treatment groups.’ That earlier study used a more robust evaluation method (randomised controlled trials), but the L&W Step Up report is also methodologically sound.

We can learn as much from failure as from success, if we recognise the importance of these and other findings. They tell us that employment counselling and return-to-work support are not pre-destined to succeed. By contrast, there is good-quality evidence about employment programmes that have succeeded, and from these we can draw inferences about the ‘active ingredients’ of successful projects.

But let’s ask why these failures could happen.

First, and reported in both cases, the circumstances of the beneficiaries/customers are complicated, with many barriers to work. Though universally acknowledged, that is no reason to give up. We are here to help those who need it. Conclusion: must do better!

Then comes the question of whether opportunities actually exist. In job retention, they certainly do. In promotion or advancement, it is not so clear, though it is reasonable to assume that some people – not all – will achieve advancement and higher pay. Why not all? Because these jobs are fewer, demand more, and competition for them is higher. That is not to say they don’t exist, nor that they always bring great rewards.

What about people who work longer hours, change jobs or do more than one job? This doesn’t involve the same degree of competition, though personal barriers may still be important.

In either case, the resolution of personal issues and the ability to compete for limited work opportunities are important factors. And these are no different in principle from the factors that affect the transition from unemployment to work. We ought, if we are any good, to be able to help.

Do employers co-operate? Some do, some less so. There’s no escaping that they are part of the solution, and only sometimes part of the problem. Having the ability, time and resources to work with them as partners is going to be valuable; without them, we cannot be as successful as we would wish.

On this view, and making the reasonable assumption that there are opportunities to be taken, even if not as many or as good as we would like, we are left with these principles:

  • We are trying (we think) to do the right thing and help people in need.
  • We think we could do better: the task is not impossible!
  • We can improve the quality of what we do.
  • We can improve the amount of what we do.
  • We can fill gaps in provision and co-ordinate better.

In each case, as is traditional, we could argue for more resources: more people, more time, more services. It’s a fair point, but it isn’t well supported by measurement. How much more would make a worthwhile difference? Governments have tried to escape that difficult question by using outcome-payment regimes. The results have not been exciting, and the debate produces more heat than light. Since we don’t know enough about what works (or we ignore, or simply don’t read, the evidence), we cannot argue for better resources or better contract specifications.

We can also see that the discontinuities between providers, contracts, service designs and client groups are unhelpful. But they all exist for some reason, good or bad. I suspect that a perfect world of smoothly co-operating agencies will be hard to achieve, and will carry costs. Part of the issue is that black-box commissioning and supply-chain delivery, which place responsibility in the hands of contractors, have not delivered on this score, as reported by the Step Up study and others. It’s a bit like asking drivers to design their own car while they drive it: nice in theory, but impractical.

Which brings us to the work of IEP members and affiliates. Do we do it right, and do we do enough? The Step Up evaluation repeats what we have known for many years: that the relationship between adviser, customer and employer is central to success. This is attested by all the well-conducted evaluations of successful programmes. Yet again, though, we get no information about what a good relationship looks like. The science and craft of employment advice is not evaluated, but is replaced with generalities, such as ‘Action plans were developed from …. an in-depth discussion about the participant’s prior experience, current needs and future aspirations. This information….(was) …. used to shape the intensity and type of support provided.’ That tells us very little about what is meant by ‘Action plans’, ‘in-depth’, or ‘intensity and type of support’. To move forward from here, we need to set out and test measurable behavioural standards and guidelines. Only then will we really know what works.

Our own studies have made some progress towards standards and towards measurement methods, and we are happy to share those with anyone. There is very much more to be done. I suspect that until we get away from the infamous black box, and into an acceptance that good practice needs more detailed study, sharing of methods and data, and precision in measurement, we are condemned to remain in a murky world of doubtful results, marketing hyperbole and nice-but-vague generalities.

Anyone for ‘glass-box consulting’?

David Imber PVRA FIEP is Principal of The Good Employability Company. The company exists to enable practitioners to deliver equality of employment opportunity. This is achieved through Research, Consultancy, Training and Evaluation services. We bring together professionally qualified work psychologists, respected vocational advisers, highly valued trainers and strategic leaders with many years’ experience of developing and delivering training, evaluating impacts, and providing counselling and employability support to improve outcomes.

www.good-ec.com

david.imber@good-ec.com