© 2024 MICHIGAN PUBLIC

Measuring the costs and benefits of retraining

Measuring the success of retraining programs used to be straightforward. You just looked at how many people got better paying jobs. Now the emphasis is shifting from how job seekers benefit to how taxpayers benefit too. That’s because some federal funds for workforce development are shrinking, and local agencies have to do more to make their case.

In the Midwest, we hear a lot about retraining. A lot of the money for retraining and other job services comes from the federal government, through the states, to local programs like this one in Jackson, Michigan.

Sparks fly as Ron Waldon grinds the surface off a steel block. Soon he'll learn to be a CNC operator, someone who can program computerized milling machines. It's a hot skill for a guy who's had a rough few years.

“Aw man, ups and downs. I lost the job, lost the house,” he says.

Temporary work here and there. Nothing steady.

“Suddenly, you’re just not part of society anymore without a job,” says Waldon. “I know I’m not the only one who suffers from the fact that you lose that independence or that self-worth, I guess.”

Todd Debenedet is also retraining.

“You can only mow the lawn so many times, you can only walk the dog so much,” he says. “And getting back to work and being, like he said, a productive society member would be very important.”

Personally important, for sure. But what is the economic impact for the public? There's a big debate right now about how effective workforce development programs are, how many there should be, and how involved government should get with budgets so tight. All of which led to a near-death experience for the main source of workforce development funding last year.

Ron Painter is CEO of the National Association of Workforce Boards.

“In the last budget cycle that was introduced, the House Republicans zeroed out the Workforce Investment Act. So that was a pretty clear signal that we had a lot of explaining to do,” he says.

It was a wake-up call for Christine Quinn, too.

“Well, after my stomach settled a little bit, I actually started saying, ‘What do we need to do?’” she says.

Quinn is president of South Central Michigan Works!, where 100,000 people from just three counties sought job services last year. Their Workforce Investment Act (WIA) funding survived, but not unscathed. And Quinn decided the old performance metrics (employment, wages, job retention) weren't enough by themselves. She wanted a tool to show whether benefits to the public outweigh costs to the taxpayer.

“Somebody wants to see what that dollar value is,” she says. “It’s not necessarily touchy-feely, it’s not the fact that you see somebody get a job who has been struggling for so long, which is important. But we also have to have the hard data too.”

South Central Michigan Works! just released a benefit-cost analysis of its programs for 2009. It says every public dollar spent should generate $1.22 in benefits over a decade.

The state of Ohio may go even further. According to workforce development officials there, Ohio’s WIA funding has been cut almost in half over the last five years, a loss of about $80 million. (That’s not including a large, temporary influx of cash from the stimulus package.)

Ohio recently completed a pilot project measuring return on investment for part of its dislocated worker program. This kind of study analyzes not just wages earned and program costs, but also wages sacrificed while participating in the program and reduction in unemployment compensation afterwards. Results from the small pilot showed that participants recouped their investment after two years and taxpayers after five years. Development officials hope to expand the analysis into a longitudinal study.
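The payback logic the Ohio pilot describes can be sketched in a few lines: tally each side's upfront cost against its recurring annual gain, and find the year the cumulative gain crosses the cost. All dollar figures below are illustrative assumptions for the sketch, not Ohio's actual program data.

```python
# Hedged sketch of the "recouped after N years" calculation described above.
# Every dollar amount here is an assumption chosen to match the pilot's shape.

def payback_year(upfront_cost, annual_net_benefit, horizon=30):
    """Return the first year in which cumulative net benefits cover the cost."""
    cumulative = 0.0
    for year in range(1, horizon + 1):
        cumulative += annual_net_benefit
        if cumulative >= upfront_cost:
            return year
    return None  # never recouped within the horizon

# Participant side: wages sacrificed while in the program, then an
# assumed annual earnings gain afterward.
participant_cost = 12_000   # hypothetical forgone wages during training
participant_gain = 6_500    # hypothetical annual earnings premium

# Taxpayer side: public program cost, recouped through extra taxes paid
# and reduced unemployment compensation.
taxpayer_cost = 9_000       # hypothetical public cost per participant
taxpayer_gain = 2_000       # hypothetical annual taxes plus UI savings

print(payback_year(participant_cost, participant_gain))  # → 2
print(payback_year(taxpayer_cost, taxpayer_gain))        # → 5
```

With these assumed figures the participant breaks even in year two and the taxpayer in year five, mirroring the pilot's reported timeline; the real study would also discount future dollars, which this sketch omits.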

When it comes to workforce development, return on investment studies can be difficult. Take it from Kevin Hollenbeck who’s done a bunch of them, including for Washington state.

“If I were administrator of the day, I would not do it by return on investment,” he says. “I understand I’m kind of shooting myself in the foot.”

Hollenbeck is vice president and senior economist at the W.E. Upjohn Institute for Employment Research in Kalamazoo. He says measuring return on investment involves a lot of assumptions about what would’ve happened to people if they had never encountered these programs. Like how much money they would have earned.

That means analysts and policymakers could end up comparing different things. He’d like to see more research on what kind of assistance really helps.

“In my work, we more or less treated the program like a black box,” he says. “People came in, something happened, and then there was a result. And the ‘something happened,’ we really haven’t done a lot of research on what’s best for whom.”

As for outcomes, Kevin Hollenbeck says a rigorous, consistent measurement of plain old earnings is the way to go.
