A conversation with a friend over brunch reminded me of a really thought-provoking book I reviewed a few years back, namely Gary Miller’s [book: Managerial Dilemmas: The Political Economy of Hierarchy].
The basic idea in Miller is that, if your organization (company, team, university, whatever) is judged on the basis of how it performs overall, then everyone has an incentive to slack and let everyone else do the work. But since everyone is subject to the same incentives, everyone slacks and the whole thing goes to shit (technically speaking).
Likewise, your company has every incentive to screw you and not, say, invest in educating you; after all, why pay for you to get a master’s degree if you’re just going to take their investment to another job?
Somehow both sides need to agree to disarm: the company needs to credibly (and in a certain sense irrationally) signal that it’s going to support its employees, even though it has no guarantee that they’ll reciprocate; and employees need to credibly (and in a certain sense irrationally) signal that they won’t slack, even though they have no guarantee that the company will reward them.
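The standoff described here is the classic prisoner's dilemma in game theory. A minimal sketch of the payoff logic (the numbers are illustrative, not from Miller's book; only their ordering matters):

```python
# Illustrative prisoner's-dilemma payoffs for the employer/employee standoff.
# The values are made up; only their ordering matters: T > R > P > S
# (temptation > reward > punishment > sucker's payoff).
T, R, P, S = 5, 3, 1, 0

# payoff[(my_move, their_move)] -> my payoff
payoff = {
    ("cooperate", "cooperate"): R,  # both invest / both work hard
    ("cooperate", "defect"):    S,  # I invest, you slack or jump ship
    ("defect",    "cooperate"): T,  # I slack while you carry me
    ("defect",    "defect"):    P,  # mutual distrust, nobody invests
}

def best_response(their_move):
    """Whatever the other side does, defecting pays strictly more."""
    return max(("cooperate", "defect"), key=lambda m: payoff[(m, their_move)])

assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual cooperation beats the equilibrium both sides are driven to:
assert payoff[("cooperate", "cooperate")] > payoff[("defect", "defect")]
```

Since defection is each side's best response regardless of what the other does, rational play lands both on the worst joint outcome, which is why the "irrational" credible commitment (or an outside enforcer) is needed.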
Turns out it’s a hard problem. I don’t recall Miller talking about this at all, but it seems clear to me that government has a role to play here: since companies can’t be trusted to supply me with a pension that will help me in my old age, let’s make Social Security really good. And since companies can’t be trusted to pay for my master’s degree, let’s have the government subsidize advanced degrees. There are obvious problems with this, but it’s not clear that they’re worse than the economy as she already works.
I think this is the logic behind Germany’s Mittelstand. It pays for employers collectively to have a trained workforce, but not for any individual employer to pay for the training. It helps that the Mittelstand companies are geographically clustered.
But the government already subsidizes advanced degrees – both by providing cheap student loans and through tax-advantaged status given to the universities that grant the degrees.
Of course, this subsidy is spread across people getting both productive and non-productive degrees. Whereas, in the limited experience I have with companies paying for their employees’ further education, they mostly pay for very specific training or credentials, e.g., a chemicals company sponsors some chemists to take a summer course in polymer synthesis at a nearby university, or sends a chemist to get an MBA with an eye towards them becoming a lab manager.
My point is mostly that companies will usually have a better idea of what they wish their employees could do than the government does. And the employee certainly has a strong idea of what advanced degrees they would like, and are capable of, getting. The government, by contrast, has expressed no clear preferences about which degrees individuals obtain. So I suppose we could implement some scheme where the government matches employer or employee money spent on training or further education, trusting the employer and employee to make a good decision about which degree the employee should get.
Although, employer-provided further education is partially tax-deductible [for the employer], so that’s more or less the same thing.
tl;dr: the gov’t seems to already do this
“…the company needs to credibly (and in a certain sense irrationally) … and employees need to credibly (and in a certain sense irrationally) … ”
Irrationally? There is a well-known, historically proven, completely rational way to solve this problem: an apprenticeship contract. The young unskilled worker agrees to a five-year (or whatever) contract. The employer agrees to both pay and train the worker. If the worker leaves early, they have to pay a considerable penalty.
I’m not exactly sure why the apprenticeship system disappeared. The rise of subsidized college and legal college-degree requirements for employment must have played a role. And I also think the legal environment may have changed, making non-competes and other aspects of an apprenticeship contract much more difficult to enforce.
That said, a variant of employers paying for further education does still happen. Investment banks and consulting shops will sometimes pay for an MBA if the worker promises to return to the company for five or six years; if they do not return, they must pay for the MBA themselves. I think the reason this practice is not widespread is that most master’s degrees just are not that valuable on the job.

For me, a master’s in computer science is not worth the two years of lost pay, even if the company subsidizes it. Nor do I have any interest in doing a full-time program at night outside of work – a coworker did that and it was hell (and it wasn’t good for the company either, because he wasn’t as productive). If my current company paid for a worker to take two years off to do a master’s, that worker would actually lose ground and become less valuable, because they would not be in the line of fire playing with the latest tools (hadoop, hbase, puppet, etc.) for building scalable web apps.
From what I have read and observed, supporting apprenticeships seems far superior to subsidizing vocational programs and master’s degrees. Vocational programs are quite expensive because you have to duplicate the tools of the work environment – it’s not cheap to build an auto-body shop. The most affordable way for students to access the tools is to actually learn in the work environment. And in some cases it may be impossible to get the right tools into the classroom – there is no way to simulate in a classroom the skills needed to manage a 30,000-server network. Furthermore, coworkers and mentors on the job are much closer to the real problems of the job, while academics tend to be quite out of touch. Finally, note that subsidies for college education end up getting priced into tuition, so the subsidy does not help the students. (I would be perfectly OK with a government policy requiring that all universities wishing to access loan subsidies and tax-free non-profit status must obey limits on tuition…)
When my girlfriend went through a master’s program in architecture I saw a prime example of the ivory tower effect. A master’s degree in architecture is required to become a licensed architect. The schools have abused this legal gatekeeper position to get away with teaching nothing of value, existing simply to provide sinecures for bloviating professors. The students learned almost nothing practical in the program. All the structures, materials, construction, and other practical skills are learned via self-study after graduation or on the job. And the aesthetics and design philosophy the professors instilled in the students were, IMHO, a crime against the public. Compare the old Boston City Hall ( http://en.wikipedia.org/wiki/Old_City_Hall_(Boston) ) to the new one. The old one was designed by two architects, one with no college education and the other with only a few years of it. The new one was designed by Columbia professors. Or, for example, look at this new prize-winning building designed by the professors at my girlfriend’s university – http://architypesource.com/img/uploaded/projects/77/dormitory-erdy-mchenry-radian-univeristy-lg-08spg.jpg The professors were caught up in academic fads and paid no attention to how the building would look in its environment years after it was built, when all novelty has worn away.
If you want a government solution to worker training, I would have a regulation that a) legalized and standardized apprenticeship contracts for any profession, in a way that is fair to workers and companies, and b) mandated (now that mandates are legal :-) ) that all companies, from sole proprietorships to major corporations, participate in the program and accept apprentices in proportion to the size of the company.