In December, doctors at a Veterans Affairs (VA) hospital in Oregon decided to admit an 81-year-old patient. He was dehydrated, malnourished, plagued by skin ulcers and broken ribs - in the medical professionals' opinion, he was unable to care for himself at home. Administrators, however, overruled them.
Was there no bed for this poor man? No, the facility had plenty of beds; in fact, on an average day, more than half of the beds are empty, awaiting patients. Was there no money or medicine to care for him? No, and no.
Reporting by the New York Times suggests that Mr Walter Savage was, perversely, turned away because he was too sick. Very sick patients tend to worsen the performance measures by which VA hospitals are judged.
If this had happened in isolation, we could simply gape at the monstrosity that bureaucracies are occasionally capable of. But such examples abound in healthcare across the United States. For example, in the 1990s, New York and Pennsylvania started publishing mortality data on hospitals and surgeons who did coronary bypasses. The idea was that more informed consumers would steer themselves towards the teams with the better statistics - theoretically good for patients, bad for slacking providers. The reality, however, was less ideal: In those states, surgeons seem to have started doing more operations on healthier patients, while turning away the sickest ones who might otherwise have benefited.
From this we can take a few lessons. The first is one that has been well-known to other sorts of businesses: What you measure is what you get, not necessarily what you want. In fact, if your measurement is badly designed, you may get a great deal of something you don't want.
To illustrate that, look at Wells Fargo, which recently paid a whopping fine because a badly designed compensation system encouraged low-level staff to muck around with customer bank accounts. These machinations generated effectively no revenue for the bank, and annoyed customers, but they did generate income for the employees - and eventually, a stinging, expensive rebuke from the Consumer Financial Protection Bureau.
Or take an example from my own early employment history: I once temped for a firm where some overzealous office manager had decided to crack down on office supply leakage by issuing an edict that employees could take only one pen, notebook and so forth at a time. To be issued another from the locked supply room, you had to show the one you'd used up.
Did this save the firm money? Well, it spent less on pens. But as you can imagine, eventually someone lost a pen. Naturally, they stole someone else's pen. That person then began prowling neighbouring desks for a replacement. By the time I arrived, normal business activity seemed to have given way to full-time careers in petty theft. Even at minimum wage, it seems unlikely this expenditure of human resources was a net gain to the firm.
I could reel off examples endlessly: purchasing managers who have cosy arrangements to buy a certain amount of product from their vendors in December, and ship it back in January, in order to help some sales director make quarterly targets... universities that compete to turn away as many students as possible, because doing so makes them rise in the US News rankings... law schools that hire their own graduates for temporary make-work jobs in order to boost the schools' employment statistics. All metrics will be gamed, and the games always have costs. And when the metrics involve our health, those costs can be very high indeed.
Which brings me to the second lesson we can draw from this experience: Healthcare is particularly ill-suited to management-by-measurement.
It's no accident that so many of the bad examples I offered above come from healthcare and education. Most companies are dealing with reasonably standardised inputs, which can be turned into measurable outputs. But the less you deal with things, and the more you deal with human beings, the less useful productivity metrics are. Human bodies and human minds are both highly variable and immensely complicated. When you are working on them, it is hard to know how much of the final result stems from your labour, and how much can be credited to the qualities of your initial starting material.
So when we measure outputs, we are getting at best a very distorted picture of the value of the services provided. Modern industrial management is simply not designed for this sort of situation. If you feed human inputs into a machine system, you are quite likely to grind up the humans in the process.
This should give us pause, because we have embarked on a great era of healthcare rationalisation. The cry of the wonks who designed Obamacare was that we wanted to pay for health, not treatment - outcomes, not inputs. That cry was echoed by pundits and politicians across the land. And who could dispute such a laudable, obvious goal?
Well, someone who noted that doctors don't actually have all that much control over outcomes, certainly far less than we would like them to. By trying to pay for something they couldn't control, we ensure that they will try to regain control by any means necessary. By trying to rationalise the system, we may easily make it less rational still.
• Megan McArdle is the author of The Up Side Of Down: Why Failing Well Is The Key To Success.