
If I should die, blame the top five performance indicators

The will to live is a wonderful, if at times inexplicable, thing. Mine deserted me midway through this assignment. It just went. I was logged on to the Audit Commission's website. After a moment or two, this thin, silvery-grey, wraith-like vapour exited through the top of my head, and suddenly I was half in love with easeful death. So that's the will to live gone, then, I thought to myself.

It had threatened to depart several times before. That's the problem with researching an article about 'performance indicators'. In the end, they'll smother you.

But there were laughs along the way, at least. My thesis — that performance indicators are either useless or counterproductive — took me to strange lands inhabited by alien beings, but I was able to compile for you my top five utterly fatuous performance indicators — examples which, even by the high standards of the genre, stood out proudly.

Let's start with the health service. You may have read about the performance indicators, or targets, that they have in the Department of Health. The doctors hate them and are in revolt. Just this week it was revealed that patients are banned from booking advance appointments with GPs so that surgeries can meet the bloody stupid 48-hour waiting-time target. That's what I mean by counterproductive: it has had, ineluctably, the opposite effect to that intended. The British Medical Association has described the performance indicators as 'obscene'. Its boss, Dr Ian Bogle, has suggested that doctors and managers have been forced to collude in the manipulation of figures to the detriment of patients. He is right — unequivocally so — but he should lighten up a bit. He ought to see the funny side.

Hundreds of performance indicators have been imposed on hospitals and surgeries and some of them are very humorous indeed, even if their effect is to make people more, rather than less, ill. Stuff like the Day Case Rate: 'This indicator shows the percentage of in-patients treated as day cases. This is one indicator of the effective use of resources. Casemix adjustment takes account of variation which can be attributed to differences in patients being treated.'

You'll be pleased to learn that the performance indicator for this has risen from 63.6 per cent to 64.9 per cent recently, according to a 'day-case rate for a basket of procedures'. But what, you might ask, will be the effect in the real world of an indicator that pressures hospitals into discharging in-patients very quickly indeed? Will it be a beneficial effect, in the end?

But this is not my favourite Department of Health indicator. My favourite — and straight into my indicator chart at number five — is the Data Quality indicator. 'This indicator provides a measure of the quality and reliability of data underlying many of the performance indicators and serves as a proxy for assessing general quality of data ...'

Yes, that's right; you've got it. We glide smoothly into a profoundly surreal realm with a performance indicator for the performance indicators. And you'll be delighted to know that, once again, it went up last year. From 91.5 per cent. To 91.6 per cent.

But how do they know that the performance indicator of the performance indicator is sufficiently reliable? Shouldn't someone set up a performance indicator for whoever is charged with the task of setting performance indicators for the performance indicators? Come on, you're getting a bit lax there, guys.

Let's move on. Let's have a look at the arts. And here we have a bunch of indicators dreamt up by the Audit Commission, the Local Government Association, the National Association of Local Government Arts Officers, the Chief Cultural and Leisure Officers Association and the Department for Culture, Media and Sport, in terrible collusion with each other. I never knew that half those people existed. But they do, they do. And they have come up with a vast sackload of fatuous, entirely non-indicative indicators, under 19 separate subheadings, designed to run local-government arts projects better. Beginning with this: 'The framework initially identifies standards of service provision and supports the flexible self-evaluation of arts services, locating them more securely within the framework for comparative performance assessment.'

I must confess, I haven't a clue what any of that means — in theory or in practice. And there's worse to come. My computer broke down trying to download all of their indicators. It just went zzzp and ppphht and the screen went blank. I tried to reboot the thing, but it went zzzp again in a resigned, plaintive manner, so I left it alone. I tentatively peeked around the door later, and it was quietly weeping.

I'd got as far as indicator number eight from section one of the 19 subheadings, which was: 'The service is accountable and inclusive with a clear sense of target groups (as defined by local priorities) and relevant and appropriate for local residents.'

Accountable and inclusive, huh. But no mention, anywhere, of drama, music, dance or film.

The will to live left when I checked out the performance indicators invented by the Commission for Racial Equality and the Audit Commission, a pretty lethal combination, I think you'll agree. There's a long and doleful and apparently non-satirical essay there about how local councils just haven't matched up to the four and a half billion indicators imposed on them. This, at least, was good news. Maybe they've failed because they don't understand a mission statement which begins with the words, 'Mainstreaming equality and diversity — integrating equality and diversity into day-to-day work and translating policy into practicemonitoring [their word] performance data.' Or frequent references to 'top-level drivers', whatever the hell they might be.

There was one interesting little nugget buried away in this horrible document, though. They ranked the various tiers of local government according to how slavishly they'd complied with the performance indicators. The least compliant were the district councils; the most compliant, the London boroughs. Then they polled local people about the councils they were most satisfied with. Guess what? The voters were most happy with the recalcitrant district councils and least happy with the London boroughs. Someone give me a statistician: I think we have a correlation here.

Here's another one for you. As a result of Gordon Brown's Public Service Agreement, the Foreign Office has a performance indicator that requires it to reduce the number of poppies grown each year in Afghanistan. There's someone with a clipboard in Kandahar even now, counting.

And — I promised you five — how about this: the Deputy Prime Minister's office requires local authorities to increase 'public participation' by 5 per cent. It does not say how or why or what it means by public participation. But I bet the councils report back and say yep, done it. We've increased public participation by 16 per cent, would you believe it.

Meaningless when not actually damaging, redolent of an overweening desire for central control and choking in their fecundity, performance indicators are the authoritarian fetish of the decade. New Labour loves them. Any political party which promises to abolish them all — and the LibDems come closest on this score — will get my vote next time round.