
JASPER LUPO'S SPAGHETTI CODE

Donald Michie points out the dangers of making computers impenetrable

FASHIONS in computer programming tend to reflect different social philosophies. Today it is the wholesome fashion to teach computer programmers to be super-disciplined. First the programmer defines the top-level problem — for example, how to decide when an expense claim is allowable. Then come the sub-problems which the program needs first to solve before it can solve the top problem — for example, does the claim come from a board member, an accountant, an executive, a salesperson or an engineer? Each of these has its sub-sub-problems, perhaps to do with computer verification of board membership, of accountancy status, etc. This discipline, called top-down structured programming, looks the same whether the object is to vet claims or to monitor the signals from inside a jet engine. If you believe that society is, as I was taught in my youth, an even-handed hierarchy of command and obligation, then the odds are that as a programmer you swear by this top-down style.
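
The style is easier to see than to describe. Here, purely for illustration, is a minimal sketch of the expense-claim example in the Python notation; every name and figure in it is invented, since no actual program is being described:

    # Top-down structured programming: the top-level problem calls
    # on sub-problems, which call on sub-sub-problems in turn.
    # All names and figures below are invented for illustration.

    def is_claim_allowable(claim):
        """Top-level problem: is this expense claim allowable?"""
        role = classify_claimant(claim)       # sub-problem one
        limit = spending_limit_for(role)      # sub-problem two
        return claim["amount"] <= limit

    def classify_claimant(claim):
        """Sub-problem: board member, accountant, or engineer?"""
        if verify_board_membership(claim["name"]):
            return "board member"
        if verify_accountancy_status(claim["name"]):
            return "accountant"
        return "engineer"

    def verify_board_membership(name):
        """Sub-sub-problem: computer verification of board membership."""
        return name in {"A. Chairman", "B. Director"}   # stand-in data

    def verify_accountancy_status(name):
        """Sub-sub-problem: computer verification of accountancy status."""
        return name in {"C. Auditor"}                   # stand-in data

    def spending_limit_for(role):
        return {"board member": 5000, "accountant": 1000}.get(role, 500)

    print(is_claim_allowable({"name": "C. Auditor", "amount": 800}))  # True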

But perhaps society is not like this at all. Maybe it is an arena in which special-interest groups and ad hoc alliances compete and bargain with each other, uniting occasionally against this or that group such as the EEC, the Soviets or the Japanese.

Users of programming languages called 'object-oriented' will instantly get the picture: a world of laissez faire — modelled by swarms of programmable abstract objects, each equipped with predictable properties and behaviours, 'inheriting' them to conform each with its own swarm, the whole scene animated by messages flying to and fro among the busy objects.
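
For the reader who wants the flavour in code, here is a toy Python sketch of that picture: classes standing for kinds of object, 'inheritance' passing behaviour down to each swarm, and messages flying among them. The classes and messages are all invented for illustration:

    # Object-oriented style: objects with properties and behaviours,
    # inheritance, and message-passing. Everything here is invented.

    class InterestGroup:
        def __init__(self, name):
            self.name = name

        def receive(self, message):          # messages fly to and fro
            print(f"{self.name} received: {message}")

        def bargain_with(self, other):
            other.receive(f"proposal from {self.name}")

    class TradeBloc(InterestGroup):          # inherits from its own kind
        def unite_against(self, rival):
            rival.receive(f"joint protest from {self.name}")

    eec = TradeBloc("EEC")
    lobby = InterestGroup("farm lobby")
    lobby.bargain_with(eec)                  # EEC received: proposal ...
    eec.unite_against(InterestGroup("rival bloc"))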

More worrying, though, is a growing fashion in software, which I call 'anarcho-nihilism'. Here the belief is that if a program's code is an impenetrable mess, then the guru who spun the tangled web has done well and must never be sacked. For then the employing institution would suffer the software equivalent of the Dark Ages, in which no one is left around who can understand anything any more, let alone maintain it. The prospect is terrifying: so the company does its best to keep its guru.

Let us now look upon the unacceptable face of this 'spaghetti code'. Possibly the majority of the world's software production is infected. It is not structured, not modular, often not even documented. What are its consequences?

Spaghetti code in the software industry is a far more serious public danger than salmonella in the poultry industry could ever be. I have in mind the monitoring and control of power stations, of large chemical plants, of air traffic flow, of military nuclear warning networks — in short any socially critical sector where a malfunction can precipitate situations of great jeopardy. It is at such times that overloaded engineers need to ask, 'How does this program work, and what at the moment is it trying to do?' Yet when such conditions do surface, operators, engineers and administrators find themselves stranded on the outside of a black box peering at hieroglyphs.

The transcriptions of numerous disaster inquiries have amply documented this. In the period 1971-87, the Savannah River nuclear plant in South Carolina suffered between nine and twelve emergency shutdowns per year. Robert Keller, the Department of Energy investigator who visited the plant after a recent emergency, reported that 'the operating staff and advisers, knowing that they did not understand what was going on inside the "tank" . . . failed to [shut it down] . . . did not vigorously investigate, and they didn't care! This attitude is also a prelude to disaster, as they found at Three Mile Island, with the Challenger and Chernobyl.'

What was needed, of course, was the kind of computing system only seen in experimental laboratories — a system which can chat with the operators in a way which helps them to understand what is going on inside the tank. Several years before Keller's report, the EEC's Fast programme (Forecasting and Assessment of Science and Technology) enabled me to investigate and document my suspicion that something was going badly wrong with the computing end of some of these precarious technologies.

Why, for example, was there a catastrophic fall of production at the Hoogovens automated steel plant in the Netherlands? It turned out that the computer calculations were so deep and mysterious as actually to demoralise the operators. They became so unsure of themselves that they left unmanned the pulpits used for control. Turning from steel automation to nuclear power, some of the features of the Three Mile Island reactor caricatured an age-old attitude of engineer to user. From a 1979 summary published in Nature we learn that gauges were placed 'so high that an operator cannot read them without standing on a footstool or on a wall opposite the location of related control levers.' In an earlier epoch the high priesthood had Latin as an even more effective way of veiling sacred information from the eyes of the laity.

Our report, 'Mismatch between machine representations and human concepts: dangers and remedies', was distributed in 1983 by the Fast agency, but without noticeable effect. Six years later one can only observe that episodes of dangerous misunderstanding between humans and machines continue to increase. Our key recommendation was: 'Certification should only be granted to systems which demonstrably augment the user's understanding of his task and its environment.' Otherwise accidents of the type observed at Three Mile Island will become common.

Operating staff would hardly lose motivation in the way described by Robert Keller if they could at any time just ask the system what it thought it was doing. But traditional software systems are strictly unaccountable. There is no facility to allow the user to ask the control program to give an account of itself in terms of its goal, and in general what it thinks it is doing. Not that developers find any particular difficulty in implementing facilities of this kind. They commonly provide them to ease their own work during development and testing — but strip them out again before delivering their handiwork.
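
What such a facility might look like is no mystery. The Python sketch below shows a toy control loop that can always give an account of its current goal when asked; the goals, limits and sensor readings are all invented for illustration, not taken from any real plant:

    # A toy 'accountable' controller: it records not only what it is
    # doing but why, and can report both on demand. All goals and
    # figures are invented for illustration.

    class AccountableController:
        def __init__(self):
            self.current_goal = "idle"
            self.reason = "no task assigned"

        def explain(self):
            """The facility usually stripped out before delivery."""
            return f"Goal: {self.current_goal}. Why: {self.reason}."

        def step(self, tank_pressure):
            if tank_pressure > 90:
                self.current_goal = "reduce pressure"
                self.reason = f"pressure {tank_pressure} exceeds limit 90"
            else:
                self.current_goal = "hold steady"
                self.reason = f"pressure {tank_pressure} within limits"

    controller = AccountableController()
    controller.step(tank_pressure=95)
    print(controller.explain())   # operator asks: what are you doing?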

Why do they not include these saving features in the delivered goods? Because it is not in the implementation specification to which they work. Why is it not? Because the systems analysts don't include it in their specification, which says how in detail the final system is to behave. Why do they not include it? Because it's not part of the requirements specification from the client, which says in broad-brush terms what he wants. Why is it not part of the requirements specification? Because the client usually has no idea that it is possible to build such accountability into complex software. So he doesn't ask for it.

By directing their procurement agencies to include the missing clause in the requirements specifications for all new systems, the Defence Ministry, the DTI and other departments of government could strike a path back from the brink. One should not underestimate the collective clout available. Fifteen per cent of the EEC's output is paid for by governments and public bodies.

What about costs? By comparison with the elimination of salmonella from the poultry industry, which Mr John MacGregor has informed Parliament cannot be done at all, banishing unaccountability from socially critical software would be trivially cheap.

But software inscrutability is not likely to be treated, as it should be, on a par with more tangible, or more smellable and tastable, forms of pollution. Aluminium poisoning makes fish die and people ill. Climatic changes from greenhouse gases can be measured as they develop. But to see the poison of inscrutable software at work we have to wait long intervals between rare catastrophes before we notice that these are becoming ominously less rare.

On top of all this, a new and more unsettling form of inscrutability is in the wings. A government agency, not a deranged corporation, leads the fever to unleash it; not, be it said, the British Government, but the US Defense Department. The Pentagon's Defense Advanced Research Projects Agency (Darpa) is preparing to throw the sum of $390 million into a technology announced by Darpa's spokesman, Jasper Lupo, as being 'as important as the atom bomb', namely neural networks. A neural network is a cross between a computer person's idea of a nervous system and a brain scientist's idea of a computer. The result is a computing system which shares with some brains and some computing systems the ability to self-modify. Unfortunately, when an initially obscure system starts to modify itself, it becomes more, not less, obscure.
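
A toy example in Python shows the point. The single artificial 'neuron' below rewrites its own connection weights with every example it sees; the training data and learning rate are invented for illustration, and the final numbers account for nothing a human reader can follow:

    # One artificial neuron trained by the classical perceptron rule.
    # It modifies its own parameters as it runs: self-modification
    # in miniature. All data below are invented for illustration.

    weights = [0.0, 0.0]   # the 'interconnections'
    bias = 0.0
    rate = 0.1

    examples = [((0, 1), 0), ((1, 1), 1), ((1, 0), 0)]  # inputs -> target

    for epoch in range(10):
        for (x1, x2), target in examples:
            output = 1 if weights[0]*x1 + weights[1]*x2 + bias > 0 else 0
            error = target - output
            # The system rewrites its own parameters; afterwards the
            # numbers explain nothing about its behaviour to a reader.
            weights[0] += rate * error * x1
            weights[1] += rate * error * x2
            bias += rate * error

    print(weights, bias)   # opaque figures, not an account of itself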

Just why Darpa's Tactical Technology Office might see in neural nets the future of machine intelligence is not clear. But in terms of sheer impenetrability, one similarity to the atom bomb can be conceded, namely the potential to introduce irrecoverable chaos into previously orderly situations. Jasper Lupo's military vision includes 'strategic relocatable target detection from satellite optical and infra-red sensors [which] will require 50 billion interconnections, half changing each second'.

As a computer person I find it hard to explain Mr Lupo's vision to the lay reader. It seems to be so fevered that I wonder if he has his imagination fully under control. He wants to have star-wars stations orbiting the earth, on which sensors for spotting enemy missiles are controlled by neural nets in a state of frantic self-modifying turmoil — an electronic brain changing half of its neural connections every second!

Fifty billion interconnections . . . If Mr Lupo gets his way the Pentagon will be circling the earth with adaptive spaghetti controlling the nuclear thunderbolts. Nor will the code just lie there. Every second, 25 billion interconnections will change, modifying the code in 25 billion places. Truly, to hark back to the title of our EEC report, there will be opportunity here for mismatch — with a vengeance.
