High-Tech Stakes Hike Peril of Bugs : Upgrading: In the race for more sophisticated software, there is a growing reliance on life and death applications, such as flying an airliner. - Los Angeles Times


ASSOCIATED PRESS

Take a look around. What keeps things going in your life?

Trees and water. Some grains. Grease and gas. And a bunch of software.

Without fanfare, software has become common not just in the machines we call computers but also cars, stereos, watches, even electric shavers.

When it boosts productivity, software is great. But, increasingly, a mistake or glitch in software causes more trouble than if the software hadn’t been there in the first place.

Examples of this trade-off are everywhere. On the small scale, a car may not start due to a bug in the engine control computer or a grocery clerk may be unable to check out customers when the system goes down.


More broadly, Denver International Airport’s opening was delayed for 18 months because software couldn’t control the 4,000 baggage cars that it was supposed to. And some missing lines of code caused a flaw in Intel Corp.’s Pentium chip that cost the company a few hundred million dollars.

“People are not even aware as they walk through their daily life how much they are at the mercy of what software designers have done,” said Stephen Rosenthal, a Boston University professor and co-author of a book on the rising influence of software programmers.

There is considerable pride among programmers that their craft walks the line between art and science. But there is also fervent debate about whether software would be consistently better if the craft were infused with the discipline seen in chemical or mechanical engineering.


Some experts believe so, but because software is bound less tightly by natural laws than chemistry or physics, progress toward more trouble-free software has so far been uneven.

“The major difficulty is that software engineering is tackling all of the problems of civilization,” said Bev Littlewood, director of the Center for Software Reliability at the City University of London. “If you look at the problems that civil engineers tackle, for example, they’re rather circumscribed. I mean, bridges are very much similar to one another.”

Scientific American, in an article last September dubbed “Software’s Chronic Crisis,” said software engineering must improve or “society’s headlong rush into the information age will be halting and unpredictable at best.”


Several leading software designers reject the notion of a crisis. But they admit the quality stakes are getting higher as software comes to be relied on for things that can spell life or death, such as flying a passenger jet.

“We’re in a bit of a crazy spiral here because technology-driven companies cannot get timid,” Rosenthal said. “If they get conservative and worry about what can go wrong, they aren’t going to promise the next breakthrough and their competitors will.”

That predicament is what Nathan Myhrvold, head of the advanced technology group at Microsoft Corp., describes as “the perspective that everything we do is the walking dead”--as soon as software is written, someone else comes up with something better, forcing everyone to keep writing more programs.

*

This is a pressure felt by most companies and institutions today, not just the so-called software industry, which actually produces less than 10% of the programming turned out in the United States every year.

For instance, the Federal Aviation Administration, after more than a decade of trying to improve the computers used for air traffic control, is on the brink of failure. Most of the $6.9-billion project was canceled or scaled back last year because of poor software. A decision on the project’s future is expected this month.

The bigger a software project is, the more likely it is to be delayed or canceled, Scientific American reported, citing a number of studies.


To understand why, it is useful to think of software as “distilled complexity,” said Charles Simonyi, a prominent software theorist at Microsoft.

The job of a typewriter, for example, has been transferred to a computer with software that can produce more versatile documents and do so with fewer moving parts.

“It’s not like we simplified the problem. We are actually trying to print something that is much more complex than before,” Simonyi said. “The difference is that all the complexity has been distilled and is expressed in the software.”

But having software do things in place of well-tested mechanical processes, such as those that fly a plane or protect a nuclear plant, may be very risky.

“Often the price that you pay is extra complexity and therefore quite a lot of uncertainty about how well the final product will actually behave,” said Littlewood, the software reliability expert in London.

As a result, many argue, software engineering has to improve.

“We now have to exhibit the same discipline that other engineers have had to,” said Larry Druffel, director of Carnegie Mellon University’s Software Engineering Institute. “Almost every engineering discipline at some point was an art before it had to step up to the challenges of being part of the marketplace.”


*

The institute, which is funded largely by the Defense Department, is a leader in the drive to improve software quality. A yardstick it developed, called the Capability Maturity Model, has come to be used by many companies outside the defense industry to measure their software design processes.

About three-fourths of all companies that have measured themselves are at Level 1, the lowest rating. Two programming teams have earned the highest rating, Level 5. One is a unit of Motorola Inc. that’s based in India and the other is the Loral Corp. group that designs code for the space shuttle.

The shuttle’s on-board computers were designed in the late 1970s and have just 1 megabyte of main memory for programs to run in. Most personal computers today have 4 megabytes of main memory.

That constraint has forced NASA and Loral to create very efficient software. Even so, a programming glitch held up a shuttle launch in 1993.

But better process doesn’t necessarily have to be shaped by ironclad rules.

“One of the things we do now is require developers to hold peer code reviews on any significant product change they make,” said David Moon, vice president of development at Novell Inc.’s WordPerfect unit. “There’s no management in the room. They all walk through the code and they make comments and say, ‘Look, you can do better if you do this.’ That has helped a lot.”

*

A development to watch is object-oriented programming, in which components known to be reliable can be reused for different purposes. A bank, for instance, can use the same portion of code for taking a customer’s address on a loan application as it does on a checking account.
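The bank example can be sketched in a few lines of modern code. This is a minimal illustration, not any bank's actual system; all the names here (Address, LoanApplication, CheckingAccount) are hypothetical. The point is that the address-handling component is written and tested once, then shared by both products:

```python
# A minimal sketch of object-oriented reuse, in the spirit of the bank
# example: a hypothetical Address component written (and tested) once,
# then reused by both a loan application and a checking account.
from dataclasses import dataclass


@dataclass
class Address:
    """A reusable component for capturing a customer's address."""
    street: str
    city: str
    postal_code: str

    def label(self) -> str:
        # One formatting rule, maintained in one place.
        return f"{self.street}, {self.city} {self.postal_code}"


@dataclass
class LoanApplication:
    applicant: str
    address: Address  # the same component ...


@dataclass
class CheckingAccount:
    holder: str
    address: Address  # ... reused here, not rewritten


home = Address("12 Elm St.", "Springfield", "01101")
loan = LoanApplication("A. Customer", home)
account = CheckingAccount("A. Customer", home)
# Both products use one reliable piece of code for the same job.
print(loan.address.label())
```

If the address-formatting rules ever change, they change in one component rather than in every product that handles addresses.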


Such reuse is already a key efficiency measure for software writers.

Microsoft’s Simonyi is working on an innovation for software reuse called “intentional programming,” in which a person can more skillfully impart what he or she is trying to do with a particular piece of code.

Programmers who reuse code now don’t know why the original code was formed the way it was. They must use it without much change, a condition Simonyi compares to being able to read Shakespeare’s “Hamlet” only from old books with outdated spelling.

“That’s not what ‘Hamlet’ is. It’s not the paper. It’s not the spelling. It’s the author’s intention,” he said. “That’s why intentional programming is what’s invariant.”
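The problem Simonyi describes can be shown in miniature. This sketch is not his intentional-programming system, only an illustration of the gap it aims to close: reused code often preserves the result of a decision but not the reasoning behind it, and all the names and values below are invented for the example.

```python
# Intention lost: a later reader inherits only a "magic number" and must
# reuse it without knowing why it was chosen.
TIMEOUT = 90

# Intention kept: the reasoning lives in the code itself, so the value
# can be safely recomputed when the underlying assumptions change.
NETWORK_RETRIES = 3
SECONDS_PER_ATTEMPT = 30
TIMEOUT_WITH_INTENT = NETWORK_RETRIES * SECONDS_PER_ATTEMPT

# Same value either way; very different prospects for safe reuse.
print(TIMEOUT == TIMEOUT_WITH_INTENT)
```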
