Published in Colloquy, Vol. 14, No. 1, May 1993, pp. 14ff

by the

Security Affairs Support Association

Annapolis Junction, Maryland



Daniel J. Ryan




On August 23, 1992, Hurricane Andrew unleashed 120 mph winds on South Florida, with devastating results. Eighteen schools were destroyed. One hundred and fifty public housing buildings suffered severe damage. Thirty-one libraries had to be closed. Forty fire stations were ruined. Eighty-seven thousand homes were significantly damaged or beyond repair, and over one billion dollars in crops and livestock were lost. The hurricane then crossed the Gulf of Mexico and struck Louisiana with lesser but still damaging force. Rebuilding will take an estimated three to five years, and the total bill will run somewhere between thirty and forty billion dollars. Since the United States has the largest economy in the world, Hurricane Andrew was not a knockout blow, or even a crippling one -- but it hurt. A larger disaster, one that did permanent or at least long-lasting damage to the economy, could rise to the level of a threat to the Nation's economic security. A larger storm, a high Richter-scale earthquake in a densely populated area, or a failure of information systems security could produce such a calamity.


Information systems security is the discipline which protects the confidentiality, integrity and availability of information and the computers, systems and networks that create, process, store and communicate information. The need for protection of the confidentiality of military and diplomatic information or of information that is privileged, proprietary, or private is understood by most people. Computers are designed, implemented and operated to assure that only authorized people can gain access to confidential data. Communications networks use powerful cryptography to ensure that only the intended recipients of messages can read them.

The need to protect the integrity of information is equally important, even when confidentiality is not an issue. Information that cannot be trusted is worse than useless -- it costs money and time to create and store, but provides no benefit. A data base that is only slightly tainted may require extensive resources to correct and validate, if it is possible to recover at all. Similarly, information which is not available when required is of no use, even if its confidentiality is secure and its integrity intact. Systems and networks that are not on-line when needed not only waste the money they cost; the people and companies that depend on them may be irreparably damaged by operational shutdowns or loss of revenue.

Thus, information systems security has both direct and indirect economic consequences. It costs money, time and other resources to protect information and systems, but there are real and significant costs if we fail to provide that protection adequately and loss of confidentiality, integrity or availability results. Because we are highly dependent on information and the ability to readily retrieve, process, analyze and communicate it, we are highly vulnerable to its misuse, corruption or loss. Fortunately, we have not yet experienced a catastrophic single failure of information security with economic consequences comparable to those of Hurricane Andrew, but it is useful and enlightening to extrapolate potential damage both from the failures that have occurred and from the known economic consequences of closely related failures of systems reliability. In extreme -- but not so improbable -- cases, analysis shows that the economic consequences of failure to protect information could approach or even exceed the amount of damage caused by Hurricane Andrew, directly affecting the well-being and security of the Nation. Conversely, by assuring integrity and availability, and confidentiality where appropriate, we can provide a real competitive advantage to our Country's commerce, in addition to protecting the interests and assets of Government and industry in the form of information and the information systems that are expensive to create and maintain and which represent a significant infrastructure investment.


Some information is meant to be available to everyone. Journalists, publishers, librarians and others spend their careers providing public access to a variety of data, news, fiction, and other sorts of information. But some information is intended to be shared only with a selected audience. Such information might be of a personal nature - information that is not considered to be of wide interest, or which might, perhaps, be embarrassing if generally known. Still other information could be dangerous to the person or group who owns it if it were widely distributed. Such information includes business information that might be used by competitors to the detriment of the company or organization to which the information applies. "Privileged" information, like that between doctors and their patients or attorneys and their clients, is protected in order to encourage free and open communications within protected circles. The national security and law enforcement communities protect the identities of agents or confidential sources of information. And, of course, military and diplomatic information -- so called national security information -- is carefully protected in the national interest.

Disclosure of some sensitive information might result in embarrassment or inconvenience. At the other extreme, disclosure of classified information could have serious detrimental consequences for delicate diplomatic negotiations, markedly degrade the effectiveness of expensive high-technology surveillance or weapons systems, or impair the success of military operations. For each of these cases, and at every point in between, there are potentially significant economic consequences of failure to protect the confidentiality of information. In some of the worst possible cases, our high-technology surveillance and reconnaissance systems could be rendered useless -- or worse, misleading -- if their capabilities and limitations were known. Billions of dollars would have been wasted on their development, bad decisions made based on tainted information they gathered, and invaluable lives of U. S. personnel needlessly lost.

As the Cold War and the threat of nuclear confrontation fade, technological and economic intelligence take on increasing importance. While the Commonwealth of Independent States remains a region of great concern, there is increasing awareness of the dangers inherent in technology transfer to third world nations and the proliferation of nuclear, chemical and biological weapons and associated delivery systems. Thus, compromise of information developed in this Country and by our allies that permits us to build and operate highly accurate weapons, such as those used in the Gulf War, and weapons of mass destruction is of great concern. Knowledge about our smart-weapons systems could result in their being avoided or met with effective countermeasures, so that billions of dollars in development, production and operations costs would have been spent for no gain, and battles or wars lost. On the commercial front, loss of confidential information concerning financial and trade issues or proprietary technological developments of commercial importance could harm our Country by reducing our competitive edge. Information about environmental conditions and natural resources may also need protection.

The threat comes not just from our former enemies, but in many cases from our erstwhile friends. The French intelligence service, the Direction Générale de la Sécurité Extérieure (DGSE), has been found using the traditional espionage techniques originally developed to spy on the Soviet bloc to obtain trade secrets from foreign business executives traveling to and in France. Trade secrets so garnered have been passed to French industrial firms, which have used them to vie successfully for competitive awards. Other foreign intelligence services have also mounted operations aimed at obtaining U. S. technology secrets. Techniques employed range from intercepting fax, voice and telex communications to bugging hotel rooms and aircraft seats, from stealing a company's trash to bribing its employees. The costs of failing to protect the confidentiality of our information assets can easily mount to staggering sums. Opportunities and contracts lost to foreign competitors mean lost revenues, worsening trade balances, increasing unemployment, and a declining standard of living.


While the economic consequences of loss of confidentiality may be severe, loss of information or degradation of its value may be even more catastrophic. In 1985, a New York bank had software problems that resulted in modifications to its transaction records. To cover its accounts while it diagnosed and corrected the problem, the bank had to borrow over $23 billion, at a cost of some $5 million in interest. Loss of the integrity of data bases, software, and systems in all sectors of the economy can have both economic and safety consequences. Consider that the air traffic control system, stock transactions, financial records, currency exchanges, Internet communications, telephone switching, credit records, credit card transactions, management information systems, office automation systems, the space program, the railroad system, hospital systems that monitor patients and dispense drugs, manufacturing process control systems, newspapers and publishing, the insurance industry, power distribution and utilities all depend on computers. The law enforcement community also relies heavily on the integrity of information and information processing systems.

Integrity of information can be threatened by a variety of means, including physical destruction of the systems that create, process and communicate information, or destruction or erasure of the media containing the information. Destructive programs called "logic bombs" may be placed in data processing systems and networks, where they lie in wait for either a specified set of conditions or the passage of a specified length of time. Then they wake up and destroy the information in the computer or its data storage peripherals. Take, for example, the case of a young programmer who placed a logic bomb in his company's personnel system. The malicious program checked periodically to see if his name was still on the list of employees. When he was fired, his name was removed from the list and the logic bomb destroyed the company's personnel records. In another case, on April 11, 1980, many IBM 4341 mainframe computers shut down due to a logic bomb that had been planted by an unhappy employee. In yet another instance, a medical center lost nearly 40% of its data to malicious software.
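The trigger logic of such a program is trivially simple, which is part of what makes logic bombs hard to find in thousands of lines of legitimate code. The following Python sketch (all names invented, and with the destructive payload replaced by a harmless message) illustrates the personnel-list check described above:

```python
# Hypothetical sketch of a logic bomb's trigger logic. All names are
# invented; the destructive payload is replaced by a harmless string.

def should_trigger(employee_name, personnel_records):
    """The bomb 'wakes up' when the programmer's name vanishes from the list."""
    return employee_name not in personnel_records

def periodic_check(employee_name, personnel_records):
    if should_trigger(employee_name, personnel_records):
        # In the incident described above, destructive code ran at this
        # point and erased the company's personnel records.
        return "TRIGGERED: payload would destroy personnel records"
    return "dormant"

print(periodic_check("J. Programmer", ["A. Manager", "J. Programmer"]))  # dormant
print(periodic_check("J. Programmer", ["A. Manager"]))                   # triggered
```

Nothing about the check itself looks malicious; only the payload distinguishes a logic bomb from routine housekeeping code, which is why code review alone rarely catches one.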

Computer viruses may also destroy data. The news media have widely reported the discovery of the Michelangelo virus, a time bomb set to go off on March 6 (Michelangelo's birthday) of any year and destroy the contents of a personal computer's hard disk. Thousands of viruses are known, many of which destroy data, and more are appearing at an estimated rate of twelve per day. As our systems become more interconnected and interoperable, the question becomes not whether your system will become infected, but how soon and how often. Over $100 million was spent by U. S. industry to avoid the effects of the Datacrime virus scheduled to wreak its destruction on Columbus Day in 1989. The Congress spent over $100,000 to repair the effects of a single virus attack. In 1991, viruses are estimated to have caused $1.077 billion in damage. The widespread and rapid proliferation of personal computers in homes, offices and schools prevents a precise measure of the actual damage.

Malicious programs which corrupt or destroy information may not simply delete files or erase disks. In one case, the management information system used to guide a major corporation was changed by the manager of Research and Development so that the data displayed in the President's spreadsheets were altered by a few percent. The R&D manager hoped that the President, relying on his computerized analyses, would make bad decisions, be fired and the R&D manager would be chosen to replace him. As it turned out, the malicious code was detected and it was the R&D manager who lost his job.

Such attacks are especially hard to detect. One datum might be changed at random every few days. Since changes may be slight, they are not as obvious as missing files or mangled data. If alterations occur over long periods of time, even periodic backup processes may not avoid their effects. Enormous amounts of money and time may be required to recreate a corrupted data base, if it is possible to repair the damage at all.
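One defense against this kind of slow, subtle corruption is to record a cryptographic checksum of each record at a trusted point in time and compare the stored digests against the live data later; even a one-character alteration changes the digest completely. A minimal sketch of this approach in Python (the record names and values are hypothetical):

```python
# Sketch of integrity checking via cryptographic hashes -- one common
# approach, not the only one. Record names and values are invented.
import hashlib

def digest(record: str) -> str:
    """Return the SHA-256 digest of a record's contents."""
    return hashlib.sha256(record.encode("utf-8")).hexdigest()

def baseline(records: dict) -> dict:
    """Compute trusted digests for every record at a known-good moment."""
    return {key: digest(value) for key, value in records.items()}

def altered_records(records: dict, trusted: dict) -> list:
    """Return the keys of records whose digest no longer matches."""
    return [k for k, v in records.items() if digest(v) != trusted.get(k)]

records = {"acct-1001": "balance=5000.00", "acct-1002": "balance=120.50"}
trusted = baseline(records)

records["acct-1002"] = "balance=129.50"   # a subtle, single-digit change
print(altered_records(records, trusted))  # ['acct-1002']
```

The scheme works only if the digests themselves are stored where an attacker cannot rewrite them along with the data, and it localizes damage rather than preventing it; but it turns an invisible one-datum change into a detectable event.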


Most people don't think about, and many are unaware of, the fact that our phone system -- the so-called public switched network -- is in reality a computer system. When we pick up our handset to make a call, it isn't obvious that modern phones are themselves computers, connected to other computers for local calls via the local switching office, and from there via trunk circuits to other switching offices around the world. With the exception of a small number of rapidly disappearing electromechanical switches in low-density rural areas, all switching and control functions are carried out by computers. En route between switching computers, calls may traverse copper wires, coaxial cables, microwave radio links, fiber optics cables, and satellite up- and down-links. Despite this complexity, the phone system in the United States is one of the most reliable systems in the world.

Even so, at 2:25PM Monday afternoon, January 15, 1990, the AT&T long distance network comprising 114 switching centers, each containing a main and a backup computer to ensure that the system could handle every conceivable problem, began to falter. Within minutes, more than half of all calls being attempted by AT&T customers were answered by a recorded message informing the caller that, "All circuits are busy. Please try again later." Not until 11:30PM, some nine hours later, would the network return to normal service.

The economic consequences were significant. AT&T estimates that lost calls alone cost it $75 million. Of 138 million long distance and 800-number calls attempted, some 70 million were rejected by the faulty system. Many of those calls were business calls, and the failure to connect cost those businesses directly, as orders went unplaced and operations were delayed or halted altogether. There were indirect costs as well due to decreased efficiency and productivity. Some businesses, like the New York Stock Exchange, had arranged for backup service and so were less affected; other businesses which had not had the foresight to buy backup service were out of business or severely hampered. Airlines, hotels and car rental companies lost reservations. Phoned catalog orders were not placed. Service companies could not support their customers. Undoubtedly some of the revenue those companies lost was gained by other companies that didn't use AT&T, but some was lost forever. The total economic consequences? Unknown and probably unknowable.

Unfortunately, the January 1990 incident was not an isolated case. On June 10, 1991, more than a million Pacific Bell customers in the Los Angeles basin lost phone service for ninety minutes. Soon thereafter, on June 26 and 27 of the same year, ten million phones in four widely separated U.S. cities went down. More millions of dollars were lost.

In the world of computer systems and networks, too, there are analogies to the outages of the telephone system. The Internet is a worldwide network consisting of over 5,000 subnets, each of which connects from one to dozens of computers and terminals, allowing any user to communicate with any other user on the net. On November 2, 1988, a small program appeared on computers connected to the Internet. This program was a "worm" -- a program (much like the computer viruses that have been widely publicized) that makes copies of itself and sends the copies along to other computers on a network. The copies make copies in turn and send them along, and the copies' copies make copies, and so on. The result is like a chain reaction in nuclear physics, and in short order the network was so busy creating and sending copies of the worm that it couldn't do anything else. In a wide-area network like the Internet, upon which thousands depend, the consequences were serious.
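The chain-reaction arithmetic is what makes a worm so dangerous: growth is exponential until the network saturates. A toy Python model (the fan-out and population figures are invented purely for illustration) shows how quickly a single copy can fill a network:

```python
# Toy model of worm-style replication. Parameters are illustrative only:
# each infected host infects `fanout` new hosts per round, capped at the
# total population of reachable machines.

def infected_after(rounds: int, fanout: int, population: int) -> int:
    infected = 1  # the worm starts as a single copy on one host
    for _ in range(rounds):
        # every copy spawns `fanout` more, until no hosts remain
        infected = min(population, infected * (1 + fanout))
    return infected

# With a fanout of only 2, one copy reaches 59,049 of 60,000 hosts
# in just ten rounds of replication.
for r in (1, 5, 10):
    print(r, infected_after(r, fanout=2, population=60_000))
```

The model ignores detection, patching, and network congestion (the real 1988 worm slowed itself down by re-infecting already-infected machines), but the exponential shape of the curve is the essential point: by the time an infection is noticed, most of the network may already be affected.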

When the Internet worm struck, it was immediately feared that the program might be a Trojan horse, an apparently innocent program that contained destructive code to be activated at some later time or date. Others were concerned that the worm had been placed into the net by an enemy power, either to compromise or destroy information or to disrupt services. Many system operators were so concerned that they shut down their computers and brought all usage, including their research and communications, to a stop. Some of the computers were down for most of the following week, with attendant economic consequences.

Fortunately, the Internet worm was not a Trojan horse or other type of logic weapon. It clogged up the Internet and denied the benefits of the net to many of its users, but the program was basically benign, the result of a graduate student's research that got out of hand. Even so, the economic damages resulting from the denial of service were impressive. One estimate of the damage reached $116 million, with up to six thousand computers affected. Damage at individual sites ranged from $200 to over $50,000.

Until the Internet worm attack, no one had seen such a widespread network infection. But in fact, the damages were small in comparison with what they might have been, for two reasons. First, today there are many more computers and more of them are interconnected on networks, so damage would be more widespread and its total sum higher. The growth of computers and networks since 1988 has been exponential. The damage consequent to a similar incident today would be catastrophic; the damage five years from now would be unimaginable. Second, the Internet worm was not designed to compromise or destroy information, but merely to replicate. Had it been designed to destroy data, the economic consequences of the worm attack could have reached many times the level they did.


Beyond the direct and indirect consequences of failures of information systems security in protecting confidentiality, integrity and availability, both fraud and theft are information systems security problems. Computer crime may already be costing the economy as much as $50 billion annually -- more than Hurricane Andrew each year -- and the total is growing every year.

"Phone Phreaks" are hackers who specialize in understanding and manipulating the telephone system. In addition to the danger that they will inadvertently or deliberately shut down all or part of the phone system, phreaks steal services, either by fooling the system into thinking that no charge is necessary or by having their charges appear on someone else's bill. Telephone industry losses to phreaks approached $2 billion in 1992.

Credit card fraud is another form of information systems security failure. Lost or stolen credit cards together with fraudulent use of misappropriated credit card account numbers cost credit card companies over $1 billion per year. While much of this loss might be avoided with improved information security, the credit card companies treat such losses as simply a cost of doing business. The losses are quietly passed along to their customers in the form of higher rates, and the Country's economy is the loser.

Still another form of loss occurs when computer time is stolen by unauthorized users. In 1991, computer hackers cost the United States economy more than $164 million. One hacker racked up a significant amount of stolen computer time on a supercomputer by breaking into the system and playing computer games. In another case, over half of the log-ons to an unprotected Government computer were unauthorized. Two-thirds of the same system's use time went to users who didn't have valid accounts and weren't supposed to be on the system. This means that the system was three times as large as was needed to perform the actual work for which it was intended. Assuming that the system was sized based on experience, millions of tax dollars could have been saved by buying a system properly sized for the work intended. There are also increased operational and maintenance costs, which dwarf the original investment and more than offset the cost of the computer security that would be needed to prevent unauthorized use.
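The sizing arithmetic behind that claim is straightforward: if two-thirds of a system's usage is unauthorized, only one-third serves the intended workload, so the system must be three times larger than the legitimate work requires. Worked out exactly:

```python
# The oversizing arithmetic from the Government computer example above:
# capacity consumed by intruders is capacity the agency had to buy anyway.
from fractions import Fraction

unauthorized_share = Fraction(2, 3)           # two-thirds of use time stolen
legitimate_share = 1 - unauthorized_share     # one-third does the real work
oversize_factor = 1 / legitimate_share        # system is 3x the needed size

print(oversize_factor)  # 3
```

The same calculation generalizes: a system where half the usage is unauthorized is twice the needed size, and so on, which is why unauthorized use translates directly into wasted procurement dollars.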


Information systems security costs money. The Government and private industry together spend billions of dollars for communications security, computer security, physical security of information and information processing systems, and personnel security in order to protect information assets. However, the cost to our economy of not securing our systems and protecting our information is already high and is potentially much higher. Individual instances of failure of confidentiality, integrity or availability have cost hundreds of millions of dollars, and the cumulative costs of such failures are astronomical, already equaling or exceeding the cost of Hurricane Andrew in most years. In the past, the diversity of our systems and networks has protected us, at a cost to productivity that we cannot sustain if we are to be competitive in world markets. But greater connectivity and greater interoperability mean greater vulnerability, both to accidents and to malicious attacks. In the worst cases, the entire economy could be damaged and the Country consequently put at risk.

The challenge is great. Fortunately, technology together with well-planned and executed security procedures can ensure the confidentiality, integrity and availability of our information assets. As part of the Global Village, we will then be able to safely share in the rich exchange of information needed to support competitive economic activity.