HIDDEN THREATS AND VULNERABILITIES TO INFORMATION SYSTEMS AT THE DAWN OF A NEW CENTURY

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

I find it amazing that this article was written over a year ago. Still good reading today.

By Richard F. Forno (rforno@ibm.net)

Today, terrorists recognize that our high-tech conventional forces are not coming to grips with unconventional threats. The problem is exacerbated by a Department of Defense whose addiction to and over-reliance on technology is making our military and critical infrastructures extremely vulnerable to unconventional cyber attacks. This trend is leading more and more of our adversaries (current and potential) to seek new ways to subvert and subdue American interests. With the end of the Cold War, what once constituted a threat to the United States must now be redefined.

Technology, combined with the creative genius of military thinkers around the world, is leading to the development and application of new forms of warfare, and the innovative modification of traditional military practices. While the United States and its allies are the source of much of this innovation, others are motivated by the dominant military position of the United States and its demonstrated commitment to maintaining its military lead. This basic reality is forcing many of the nation's adversaries (current and potential) to seek other means to attack American interests. Lieutenant General Patrick Hughes, USA, Director of the Defense Intelligence Agency in Washington, DC, mentioned these (and several other) items in recent Congressional testimony. From a national security perspective, he cites the following threats to American information systems:

- Trusted insiders who use their direct access to destroy or manipulate the information or communications system from within.

- Modification of equipment during transport or storage.

- Physical attack of key systems or nodes, including the insertion of modified or altered hardware.

- Network penetration to include hacking, exploitation, data manipulation, or the insertion of various forms of malicious code.

- Electronic attack of various interconnecting links, sensors that provide data to the system, or other system components.

- Empowered agents, including "sponsored" or individual hackers, cyber-terrorists, criminals, or other individuals who degrade, destroy, or otherwise corrupt the system. In the most advanced case, empowered robotic agents, embedded in the system, could be used to take autonomous (timed) actions against the host or remote systems or networks (cyber war).

Essentially, in the Age of Information Warfare, one is either a target or a victim. In other words, a target has defenses against attackers while victims are defenseless. On a national, strategic level, following General Hughes' high-level list above, there are a number of intriguing target possibilities, including:

Electronic Switching System (ESS) - the nationwide network of switches that routes virtually all telephone traffic. Consider the consequences if the nation could not communicate via the telephone or dial-up Internet access.

Global Positioning System (GPS) - US-developed constellation of satellites in medium Earth orbit that provides excellent navigational data for civilian aircraft, ships, and handheld units used by campers. Provides precise positioning and timing information to US military units and attack systems.

Internet - the communications backbone of science, industry, and society.

Commercial Operating Systems and Applications - This is an accident waiting to happen. What about commercial off-the-shelf operating systems that run major networks for large government agencies and companies? Who knows what lives inside these "untrusted binaries" in such widespread use around the world? Users and administrators must be on constant alert for the almost-weekly announcements of new vulnerabilities in these systems and be prepared to implement corrective action immediately to avoid potential threats to the integrity of their data and networks. The same can be said for financial and other business-critical applications that are used in conjunction with these untrusted operating systems.
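One defensive habit that follows from living with "untrusted binaries" is verifying that a downloaded patch or binary actually matches the digest the vendor publishes before installing it. A minimal sketch in Python (the function names are invented for the example; SHA-256 stands in for whatever digest the vendor provides):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_download(path, published_hex):
    """Return True only if the file matches the vendor's published digest."""
    return sha256_of(path) == published_hex.lower()
```

Note that the published digest must come over a separate, trusted channel; a checksum posted on the same compromised server it is meant to protect proves nothing.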

One serious vulnerability not discussed in many circles is the sad but true fact that the mission-critical systems and infrastructures (financial, power, and most business or government systems) of the United States and elsewhere are run by commercial operating systems and software applications purchased on the assumption that such products are secure as shipped from the manufacturer. Unfortunately, this is not the case, and numerous vulnerabilities have been discovered in systems that were marketed as allegedly "secure" to industry or government specifications. Why? Some software companies are more concerned with profit, market share, and putting competitors out of business than with producing a quality software product that provides reasonable levels of security and acceptable levels of risk to the user. Granted, total security is as real as the Tooth Fairy, but stronger quality assurance must be applied to the products the world now relies on.

Today, unfortunately, slipshod products are rushed to market, driven by competitors' schedules or internal marketing efforts. This effectively turns the consumer and corporate markets into expanded "beta" testers who, instead of being paid to examine a piece of software for quality, pay the manufacturer for a license to a product and stand a good chance of having to absorb the costs of securing, recovering, or restoring their systems and data after problems arising from a shoddy product nobody outside the vendor has examined! During their use or "examination" of such products, systems routinely crash, data gets lost, or other issues arise that come from implementing such untested software. While not an "external" attack on information systems like a hacker or cracker, such untested software applications are an equal threat to the sanctity of corporate data and information resources and the infrastructures relying on such products.

It is rumored that the Microsoft Windows 95 operating system installed by the consumer masses shipped with over "5,000 known bugs." Not unexpectedly, corporate and consumer clients complained about the product. The upgrade product, Windows 98, was rushed to market in late June 1998 despite an ongoing United States government court case; it fixed "about 3,500 known bugs in Windows 95" and was intended to be "a tremendous remedy to the annoyances suffered by Windows 95 users." Ironically, a service pack, euphemistically dubbed a "Multimedia Enhancement Pack," was released less than sixty days after this product hit the market not only to enhance the product but to "quietly fix" some of the bugs that were still in the product when it shipped! A quality product? Sure, if the company considers the world's consumers to be unpaid "quality assurance" or "continuing beta testers" for such software. As an aside, the auto industry recalls vehicles with defects and fixes them at no charge to the "owner" -- but the computing industry requires that its "users" fork over money to get such defects and dangers to their data fixed. To be fair, the Windows 98 product is more stable and robust than its predecessor, although it ships with several controversial features seemingly included more for product placement than end-user utility.

The installation of and subsequent reliance on such systems that have not undergone peer review or independent analysis is an accident waiting to happen. While items like UNIX (an open operating system that "runs" most of the Internet), Pretty Good Privacy (the de facto Internet encryption tool), and Navigator (the first, and some would say only, reputable Web browser) have released their source code to the world for public analysis, disclosure, and discussion, many of the world's largest operating system and applications vendors - particularly Microsoft - do not, citing "proprietary trade secrets." In cases where software has undergone worldwide peer review, user concerns and quality control issues are addressed before the product hits the open market, not after, when a considerable user base already exists and is potentially threatened by bad code. Further, users have the opportunity to see how the programming code will interact with existing applications, much like checking a medical prescription for potential drug interactions or side effects. Software that has been examined by "independent third parties" stands a better chance of being accepted as indeed "secure" and "stable" than products whose vendors announce "our product is secure -- trust us!"

An example of the user community's reluctance to sleep well and rely on untested proprietary software is found in government circles in the early 1990s, when the National Security Agency and the National Institute of Standards and Technology attempted to create a standard encryption system for the United States to replace the antiquated Data Encryption Standard (DES). "Use it," they said in official reports, "but the encryption algorithm is classified TOP SECRET and not available for independent review." The implication was "trust us, we're the government" -- but the product flopped and was declassified in mid-1998. Some would argue that the reason UNIX, PGP, and Navigator became de facto user products in the computing community was that the software was reviewed by outside experts who certified that the products, algorithms, or software code were robust, stable, and worked as advertised or intended.

While lucrative for security professionals, the increase in known vulnerabilities associated with such "proprietary" systems is disheartening. Where is product security, stability, and reliability for the "good of the customer base"? If the industry continues to develop insecure, untested programs and operating systems - and prohibits independent testing and analysis - the future for truly secure operating systems - and systems in general - is fading rapidly from reality.

In July 1998, news surfaced that the Navy's first Commercial-Off-The-Shelf ship, the Aegis cruiser USS Yorktown, had suffered a systems failure only hours after departing Norfolk. The ship's Windows NT network crashed and rendered the vessel unable to continue its mission. Why? Who knows what other applications interacted with the NT software to cause the crash. Can the Navy dissect the NT operating system to find the flaw as it could with UNIX? Not a chance. The hacker and quality assurance communities had a field day with this latest blunder, dubbing Windows NT "Needs Towing" or "Needs Tweaking."

Yet the Navy is going ahead with plans to standardize fleet information systems on this "secure, stable, and robust" operating system -- most likely due to its (and the world's) blind dependence (or addiction) on the uncustomizable, unmodifiable Microsoft line of products as the "only" game in town. It is truly mind-blowing that large organizations are moving away from an open, customizable environment (such as LINUX or Sun Java) to a closed, proprietary, and non-customizable one from Microsoft. In the physical world, if the auto industry shipped vehicles that stalled in the middle of the highway every twenty-five miles, lawsuits would abound!

One final note. The majority of commercial software and services are produced by American companies, but much of the code is written by foreign nationals employed by the software companies, working on visas in America or back in their home nations. This is a major concern to government organizations that try to monitor personnel with critical access to systems and information. How easy might it be to co-opt a programmer in India to place a few small backdoors giving the Indian government secret access to any Windows NT server? Given the poor quality assurance measures in the industry today, it seems very easy. Suppose these programmers inserted some malicious lines of code into NT as a way of "getting back" at the US after it imposed economic sanctions on their country following its rounds of nuclear testing in 1998? Not a pleasant thought, but a very real vulnerability. A good number of programmers and consultants working the Year 2000 issue are foreigners who are granted nearly unlimited, unfettered, and unmonitored access to the mission-critical systems of our largest corporations and government organizations without any criminal background checks. Need we say more? Also consider foreign-built firewall software that allegedly protects sensitive networks, both military and civilian - has anyone seen the actual code of these foreign-made "barriers to entry" for our sensitive networks? Rumors abound that some foreign-made security products are co-developed or "endorsed" by foreign intelligence services as a method of penetrating networks through hidden holes in the software. Truth or myth, one will never know. However, it is a great concern among the many organizations that have refused to install foreign-developed security software such as firewalls, encryption, or virus-scanners.
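Where source code is available for review, even a crude first-pass audit can flag the most obvious plants before deeper analysis. A toy illustration in Python (the patterns and function name are invented for this example; a real backdoor is rarely this obvious, and a real audit goes far beyond pattern matching):

```python
import re

# Patterns a naive first-pass audit might grep for; illustrative only.
SUSPICIOUS = [
    re.compile(r"backdoor", re.IGNORECASE),
    re.compile(r"password\s*=\s*[\"'][^\"']+[\"']"),  # hard-coded credential
    re.compile(r"connect\s*\(\s*[\"']\d+\.\d+\.\d+\.\d+"),  # phone-home IP
]

def audit_source(text):
    """Return (line_number, matched_text) pairs for suspicious lines."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern in SUSPICIOUS:
            match = pattern.search(line)
            if match:
                hits.append((lineno, match.group(0)))
    return hits
```

The point is not that such a scan would catch a determined intelligence service; it is that with closed, proprietary binaries even this trivial level of inspection is impossible.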

There are hidden programs, routines, and "Easter eggs" such as small flight simulators and pinball games buried inside such products by their programmers, which perhaps adds to the size, complexity, and problems of running the software. Will the Navy or the rest of the world know what evil or "treats" lie in the 40-million-plus lines of programming code that constitute Windows NT, Internet Explorer, or Word? Probably not. Will we still run the software and put up with the crashes, hiccups, and reboots associated with these products? Sure -- it's a "feature" and currently an acceptable level of risk to the world. Unless the NT server crashes and the famed Blue Screen of Death appears while targeting a Harpoon missile, that is.

Sadly, most policymakers and many CEOs are not products of the "Communications Revolution." They do not understand programming code, the critical value of information, or the inherently "virtual" way the world works, not to mention the vulnerabilities inherent with the growing reliance on information infrastructures. Everyone plans for the major military offensive through the procurement of high-profile and glitzy weapon systems, but no one is planning for the critical defense of our less visible - but equally critical - interior vulnerabilities, the "Soft Underbelly" of the country.

Currently, the majority of American infrastructures and vulnerabilities are owned and operated by the private sector, with the government "leasing" or "renting" services, bandwidth, or products. To ensure the integrity of these systems, the government's mantra professes the need for a "public-private partnership" and an information-sharing operation to defend the soft underbelly of the country. While pleasant on paper, one must face the reality of the issue -- industry does not talk openly with the government, and the government does not openly talk to itself, as evidenced by the fractured nature of its response to the threat of information and infrastructure warfare in the late 1990s.

It does not take a genius to develop tools or applications that can effectively bring down one of today's mission-critical COTS systems. Indeed, there are a multitude of free "hacker tools" and several legitimate diagnostic tools that can be used for both good and malicious purposes. In short, the greatest vulnerability of America's information resources is a lack of knowledge: a lack of knowing what is contained in the programs and operating systems that drive today's world, and a lack of education and awareness among both senior executives with purchasing authority and end-users who (since they don't know any better) assume that a system that repeatedly crashes is operating "as expected."
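The dual-use point about diagnostic tools can be made concrete: the same few lines of code serve an administrator confirming a service is up and an attacker mapping exposed services. A minimal sketch in Python (the function name is invented; host and port are whatever one chooses to probe, and probing hosts you do not own is generally unauthorized):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Attempt a TCP connection; True if something is listening on the port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A network administrator runs this against their own servers as a health check; pointed at someone else's network in a loop over ports, the identical code is a port scanner. The tool is neutral; the knowledge of what it reveals is what matters.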

The attack will come. A strong defense will be necessary.

-- y2k dave (xsdaa111@hotmail.com), December 27, 1999

Answers

Dave,

The technological overkill was aptly symbolized on a Discovery Channel show about our new state-of-the-art soldier. They had this guy with about 70 lbs. worth of computers strapped to his back, head, and face.

The first thought I had was--how is this guy going to fight if....

-- d----- (dciinc@aol.com), December 27, 1999.
