Industry Perspective — Commentary

Coding errors open thousands of new security holes each year

New tools — and product liability — are needed to curb software vulnerabilities

The United States is the most technologically advanced nation in the world, and that dependence on technology has caused many sleepless nights for officials at the Defense and Homeland Security departments and in other parts of the executive branch.

Should software companies be held accountable?

Yes. When you buy critical software, it's important that the vendors stand by the security of their products.

No. There are no guarantees in this world. No matter how much security a vendor claims, it's understood there are always going to be vulnerabilities.

Unsure. Cyberattacks and countermeasures are proliferating too rapidly to say whether companies should stand by the security of their products, even if that means defending them in court.

Base: Approximately 400 respondents familiar with military and technology matters.

Source: DefenseTech.Org and the Technolytics Institute survey.

The vulnerability of the country's critical infrastructure and the information systems that support it has been called the soft underbelly of the United States. Defense Secretary Robert Gates has said the United States is “under cyberattack virtually all the time, every day.” Leaders in the private sector, government and military have expressed concern over the nation's exposure to cyberattacks and have urged that immediate measures be taken.

Meanwhile, software is involved in almost everything. The average vehicle contains more than 70 microchips, and a cell phone holds more software than the first PCs did. It is hard to find a product that is not electronically controlled, and therefore running software code.

Consequently, programming errors in software create vulnerabilities in our systems that criminals, terrorists and the militaries of rogue nation states can exploit.

Estimates suggest that programmers around the world churn out more than 100 billion new lines of code each year for commercially available software put into operation. Meanwhile, programming errors occur at an estimated rate of 15 to 50 per 1,000 lines of code. Automated testing tools (on track to become a $50 billion industry by 2013) and manual code inspection remove the vast majority of those errors. However, numerous studies and a substantial amount of research suggest that approximately one error per 10,000 lines of production code survives testing. That equates to 10,000,000 errors in the code produced each year.

If only one in 200 of those remaining errors in the production code is exploitable and creates a security vulnerability, we add roughly 50,000 new security holes annually. A report by IBM's X-Force documented a record 7,406 new vulnerabilities in 2008. If our math and estimates are correct, that equates to about 15 percent of the estimated 50,000 security holes being discovered.
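The back-of-the-envelope arithmetic above can be checked in a few lines. All of the inputs are the article's rough estimates, not measured data, and the one-in-200 exploitable fraction is an assumption chosen to be consistent with the figure of roughly 50,000 new holes a year:

```python
# Rough estimate of exploitable errors added to production code each year.
# Every figure here is an estimate from the article, not measured data.

lines_per_year = 100_000_000_000    # ~100 billion new lines of production code
lines_per_residual_error = 10_000   # ~1 error per 10,000 lines survives testing
exploitable_one_in = 200            # assumption: 1 in 200 residual errors is exploitable

residual_errors = lines_per_year / lines_per_residual_error
security_holes = residual_errors / exploitable_one_in

print(f"Residual errors per year: {residual_errors:,.0f}")      # 10,000,000
print(f"New security holes per year: {security_holes:,.0f}")    # 50,000

disclosed_2008 = 7_406  # IBM X-Force vulnerability count for 2008
print(f"Share discovered in 2008: {disclosed_2008 / security_holes:.0%}")  # 15%
```

If the exploitable fraction were one in 20 instead, the annual total would be 500,000 holes, and the 2008 disclosures would represent only about 1.5 percent of them; the estimate is highly sensitive to that assumption.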

X-Force also said more than half of the 7,406 vulnerabilities discovered and disclosed during 2008 had no patches or fixes available from the vendor by the end of the year, and nearly 46 percent of vulnerabilities from 2006 and 44 percent from 2007 still had no patch by the end of 2008. The software industry cannot fix what it finds, much less address the other security vulnerabilities that might be present in production code.

This affects our national security. Think about the amount of commercial software used throughout the defense industry. It’s hard not to ask: What about software product liability? Should companies that produce software be held accountable for those errors?

I recently conducted an unscientific poll of semitechnical individuals from the defense and technology industries on a military cyber warfare blog I maintain. The poll was a collaborative effort between DefenseTech.org and the Technolytics Institute.

As the chart shows, of nearly 400 respondents, roughly six in 10 favored holding software companies accountable for security problems with their products. Even combining those who were unsure with those opposed, the respondents favoring accountability remain the majority.

We've looked at the errors that happen accidentally during software development. What those numbers don't reflect is malicious code deliberately inserted into software. What about software espionage? In 2007, a defense contractor with a top-secret security clearance confessed to programming malicious code into computers that track Navy submarines. Consider how much of the software used by defense contractors and the military is developed, or embedded in electronic devices and equipment, outside the United States.

Although it is impossible to quantify, the general answer is: a significant amount. Consider the BIOS, low-level software stored in firmware on a system's motherboard. That code controls how the hardware initializes, especially during the boot-up of an operating system, and current security measures provide little if any protection from exploitation at the BIOS level.

Security must be built in, not bolted on, for all products with software components. Research funding is needed to develop new tools and techniques to deal with this threat. Measures can be put in place to reduce these threats, but the remaining risk is still too high given what is at stake.

About the Author

Kevin Coleman is a senior fellow with the Technolytics Institute, former chief strategist at Netscape, and an adviser on cyber warfare and security. He is also the author of "Cyber Commander's Handbook." He can be reached by e-mail at: kgcoleman@technolytics.com.

Reader Comments

Wed, Oct 7, 2009 Mark Kagan Washington, DC

I wrote a case study earlier this year about the Air Force Application Security Assurance Center of Excellence (ASACoE), which comes under the Electronic Systems Center (ESC) -- so I can say that at least part of the Air Force is very aware of the danger and IS doing something about it. However, the ASACoE's activities do not yet extend to the entire Air Force; it still has to deal with a great deal of legacy code as well as new applications; and it doesn't cover all types of coding. The biggest lesson I took away from my interviews and research is that there absolutely must be high-level buy-in that makes secure code a high priority (and in the case of vendors, a requirement before buying) and provides the necessary funding. The reality is that this will probably not happen until major damaging hacks occur that directly affect the powers that be or seriously damage critical missions. That was what happened at the ESC back in 2005. Samuel Johnson said that the knowledge that one is going to be hanged in a fortnight concentrates the mind wonderfully. In this case, it requires one to already be suspended at the end of the rope.
