The wetware problem: Humans still the weakest link in network security
People create a variety of security issues for defense networks
- By Henry Kenyon
- May 02, 2011
Humans are an important part of computer networks. There is hardware and software, and then there are the biological users of those systems: wetware.
Wetware is also one of the vulnerability points in a network that are most often overlooked by designers and network administrators, Robert Axelrod, Walgreen professor for the study of human understanding at the University of Michigan, said May 2 at the Department of Defense Intelligence Information Systems conference in Detroit.
Wetware’s many vulnerabilities include unauthorized system access, falsifying reports, using a guessable password, intoxication and incompetence. These issues can be countered with proper recruitment, training, incentives and supervision, but there is more to security than those issues, Axelrod said.
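Of the wetware vulnerabilities listed above, a guessable password is the one most easily screened for automatically. A minimal sketch of such a screen follows; the deny list and the length rule here are illustrative assumptions, not any actual DOD policy:

```python
# Minimal guessable-password screen. The deny list and the 8-character
# minimum are illustrative choices, not an official standard.
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty", "admin"}

def is_guessable(password: str) -> bool:
    """Flag passwords that are too short or appear on a common-password list."""
    return len(password) < 8 or password.lower() in COMMON_PASSWORDS

print(is_guessable("letmein"))      # short and on the list
print(is_guessable("Tr0ub4dor&3"))  # longer and not on the list
```

Real systems would check against much larger breached-password lists, but the principle is the same: remove the most predictable choices before a human can make them.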
He described a “Swiss cheese” network security model in which overlapping layers of network defenses provide security, but only if the support staff takes each layer seriously. Many employees assume their own layer is not the critical one. “We have to take seriously that our own layer might be the last,” he said.
Another security issue is professional loyalty between peers. Axelrod said that, although this is a commendable and important thing, it can manifest itself in situations where individuals are more loyal to each other than to the organization they serve. For example, 109 cadets at West Point in 1965 were expelled for cheating. This was in spite of the academy’s honor code, which stressed turning cheaters in to the authorities.
This was an example of horizontal loyalty: the cadets were more loyal to each other than to the academy, and although they clearly knew about the cheating, they did nothing, Axelrod said. But cheating is not restricted to West Point. All of the other service academies have had regular bouts of cheating for decades, he said.
Sometimes, organizations avoid or overlook security issues because of embarrassment. One example was the Army’s online testing system. The service did not investigate reports of online cheating for eight years, until the Boston Globe ran an article citing 40,000 cases. According to the Globe, many internal complaints were turned away. “Much of the reason for this was that the people running the tests didn’t want to embarrass the organization,” he said.
Security functions can also be seen as inconvenient and are ignored or shut off. In the Deepwater Horizon disaster, an alarm system sounded so often at night that the crew turned it off. Organizations must have processes in place that make it easier to report incidents. Likewise, designers must create alarm systems accurate enough to sound during a genuine security breach but not so sensitive that they generate constant false positives.
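The tradeoff behind the Deepwater Horizon example can be sketched as a simple detection-threshold exercise. The anomaly scores below are made-up numbers for illustration, not data from any real system:

```python
# Illustrative only: a detector flags events whose anomaly score exceeds
# a threshold. Scores are invented to show the tradeoff, not measured.
benign_scores = [0.1, 0.2, 0.3, 0.35, 0.4, 0.55, 0.6]  # normal activity
attack_scores = [0.5, 0.7, 0.8, 0.9]                    # genuine incidents

def rates(threshold):
    """Return (false_positive_rate, detection_rate) at a given threshold."""
    fp = sum(s >= threshold for s in benign_scores) / len(benign_scores)
    tp = sum(s >= threshold for s in attack_scores) / len(attack_scores)
    return fp, tp

# A low threshold catches every attack but floods operators with false
# alarms (which get the system switched off); a high threshold is quiet
# but misses real incidents.
for t in (0.3, 0.5, 0.75):
    fp, tp = rates(t)
    print(f"threshold={t}: false-positive rate={fp:.2f}, detection rate={tp:.2f}")
```

In this toy data, a threshold of 0.3 detects every attack but flags most benign activity, while 0.75 flags nothing benign but misses half the attacks. The operational point Axelrod made is that the first regime trains humans to ignore the alarm entirely.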
Surprise is another issue. Known software vulnerabilities can be patched and prevented once they are observed, so nations will save little-known exploitable vulnerabilities for use in high-stakes events such as a war, Axelrod said.
Once saved, these capabilities can lead to great uncertainty about a nation’s offensive and defensive capabilities. “This is a very dangerous world. We can’t count the number of nuclear missiles and we don’t know their capabilities,” he said, describing how nations view each other’s offensive cyberspace capabilities.
How a nation conducts its attacks also serves as a pattern for analysis. Axelrod cited China’s military operations as an example. In every war China has participated in, from Korea to India and Vietnam, the pattern has been consistent: small actions by its troops designed to serve as a warning, followed by massive aggression.
To avoid human-based network problems, organizations must teach the Swiss cheese network security approach and reconcile no-fault incident reporting with accountability. Organizations also need to develop systems designed to encourage the reporting and fixing of problems, Axelrod said.
Henry Kenyon is a contributing writer for Defense Systems.