Inside the mind of a Designated Approving Authority
- By Henry Sienkiewicz
- Oct 10, 2013
As a former DOD official, I was frequently asked, “What does a DAA think about?” It’s a great question given the current focus on cybersecurity and the perceived difficulty of moving from development and testing to an operational environment.
As one who served as a DAA, but came from an operational and software development background, my response was often, “A lot.”
I can sum up what most DAAs think about in three words: context, transparency and risk. DAAs look at the technology context, program transparency and the overall risk to the environment.
First and foremost, context is fundamental.
DAAs are in the unique position of having to make hard decisions about their environment. As the final check before production, DAAs have a fascinating and tough job. What they decide has an immediate impact on their organizations.
While not formally part of the operational security chain, DAAs take operational security and, more importantly, operational needs into account. They acknowledge the increasing cyber threats, changing IT consumption patterns and increasing budgetary pressures. They also recognize there is seldom a way to completely eliminate risk. The only risk-free information technologies are completely isolated ones.
In an “Internet of Everything” world, most government IT is found within a highly integrated and interconnected grouping of networks, systems and applications. DAAs obsess about the interrelated conditions in which their systems exist. Within this ecosystem, there’s always the potential for gaps that allow compromise.
DAAs not only think about information security. They must make authoritative decisions about information security. They and their organizations must live with the results.
I, like other cybersecurity professionals, understand the ecosystem through a series of “cyber planes,” primarily technology, components and information. The technology plane includes leading-edge research and capabilities that provide the nation with a decisive edge. Attention focuses on protecting secrets.
Components include mission-critical hardware and software, generally sourced through a commercial supply chain. In this case, the emphasis is on keeping bad things out.
The third leg of this triad is information. In this instance, the focus is on ensuring that organizations are able to keep critical information from getting out.
In a hyper-connected world, these three planes are overlaid by three other concepts: how systems are constructed, how they are delivered and where they are used. Systems provenance means that organizations know who built what. Where systems come from has become more complicated in the world of software as a service, data center consolidation and cloud computing.
Finally, it matters where information is delivered. There are different cybersecurity needs for warriors in a deployed environment than for teams working on a budget submission.
DAAs also think about the gaps that allow for compromise across these major planes and concepts. They must visualize potential attacks against the enterprise by building models that use the results from vulnerability scanners, asset management, firewall rules and other data sets. The result is a visual understanding of the environment, one that maps concept and context.
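To make that idea concrete, here is a minimal sketch of how such a model might correlate data sets. It is purely illustrative, not the tooling the author describes: the asset inventory, vulnerability findings and firewall rules are invented sample data, and real environments would draw them from scanners and asset management systems.

```python
# Hypothetical sketch: correlating asset, vulnerability and firewall data
# into a simple exposure model. All data below is invented for illustration.

# Asset inventory: host -> network zone
assets = {"web01": "dmz", "db01": "internal"}

# Vulnerability scan results: host -> open findings
vulns = {"web01": ["CVE-2013-0001"], "db01": []}

# Firewall allow rules as (source zone, destination zone) pairs
firewall_rules = [("internet", "dmz"), ("dmz", "internal")]

def reachable_zones(start, rules):
    """Return the set of zones reachable from `start` via allow rules."""
    seen, frontier = set(), [start]
    while frontier:
        zone = frontier.pop()
        for src, dst in rules:
            if src == zone and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

def exposed_vulnerable_assets(assets, vulns, rules):
    """Hosts with open findings sitting in zones reachable from the internet."""
    zones = reachable_zones("internet", rules)
    return [host for host, zone in assets.items()
            if zone in zones and vulns.get(host)]

print(exposed_vulnerable_assets(assets, vulns, firewall_rules))  # ['web01']
```

Even a toy model like this shows the DAA's concern: db01 has no open findings, but the rule chain internet → dmz → internal means a compromised web01 could still become a path to it.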
One gap I see in contextual understanding is that while numerous tools exist for locating digital information, there are few applications available for making sense of available information. What is lacking are cognitive tools that allow an organization to visualize its digital content and context in nonlinear ways, thereby exposing potential seams within the operational environment.
Henry Sienkiewicz (@hjsienkiewicz) is COO of Network Design Implementation (www.ndiinc.com). He previously served for two years as CIO at the Defense Information Systems Agency and three years as DISA’s Designated Approving Authority.