Joint Staff achieves intelligence superiority through collaboration
Web tools provide swift and precise situational awareness
Army Brig. Gen. Michael Shields is director of the National Joint Operations Intelligence Center at the Joint Chiefs of Staff, a position he has held since March 2009. NJOIC is a crisis response organization that supports the Joint Chiefs and the defense secretary with situational awareness and global crisis monitoring. That monitoring extends from piracy at sea and flu pandemics to disaster response, such as the oil spill in the Gulf of Mexico, and humanitarian assistance, such as the earthquake in Haiti. He recently spoke with Defense Systems contributing editor Barry Rosenberg about how the Joint Staff is using Web 2.0 technologies for crisis response while planning for Web 3.0 in the near future.
DS: You’ve talked about your organization and the Joint Staff in general moving to a flatter, faster and more collaborative environment based on portals, wikis, blogs and chat rooms, as opposed to traditional e-mail communications. What can you tell us about those Web 2.0 efforts?
Shields: What we want to do is move toward a more collaborative and discoverable environment. You’re familiar with the term “crowdsourcing,” where we’re collectively more intelligent than we are individually. That is a big culture move for us on the Joint Staff. We’re largely an e-mail-based organization, so moving from e-mail to Web 2.0 technologies was a cultural shift for us. The way to get flatter, faster and more discoverable is to share information. It is about going from a culture of need-to-know to a culture of need-to-share. That’s what we’ve been working hard on this past year.
If you’re familiar with the Reed and Metcalfe laws, then you know that two telephones talking to each other is OK, but 1,000 are better in terms of sharing information and bringing in what we call “the edge.” I am not talking about Web 2.0 purely as open-source social networking, which everyone is familiar with. That is certainly happening. The Joint Chiefs of Staff chairman has his own portal and is active on Facebook, Twitter, podcasts and YouTube. We're using those types of capabilities in multiple domains, including the Unclassified [but Sensitive IP Router Network] and the Secret IP Router Network, to include video teleconferencing, Web conferencing and chat. The key is determining information and workflow requirements and then deciding whether and how to collaborate for each type of information: what's best handled face-to-face or by video teleconference, e-mail, chat, microblog, wiki or portal.
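The contrast the general is drawing between two telephones and 1,000 can be sketched numerically. This is an illustrative comparison of the laws he names, not anything from the interview; the "value" figures are notional counts of connections and groups, and the function names are ours.

```python
def metcalfe_value(n: int) -> int:
    """Metcalfe's law: a network's value scales with the number of
    possible pairwise connections, n * (n - 1) / 2, i.e. roughly n**2."""
    return n * (n - 1) // 2

def reed_value(n: int) -> int:
    """Reed's law: value scales with the number of possible subgroups
    of two or more members that can form, 2**n - n - 1."""
    return 2 ** n - n - 1

# Two telephones allow exactly one conversation; 1,000 allow ~500,000
# pairwise links, and the number of possible collaborating groups is
# astronomically larger still.
print(metcalfe_value(2))     # 1
print(metcalfe_value(1000))  # 499500
print(reed_value(10))        # 1013 possible groups among just 10 nodes
```

The gap between the quadratic and exponential curves is the point: group-forming networks (wikis, chat rooms, portals) grow in value far faster than point-to-point channels like e-mail.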
I’m talking about us leveraging these tools for situational awareness at a precision and speed greater than our adversaries. And that is what we’re about.
DS: Is the sharing of information a technology issue or social issue?
Shields: Great question. One of our charters is to get out and look at best practices, and in my opinion, it is less about the information technology. The technology exists now. Certainly, it’s rapidly advancing. You’re familiar with Moore’s law, which addresses the accelerating rate of technological change. Think of information and IT as a weapons system and the network as our central nervous system. How do we adjust our IT acquisition system to keep up with the rate of change? How do you get a rapid insertion of technology when a game-changing capability comes out? That is important.
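The acquisition-tempo problem the general describes follows from the arithmetic of Moore's law. A minimal sketch, with the commonly quoted two-year doubling period as an assumption (estimates range from 18 to 24 months):

```python
def capacity_growth(initial: float, years: float,
                    doubling_period: float = 2.0) -> float:
    """Exponential growth per Moore's law: capacity doubles every
    `doubling_period` years (the 2.0 default is an assumption)."""
    return initial * 2 ** (years / doubling_period)

# Over a typical decade-long acquisition cycle, available computing
# capacity grows roughly 32-fold, which is why a system specified at
# program start can be generations behind at fielding.
print(capacity_growth(1.0, 10.0))  # 32.0
```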
It is less about IT and very much about culture. It implies empowering subordinates with authority to share information and make decisions at the lowest possible level. That's uncomfortable for hierarchical and centralized organizations where knowledge is power.
To do that, though, we need to get better at visualizing large volumes of data in useful ways, and that is something we’re working on here. How do you take the volume of structured and unstructured data and then visualize it in useful ways, in ways that human minds are wired to absorb? When you talk about collective intelligence, you talk about what humans synthesize well and what machines synthesize well. How do you increase that understanding when you put the two together? You’re talking about human factors and the engineering of human/machine interfaces to improve that piece.
DS: One of the world’s most recent crises was the earthquake in Haiti. How did some of the collaborative issues we’ve been discussing play out there?
Shields: We operate in multiple domains, and with Haiti, we moved out of the secret domain in which we predominantly operate into the unclassified domain. For example, Southern Command stood up the All Partners Access Network Web site, which consolidates some of the best social networking applications into one platform. They also employed a capability called the 3D Google Earth User Defined Operational Picture, which lets non-DOD users, academia and people on the street in Haiti import pictures from their smartphones and share other geospatial information to create a user-defined operational picture accessible through the Web. So we’ve made some great strides in the unclassified domain in creating user-generated content and sharing it in an open, collaborative way. Haiti moved us rapidly into that area, and you could say the same is happening for some of the other events that are going on as well.
DS: Collecting all that unclassified data and information must make the job of tagging and retrieving data all the more complicated when added to the classified data your sensors are already collecting. Is that right?
Shields: I agree. The phrase some analysts have used is “drowning in data.” Rather than get into the number of sensors, where we need to move forward is in the area of sense-making. You’re talking about Web 3.0, the Semantic Web, and collective intelligence. You’ve got these vast amounts of structured and unstructured data, including video, imagery and so on. How do you rapidly make sense of that information? And how do we leverage machine intelligence to help us? That gets into where we need to go in the future.
You saw the New York Times article on PowerPoint. We have to migrate away from that. We need to get into a dynamic information-sharing environment where we’re updating display technologies with real-time information, as opposed to using PowerPoint. One challenge with PowerPoint is that it does not support co-editing. We need to get away from sequential editing of data and get to simultaneous co-editing with dynamic, real-time updating.
So we need to improve knowledge visualization and our capability to co-edit and create information in a collaborative, real-time environment, using networks that can provide fast, secure, resilient communications from multiple distributed locations, whether mobile or fixed. Emerging technologies in areas such as virtual environments, holography, 3-D displays, and the integration of artificial intelligence and the Semantic Web [Web 3.0] are exciting to consider. These are all challenging endeavors, and we're a work in progress.