Privacy is one of those squishy issues that government has typically not been very good with. It usually gets conflated with data security, but in fact it’s a very different animal. Security is about protecting data from prying eyes, whereas privacy is about protecting people from the effects of information being seen by those prying eyes. And what is privacy to one government bureaucrat is just an annoyance to another. As I said, squishy.
A recent New York Times blog post points to the silence of the Obama Administration on privacy, at least so far, compared to the very loud pronouncements it’s made on such things as cybersecurity. One of the theories proffered for this is the dichotomy the administration faces between the need for privacy on the one hand and, on the other, the increasingly privacy-deficient world of Web 2.0, which the administration champions.
Interesting, therefore, to see that the boffins at DARPA have come up with a set of privacy principles that, at least at first glance, seem to give some hope that privacy could become a part of fundamental technology R&D. In other words, privacy would no longer be an afterthought of technology, or something to be layered on top, but the technology itself would be defined by the potential impact it will have on privacy.
As DARPA itself points out, the administration’s recent National Security Strategy lays out some of the criteria for privacy protection, though really it mentions privacy only in passing as part of its rhetoric on cybersecurity, without defining what privacy is or how it should be protected.
Which is why DARPA’s approach seems potentially so important, if the agency can really turn its privacy principles into something concrete. It’s going to get the National Academy of Sciences to conduct a study on the ethical implications of technology advances, create an independent privacy review panel to tell it what effect its bleeding-edge R&D is likely to have on privacy, and work with the National Science Foundation to assess the dangers that science and technology development poses to “personally identifiable information.”
That could all turn out to be as squishy as the original issue, of course. The trick is in how the principles are implemented in DARPA’s work. If it’s successful, that could turn out to be as high-payoff a development as any of the actual technology the agency produces.
Posted on Aug 10, 2010 at 9:28 PM
Most of the reaction to the dumping of classified Afghan war documents at the WikiLeaks whistleblower Web site has so far focused on the broader damage the release might cause to the U.S. in that fight. But might the affair also impact the future ability of frontline troops to do their job?
A story published in MIT’s Technology Review implies that it could. The story said the WikiLeaks data dump was made possible by recent efforts in the military to deliver the freshest possible intelligence to frontline fighters. A probable restriction on the distribution of that material in the future could throttle the flow of potentially lifesaving information to those soldiers.
Technology Review links this to a clampdown on information flowing across DOD’s Secret Internet Protocol Router Network (SIPRNet), which carries secret information and is presumably where Bradley Manning, the Army private and intelligence analyst who has been charged with an earlier release of documents to WikiLeaks and could be implicated in the latest Afghan affair, got much of that data.
The problem for the military is not only that up-to-date intelligence is becoming more vital for warfighters, particularly in the urban environments they now fight in, but that it’s also central to the way the military intends to use technology to fight wars in the future. Army Chief Information Officer Lt. Gen. Jeffrey Sorenson says the Army is “net-dependent” in carrying out its mission, and Air Force CIO Lt. Gen. William Lord says the network is crucial to making up for the reduced size of future armed forces.
The recent WikiLeaks debacle poses a big problem for DOD. The point of the network is to get good information to the warfighter as quickly as possible, while keeping the data as secure as possible. You could keep it secure by strictly limiting access to data, but what does that do to its availability?
Another Technology Review piece suggests that the bird has already flown when it comes to putting a lid on leaks. Maybe the focus should be more on tracking down leakers and prosecuting them, leaving the data to flow to where it will do the most good. Maybe we also need to rethink what secret and secure really mean these days.
Posted on Aug 06, 2010 at 9:28 PM
I don’t know about you, but at our house everything goes dark at bedtime. I’m pretty good about turning the lights off, but my wife pulls the plug on everything that can connect to the power grid so that we can save every last possible cent before it dribbles away into the power company’s coffers.
Power IT Down Day is an attempt by a group of companies to convince the government IT universe to do the same. Despite all the push for green IT, apparently agencies are still pretty wasteful with their power consumption.
There have been mandates for agencies to reduce their energy use, there’s a goal to cut government energy use and emissions by 28 percent by 2020, and the Obama Administration earlier this year launched its GreenGov Challenge. Given that the federal government accounts for around 1.5 percent of total U.S. annual energy consumption and is the country’s largest single energy user, just a little savings could add up to a lot of green.
The trouble is getting this message down into the trenches, or in this case down to the regular government employee at his or her desk or workplace. It doesn’t matter what the muckety-mucks in the White House or on the upper floors at agency headquarters dream up: If those employees aren’t convinced to turn their computers, printers or other devices off when they leave at night, then it’s just talk.
Most of the efforts around green IT have so far centered on the data center and other large energy users. But the largest user population, those everyday government grunts, probably has the greatest effect on energy consumption yet hasn’t been the focus of much at all. Hence, Power IT Down Day.
This is the third of the annual Power IT Down Days, which this year falls on Aug. 27. Last year, participation doubled from the initial event to 5,600 people, for a total energy savings of around 73,000 kWh. The Wounded Warrior Project got $45,000 as a result.
This year, the event’s sponsors -- Citrix, Intel, Hewlett-Packard and GTSI -- have a goal of getting 6,100 people to participate and, given that Aug. 27 is the first day of a weekend, of saving a total of 335,000 kWh of energy. That’s around $45,000 in hard-dollar savings. And the Wounded Warrior Project gets another donation.
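For what it’s worth, the sponsors’ figures pencil out. Here’s a quick back-of-envelope check (a sketch only; the inputs are the numbers quoted above, and the derived per-person and per-kWh rates are my own arithmetic, not anything official):

```python
# Back-of-envelope check of the Power IT Down Day numbers quoted above.
# Inputs come from the post; the derived rates are not official figures.

last_year_participants = 5600
last_year_kwh = 73000          # a single day's event

goal_participants = 6100
goal_kwh = 335000              # covers the whole weekend
goal_savings_usd = 45000

kwh_per_person_last_year = last_year_kwh / last_year_participants
kwh_per_person_goal = goal_kwh / goal_participants
implied_usd_per_kwh = goal_savings_usd / goal_kwh

print("Last year: %.1f kWh per participant" % kwh_per_person_last_year)
print("This year's goal: %.1f kWh per participant" % kwh_per_person_goal)
print("Implied rate: $%.3f per kWh" % implied_usd_per_kwh)
```

The jump from roughly 13 kWh to roughly 55 kWh per person only makes sense because the equipment stays off for the whole weekend rather than a single day, and the implied rate of about 13 cents per kWh is in the right neighborhood for commercial electricity.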
In keeping with the sponsors’ corporate-citizenship goals, the intent is positive: get people to sign up at the Power IT Down website to participate and, more broadly, show government the ROI of better managing this everyday power drain.
Here’s a more devious idea. On Aug. 27, get someone to drive around town with some kind of luminosity meter. I’m sure some bright spark can invent that in time. See which agency comes out as the best saver, and which the worst. The winner gets a shiny gold star it can put onto its Web home page proclaiming it the government energy champion, while the loser has to pay the winner’s energy bill for that weekend.
Who do you think will be the winner and the loser? Leave us a comment.
Posted on Jul 30, 2010 at 9:28 PM
The Commission on Cybersecurity for the 44th Presidency has published its findings on the “Human Capital Crisis in Cybersecurity” and, as earlier reports suggested, it could be the spark for a wholesale change in the way the entire government IT work force is trained and certified.
Long term, if the commission’s recommendations are accepted, the professional bona fides of those who work in software development and network operations, as well as in traditional security areas such as intrusion detection and forensics, would be decided by an independent Board of Information Security Examiners. These areas are also critical to cybersecurity, the commission believes.
The commission identified a total of nine key roles in cybersecurity, many of which, as with the above, don’t usually fall under the cybersecurity umbrella, including such things as systems administrator and even technical writer. “At least for the moment,” the commission said, it’s not including “executive and leadership roles or specialized functions unique to national security, intelligence or law enforcement.”
If you read through the commission’s report, however, it wouldn’t be surprising to eventually find just about any job that touches on IT, and therefore cybersecurity, included in this list.
The push for certification of cybersecurity professionals, and along with it the definition of just who fits that bill, will be controversial, given that many people already involved in cybersecurity don’t have any formal qualifications. The commission tackles that by comparing the current state of cybersecurity to the practice of medicine in the 19th century. Likewise, it said, the cybersecurity field has “lots of often self-taught practitioners only some of whom know what they are doing.”
It goes on to say:
“What has evolved in medicine over the last century is a system that recognizes that different kinds of skills and specialties are required. And, since most of us are not able to access the qualifications of a practitioner when a need arises, we now have an education system with accreditation standards and professional certifications by specialty. We can afford no less in the world of cyber.”
Those will be fighting words to some, and there’s a widespread dislike of the idea that the government could take a lead on deciding who is and who is not a cyber professional. But given the urgency that’s building around cybersecurity and the lack of people to fill essential roles, the commission’s recommendations will likely get a sympathetic hearing.
Posted on Jul 26, 2010 at 9:28 PM
You know something has reached a certain maturity (or at least notoriety) when sober academics conduct research on it, and it seems that Twitter has reached that point. But you may be surprised by the results.
Researchers at the University of Toronto took a look at how those Twittering fiends in the House of Representatives use the microblogging service, and concluded that it’s different depending on whether a Democrat or a Republican is at the computer keys. Democrats have transparency motives, whereas Republicans use Twitter for outreach.
Apparently the number of bills in play at any one time influenced the adoption of Twitter, especially by Republicans. The perceived benefit of using Twitter for outreach is directly related to its potential for influencing political rivals who are also on Twitter, the researchers found.
They say that “the rate of adoption is higher if a representative has sponsored a large number of bills and belongs to committees with a large proportion of Democratic Twitter adopters. The benefit associated with outreach is substantial if Twitter can be used to garner public support for certain policies, which in turn generates support from political rivals.”
No mention was made in the research about the potential influence of the current election cycle on each party’s use. One curious finding, however, was that those members who belong to a large number of committees are less likely to adopt Twitter, as are committee chairmen.
They also mention that much of the initial Twitter adoption occurred around January 2009, when new staffers started work for representatives. That could bias things, the researchers suggest, since staffers likely assist in both the initiation of bills and activity on Twitter. Unsaid is the influence these mainly young, and presumably more tech-savvy, staffers had on their bosses’ adoption of Twitter.
However, don’t go too wild about the influence of Twitter and all things social media, at least not just yet. A study by the American Customer Satisfaction Index says that many consumers don’t find social media all that satisfying. In fact Facebook, which is far more widely used than Twitter, scored lower in user satisfaction than even IRS e-filing.
What do users find the most satisfying online experience? Cable news sites such as Fox News, MSNBC.com and CNN.com, as well as those of major news outlets such as the New York Times and USA Today. That doesn’t mean eyeballs are veering away from social media, though, and since politicos always go where the numbers are, Twitter and its social brethren are probably safe.
Posted on Jul 21, 2010 at 9:28 PM
NASA has been one of the leading cloud computing lights in the federal government with its homegrown Nebula technology; now it’s making a play for the same reputation in open source cloud circles through an association with a new industry initiative called OpenStack.
Rackspace, which claims a four-year tenure as a cloud hosting company, is the main industry player, but others include Citrix and Dell. OpenStack founders hope more companies will join, including some of the current big players in the market.
The idea, apparently, is to use OpenStack to drive both public and private cloud computing in the same way that other open source endeavors such as Linux, Apache and MySQL have influenced their segments of the IT business. Arguably, computing and applications development would not be as far along as they are without those projects keeping proprietary incumbents -- e.g., Microsoft and Windows -- on their toes.
Nebula will be the cornerstone for one of OpenStack’s first projects, which is to provide code for provisioning and managing large-scale deployments of compute instances. Anyone will be able to download that code and start developing their own cloud deployments.
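To make “provisioning and managing compute instances” a little more concrete: at its core, such code has to decide where each requested instance lands. Here’s a purely illustrative sketch (not actual Nebula or OpenStack code; the names and the first-fit policy are my own) of that placement decision:

```python
# Illustrative sketch of the core placement decision behind provisioning
# compute instances: for each request, find a host with enough free capacity.
# Not OpenStack code; names and the first-fit policy are hypothetical.

def place_instances(hosts, requests):
    """hosts: {host_name: free_vcpus}; requests: list of (instance_id, vcpus).
    Returns {instance_id: host_name} for every instance that fits."""
    free = dict(hosts)  # work on a copy so the caller's view is unchanged
    placement = {}
    for instance_id, vcpus in requests:
        # First-fit: take the first host that still has room.
        for host, capacity in free.items():
            if capacity >= vcpus:
                free[host] = capacity - vcpus
                placement[instance_id] = host
                break
    return placement

hosts = {"node1": 8, "node2": 4}
requests = [("vm-a", 4), ("vm-b", 4), ("vm-c", 4), ("vm-d", 4)]
print(place_instances(hosts, requests))
# vm-d finds no host with 4 free vCPUs left, so it goes unplaced
```

Real schedulers weigh far more than free CPU (memory, network locality, affinity rules and so on), but the shape of the problem is the same: match requests to capacity, and track what’s left.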
It will be interesting to see just how this influences the development of the cloud industry. Recent polls seem to confirm the trend of IT executives looking to private clouds as their major interest. That certainly seems to be the direction in which government agencies are going, given concerns over security in public clouds.
However, whether private clouds are actually clouds or just traditional data centers with hosted applications jazzed up under a different name is open to debate. According to the naysayers, real (meaning public) clouds offer far more return than the private version.
NASA has been one of the few federal agencies to embrace the notion of the public cloud, albeit in a hybrid version with hooks from its private Nebula cloud to public clouds such as those offered by Amazon. Now that it’s thrown its hat, or at least its technology, into the open source arena, it will be interesting to see what that will mean for government cloud development overall.
Posted on Jul 19, 2010 at 9:28 PM