Quick Study

By Brian Robinson


No cyberwar? Say it ain't so, Howie!

The new White House cybersecurity czar doesn’t believe in cyberwar, preferring to see the battle over security as more of a crime-fighting and anti-espionage effort. Can’t you just hear the turf war bells ringing?

Howard Schmidt told Wired magazine at this week’s RSA conference that he considered cyberwar “a terrible metaphor” and a “terrible concept.” There are no winners in that environment, he said.

Fair enough. There are probably plenty of people who would agree with him, but it’s hard to stop a juggernaut once it’s underway. There were also plenty of people who thought the Sept. 11, 2001, terrorist attacks should have been prosecuted as criminal offenses, and look how far they got.

If Schmidt really believes what he said – and keep your eyes open for an explanatory statement or an ‘I was misquoted’ quote – then he’s going to run into a lot of flak in Washington, because a great deal of power and influence in town now rides on the prospect of future cyberwars.

Former Director of National Intelligence Michael McConnell recently told a Senate hearing that the danger of cyberwar is comparable to the threat of nuclear war with the former Soviet Union, and that the United States would lose if it waged a cyberwar today. How’s that for a setup?

But there are other signs that the argument for cyberwar is well advanced. The Navy is talking about advancing cyber defense, and in the military, when you talk about defense, there’s an understanding that it will also involve offense. The Defense Department isn’t establishing a Cyber Command as a passive organization.

Schmidt has been around Washington before, and he certainly isn’t dumb. If he really believes there is no cyberwar, then he’s also got to know he’s tilting at a lot of already entrenched interests. And, in Washington, there’s no bigger invitation to trouble.

Posted on Mar 05, 2010 at 9:03 AM | 2 comments


Getting out in front of the burgeoning data deluge

Digital preservation is one of those issues that everyone thinks is obvious, yet one that no one really talks about, much less offers any solutions for.

Everyone will have yet another chance to confront the issue on April 1, when the Blue Ribbon Task Force on Sustainable Digital Preservation and Access plans to hold a symposium in Washington on “sustainable digital preservation practices.” It will include a wide range of organizations whose existence depends on digital preservation, such as Google, as well as representatives from the publishing and movie industries. There surely will be plenty of government involvement as well.

This isn’t a new subject, but it’s one that rarely makes the headlines. Data protection is all the rage, and producing data comes in a close second, but who ever talks about preserving it? It’s taken as a given. Every now and then, stories about the work of the National Archives or the Library of Congress are published, but they never seem to make it onto the most-popular or most-read lists.

But think about it: We are now well into the exabyte-per-year era of data (an exabyte is 1 billion gigabytes), and reports from outfits such as International Data Corp. predict a doubling of the size of the digital universe every 18 months. Given the explosion in social media, and the coming one in online video, I’d say that’s conservative.
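
For a sense of what that doubling rate implies, here’s a back-of-the-envelope sketch in Python. The starting volume is a made-up figure for illustration, not a number from IDC’s reports:

# Rough projection of the digital universe, assuming a doubling
# every 18 months (IDC's estimate). The starting size is hypothetical,
# chosen only to illustrate the growth rate.
start_exabytes = 1000          # illustrative, not a reported figure
doubling_months = 18

for years in (1, 3, 5, 10):
    growth = 2 ** (years * 12 / doubling_months)
    print(f"After {years:2d} years: {start_exabytes * growth:,.0f} EB")

An 18-month doubling works out to about 1.6 times per year, or roughly a tenfold increase every five years. Whatever you build to store and index today’s data will be dwarfed within a decade.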

How do you store all of that data, let alone find ways to manage it so you can retrieve it and make use of it? Government mandates on retaining data aren’t going away, after all.

The blue-ribbon panel recently came out with a report that examines the economics of both preserving data and making sure it can be accessed.

In an earlier article, task force member Dr. Fran Berman, director of the San Diego Supercomputer Center (SDSC) at the University of California-San Diego, talked about what’s needed to meet the data-cyberinfrastructure challenge.

Good stuff. I’m sure that and lots more will be discussed at the April 1 symposium. But will anybody be listening?

Posted on Mar 03, 2010 at 9:03 AM | 0 comments


A heavier government hand may shape Internet policy

Ever since the advent of the Internet in the early 1990s, the government’s philosophy has been to step back and let innovators and entrepreneurs drive the growth of the information superhighway (remember that?). But the growing complexities of the online universe are forcing it to contemplate a more activist role.

At least that’s the argument that Lawrence Strickling, the assistant secretary for communications and information at the Commerce Department, made recently as the windup to what he’s calling the age of Internet Policy 3.0.

Stages 1.0 and 2.0 took the Internet through commercialization and growth, leading to the “social innovation” that’s the current big driver of the Web and the Internet. Now it’s “time to respond to all the social changes being driven by the growth of the Internet,” according to Strickling, and thus the need for Internet Policy 3.0.

As Strickling sees it, everyone has much higher expectations of the Internet today than previously, as it has become the central nervous system of modern society:

“It is important not only to preserve but to enhance access to this open and dynamic medium that fosters unprecedented innovation and public participation. … [T]he Internet is not a natural park or wilderness area that should be left to nature.

“It’s more accurate to describe the Internet as an agglomeration of human actors — a large and growing social organization. There are no natural laws to guide it, and there is most certainly no self-regulating equilibrium point, because this cacophony of human actors participating in this organization demands that there be rules or laws created to protect our interests,” Strickling writes.

The National Telecommunications and Information Administration (NTIA), which Strickling heads, has a slew of policy initiatives it will be tackling this year, ranging from privacy to cybersecurity to governance of the Internet itself. Strickling went so far as to define a new focus for his agency – the “I” in NTIA now stands for Internet/information policy.

He also said in a throwaway line that if all of this is successful, perhaps NTIA could become the “National Trust the Internet Administration.”

That would be a coup, because the last thing most of the Internet-Web universe would admit to is trusting government involvement. He promises collaboration in all of this, but only time will tell.

Posted on Mar 01, 2010 at 9:03 AM | 1 comment


It's put-up time for the emergency communications network

It seems the government will try again to build a nationwide public safety communications network, at least if Congress agrees with recommendations the Federal Communications Commission (FCC) will make in a couple of weeks in its highly anticipated broadband plan.

FCC Chairman Julius Genachowski favors giving first responders access to the full 700 MHz band of wireless spectrum, which he figures will allow the government to build the network at a cost of $16 billion to $18 billion.

This is the second time in the past few years that the government has tried to get such a network running. In 2008, the FCC auctioned off several blocks of the 700 MHz space to private industry for just under $20 billion. But the auction of the so-called D block spectrum, which would have gone to the emergency communications network, failed for a variety of reasons.

Genachowski apparently wants a re-auction of the D block spectrum, which would be allocated specifically for first responders. But he also wants them to be able to share the entire 700 MHz band with other advanced wireless service providers through roaming and other arrangements.

It’s way past time that the United States had this kind of network. As has been pointed out, if the Haiti earthquake had happened in the United States, the lack of such communications would have been catastrophic.

Genachowski makes a good point that it’s very unlikely private industry will come up with any of the money for this network, so it’s up to the government. However, the problem there is the same as it’s been since Sept. 11, 2001, and even before: When pushed, government is reluctant to spend the money.

Everyone talks a good game and is quick to acknowledge emergency workers as the heroes they are. Let’s see if this time the put-up is there.

Posted on Feb 25, 2010 at 9:03 AM | 2 comments


Feds will play a big part in broadband push

The Federal Communications Commission is slated to publish its big plan for faster broadband adoption throughout the United States in about three weeks, but the plan itself won’t be the extent of the federal government’s involvement.

For example, the FCC is expected to propose that federal buildings be used as “anchor institutions” for unserved and underserved communities. The concept is still a bit hazy, but some suggestions have those buildings acting as a kind of local ISP seed in rural and urban areas that find it tough to get broadband service.

That's a novel idea: Government buildings are assured of broadband service, and they would act as service nodes in the greater broadband network. But how will the General Services Administration react to this? And what will the ISPs think about the federal government essentially becoming a competitor?

However, if the ISPs really cared, I guess they would already have found some way to provide true broadband service to those areas.

Other fed-centered suggestions to come out of the FCC could include better coordination of broadband grants, the release of more government information online, something that will enable people-centric online services, and a government effort to encourage more use of social media.

All neat stuff, and if any of it comes to pass, it could be game-changing in many areas. As they say, the devil will be in the details. Stay tuned.

Posted on Feb 22, 2010 at 9:03 AM | 1 comment


To USB or not to USB

You have to admit, those little USB thumb drives are helpful critters for transporting data easily from one computer to another. However, they are also very useful for introducing malware into a system.

That was the reason the Pentagon banned their use in November 2008, declaring that “Memory sticks, thumb drives and camera flash memory cards have given the adversary the capability to exploit our poor personal practices and have provided an avenue of attack ... malicious software (malware) programmed to embed itself in memory devices has entered our systems.”

Now, it seems, USB devices are OK. U.S. Strategic Command has lifted its ban on their use – not necessarily because it thinks they are safe, but because it doesn’t have the support to enforce that kind of ban indefinitely, according to this Wired report.

(InsideDefense.com first reported the story, but to read it online requires a subscription).

But here’s the thing: They are still dangerous to use. A recent report said that certain Federal Information Processing Standard-certified USB drives had flaws that could allow unauthorized access to encrypted data. And now comes news that the South Korean military is planning to ban USB drives because of recent Chinese hacking attacks.

I can understand maintaining a ban while admitting you can’t police it very well; at least you’re sending the message that the devices are not safe to use. But knowing they’re not safe and lifting the ban anyway – what message does that send?

Posted on Feb 19, 2010 at 9:03 AM | 6 comments

