'Weaponization of Code' Represents Revolution in Cyberwarfare

The U.S. should embrace cybersecurity treaties, a former State Department official says.

The United States has more to lose in a cyberattack than other nations do, and the federal government should work on negotiating treaties that govern cyberwarfare, according to former State Department official Alec Ross.

At an event Tuesday hosted by the Italian Embassy in Washington, Ross expounded on the themes of his recently published book, The Industries of the Future, and talked about the perils and promise of cybersecurity.

In the book, Ross argues that robotics, cybersecurity, the commercialization of genomics, the evolution of Big Data, and the digitization of money and markets will dominate the next decade. The book explores how these factors will affect how people work and how the economy will change as a result.

Ross, who served as senior adviser for innovation to former Secretary of State Hillary Clinton from 2009 to 2013, said, “The weaponization of code is the most significant development since the weaponization of fissile material.”

Cyberattacks Represent a New Kind of War

At the gathering, Ross noted that it’s not easy for individuals to gain access to materials like plutonium and uranium to make a nuclear weapon, but code is readily available.

There are roughly 16 billion Internet-connected devices globally today, he said, and in four years that number will skyrocket to 40 billion as numerous sensors proliferate and more devices inside governments, businesses and homes become connected via the Internet of Things. This explosion of connectivity could lead to increased risks, Ross warned.

“We’re networking a multitude of other devices, which are taking over so many of our industrial processes,” he said. “The security threat goes up exponentially as the production of data goes up exponentially.”

Unlike in traditional conflicts where it is the federal government’s responsibility to protect all citizens, Ross said, in a cyberattack it is not the government’s job to protect all companies or individuals. “We see big companies with the financial resources to protect themselves. Individuals or small and medium-size businesses have relatively little ability to protect themselves.”

Ross told the audience that the federal government may need to get more involved in providing cybersecurity protections. “I do think there is going to have to be some more public-sector reach into our individual cybersecurity lest we become less secure.”

The Case for Cybertreaties

For a long time the federal government resisted setting norms of behavior in cyberspace, Ross explained, because the United States had the most powerful digital infrastructure and cyberweapons, so “any multilateral agreement would, by its very nature, diminish or dilute our power, [and] it was perceived that it would not be in our interest.”

But unlike nuclear weapons, cyberweapons are not controlled by a small number of countries, and non-state actors, including terrorist groups, can wield them.

“The U.S. can be the biggest loser in cyberconflict,” Ross said, adding that the country’s intellectual property and critical infrastructure are more exposed than other nations'. “I absolutely do think it’s time to do more to begin a serious period of treaty-making in the cyber domain.”

According to Ross, there are two key factors inhibiting treaties on cybersecurity. One is that former National Security Agency contractor Edward Snowden's revelations about surveillance by the NSA have damaged the federal government’s credibility to forge such treaties. The second hurdle is that while China and the United States have become a bit more aligned on cybersecurity, Russia and America have grown further apart, and Russia is a crucial player in any treaty negotiations. “It’s really hard to develop something global in nature without the biggest nations involved,” he said.

Phil Goldstein
Apr 13 2016