
From Our New “What Could Go Wrong” Department

One of the important but underreported news stories of 2013 was a series of little-noted articles and news releases about scientific and technical advances in the field of robotics – most notably military robotics and drones.

The articles all had one major theme in common: military robots and drones were becoming “smarter” and, here’s the important part, more autonomous.

Now 2014 has opened with a small article in Britain’s Daily Mail Online describing the next step in the Pentagon’s drone technology roadmap: military drones, equipped with stronger weapons, that can make their OWN decisions during missions, and drone-bombs that hunt in 'swarms' from a mothership.

Given our present state of technology, as the Daily Mail’s Sarah Griffiths notes, “drones follow precise commands to complete a predetermined step-by-step mission, but the unmanned aircraft of the future could deviate from tasks, informed by ‘laws’ that govern their behavior, laid out in algorithms and machine learning, as well as advanced sensors.”

Now here’s an amusing, but still scary, part of the report.  

According to LiveScience contributor Erik Schechter, the Pentagon wants to reduce costs by offloading as many human tasks as possible onto machines. 

We can just hear the testimony on Capitol Hill now, “Why yes Senator, autonomous drones armed with our new energetic nanoparticle weapons will save the taxpayers lots of money, and make America safer.”

Griffiths’ report of drones that can choose to deviate from a set mission and hunt in ‘swarms,’ and the idea that such machines could soon be patrolling America’s skies, raises an obvious question: “What could go wrong with that?”

You don’t have to be a fan of science fiction writer Fred Saberhagen’s Berserker series of short stories and novels about robotic self-replicating war machines programmed to destroy all life, or of Isaac Asimov’s “I, Robot” short stories, to recognize that, even to a machine, laws are made to be broken, especially if that machine is given the autonomy to choose a better target than the one assigned to it by its human controllers.

Beyond the very real possibility that such a machine might turn on its creators, there is another very practical problem: what if the machine decides to do something that amounts to an act of war?

As things stand now, the countries where we operate armed drones have protested, but they have not yet claimed that such attacks constitute an act of war, partly because they don’t like the bad guys we are killing any more than we do, and partly because the drone attacks have generally occurred in remote areas.

But what if the drone decided to hunt in a more target-rich environment, say downtown Karachi, Pakistan or Tehran, Iran?

Under our Constitution only Congress can declare war. So, as much as the Pentagon would like to “offload” to a machine the human task of debating the pros and cons of an attack that would amount to a declaration of war, that shouldn’t happen under our Constitution. 

Likewise, “offloading” to a machine the choice to kill a target, if that target is an American citizen, is a violation of the Fifth Amendment’s due process clause, which provides that no American may be “deprived of life, liberty, or property, without due process of law…”

Substituting the judgment of a machine for the judgment of a jury of one’s peers in what amounts to a death penalty case, with no appeal, opens a whole new and frightening front in the battle to maintain liberty under the Constitution in this country.

The notion that an armed, autonomous drone equals “scientific progress” reminds us all too much of the celebrations of the technical achievements involved in the NSA’s as-yet-unchecked ability to eavesdrop on virtually any form of communication.

Yes, it represents an astonishing level of scientific and technical knowhow, but is it constitutional? 

We are extremely skeptical that armed autonomous drones will save money or make America safer. Until the scientific and technical geniuses behind the idea of autonomous armed drones also equip them with a foolproof “Constitution chip” we remain opponents of the deployment of such technology.


The government wants drones? They can’t even make a web page!

Remember, 50% of all computer programmers are below average.

I have been in the computer software business for over 50 years, and the following numbers are very optimistic.

The known industry average is about 3.3 errors per 30,000 lines of computer code.
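To put that rate in perspective, here is a back-of-the-envelope sketch scaling the commenter’s claimed figure to larger systems; the codebase sizes below are illustrative assumptions, not figures from the article:

```python
# Expected bug counts at the claimed rate of 3.3 errors
# per 30,000 lines of code. Codebase sizes are assumed
# for illustration only.
ERRORS_PER_LINE = 3.3 / 30_000

codebases = [
    ("small utility", 10_000),
    ("typical application", 500_000),
    ("hypothetical drone flight system", 5_000_000),
]

for name, lines in codebases:
    expected_bugs = lines * ERRORS_PER_LINE
    print(f"{name}: ~{expected_bugs:.0f} expected bugs in {lines:,} lines")
```

Even at this optimistic rate, a flight-control system of a few million lines would be expected to ship with hundreds of latent defects.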

The results of bugs can be extremely serious. Bugs in the code controlling the Therac-25 radiation therapy machine were directly responsible for patient deaths in the 1980s. In 1996, the European Space Agency's US$1 billion prototype Ariane 5 rocket was destroyed less than a minute after launch due to a bug in the on-board guidance computer program. In June 1994, a Royal Air Force Chinook crashed into the Mull of Kintyre, killing 29. That crash was initially dismissed as pilot error, but an investigation by Computer Weekly uncovered sufficient evidence to convince a House of Lords inquiry that it may have been caused by a software bug in the aircraft's engine-control computer. And then there is the recent Boeing crash in San Francisco.

In 2002, a study commissioned by the US Department of Commerce's National Institute of Standards and Technology concluded that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product."

And the government wants drones?