
Building America’s secret surveillance state

By James Bamford, Reuters

“In God we trust,” goes an old National Security Agency joke. “All others we monitor.”
Given the revelations last week about the NSA’s domestic spying activities, the saying seems more prophecy than humor.

US law permits pre-emptive cyber strikes


AFP
A secret legal review has concluded that the US president has the power to order pre-emptive cyber strikes if the US discovers credible evidence that a major digital attack against it is in the offing.

Big Brother or peeping tom? UK installs CCTV in school bathrooms, changing rooms

RT, 12 September 2012
Over 200 UK state schools have installed cameras in bathrooms and changing rooms to monitor students, a recent surveillance survey reported. British parents will likely be shocked by the study’s findings.
The survey is based on a freedom of information request conducted by Big Brother Watch, an anti-surveillance activist group. The group said it was shaken by the results, which revealed camera use far higher and more extensive than expected.
The report “will come as a shock to many parents”, said Nick Pickles, director of Big Brother Watch. “Schools need to come clean about why they are using these cameras and what is happening to the footage”.
- 47,806 cameras used in 2,107 schools
- 207 schools have 825 cameras in changing rooms and bathrooms
- 90% of schools use CCTV cameras
- 54 UK schools have 1 camera or more per 15 pupils
- 106,710 CCTV cameras estimated in high schools and academies in England, Scotland and Wales
A total of 825 cameras were installed in the bathrooms and changing rooms of 207 different schools across England, Scotland and Wales, according to data provided by more than 2,000 schools.
It is unknown where the cameras are located in the bathrooms and changing rooms, who is watching the security footage and whether any pupils were recorded while changing.
Video recording in toilets or changing rooms is legal, but recommended only for exceptional circumstances, the Information Commissioner’s Office (ICO) reported. The ICO is an independent authority in the UK, whose duties include promoting privacy.
Research also showed that the extent of CCTV use varied widely from school to school, “with some schools seeing a ratio of one camera for every five pupils,” the report said. “CCTV appears to be used as a quick fix to much more complex problems and issues that simply cannot be solved with passive surveillance.”
Since the 1990s, the UK’s Home Office has spent 78 percent of its crime prevention budget on CCTV installations, and schools have likewise invested significant resources in their own surveillance equipment, the Big Brother Watch report said.
No significant research has been done into whether CCTV cameras actually lower crime rates.
Big Brother Watch was able to locate a single study, by the French Institut d’Aménagement et d’Urbanisme, which concluded that theft and burglary continued to increase after the 2007 installation of CCTV in the Île-de-France region. A marginal reduction in disorderly incidents in schools was also reported.
The Big Brother Watch report estimated that more than 100,000 cameras monitor students and teachers across Britain, with 90 percent of the schools surveyed acknowledging the use of some form of video surveillance.
Responses from 2,107 secondary schools showed that they used 47,806 cameras in total with more than half installed inside the schools. The Radclyffe School in Oldham surpassed all other schools in the survey, with 20 cameras total in bathrooms and changing rooms.
Sharon Holder, the GMB’s national officer, told Newsvine that her trade union was disgusted with the findings.
“Placing CCTV in school bathrooms poses a worrying development in school policy and raises a number of questions,” she said. “How many parents have given headteachers permission to film their child going to the toilet or having a shower? What happens to the film afterwards? How much discussion has there been on governing bodies and to what extent have councils and councilors had any input into these developments? What problems are the schools trying to solve?”

The new totalitarianism of surveillance technology


Naomi Wolf, The Guardian
If you think that 24/7 tracking of citizens by biometric recognition systems is paranoid fantasy, just read the industry newsletters
A software engineer in my Facebook community wrote recently about his outrage that when he visited Disneyland, and went on a ride, the theme park offered him the photo of himself and his girlfriend to buy—with his credit card information already linked to it. He noted that he had never entered his name or information into anything at the theme park, or indicated that he wanted a photo, or alerted the humans at the ride to who he and his girlfriend were—so, he said, based on his professional experience, the system had to be using facial recognition technology. He had never signed an agreement allowing them to do so, and he declared that this use was illegal. He also claimed that Disney had recently shared data from facial-recognition technology with the United States military.
Yes, I know: it sounds like a paranoid rant.
Except that it turned out to be true. News21, supported by the Carnegie and Knight foundations, reports that Disney sites are indeed controlled by face-recognition technology, that the military is interested in the technology, and that the face-recognition contractor, Identix, has contracts with the US government—for technology that identifies individuals in a crowd.
Fast forward: after the Occupy crackdowns, I noted that odd-looking CCTVs had started to appear, attached to lampposts, in public venues in Manhattan where the small but unbowed remnants of Occupy congregated: there was one in Union Square, right in front of their encampment. I reported here on my experience of witnessing a white van marked “Indiana Energy” that was lifting workers up to the lampposts all around Union Square, and installing a type of camera. When I asked the workers what was happening—and why an Indiana company was dealing with New York City civic infrastructure, which would certainly raise questions—I was told: “I’m a contractor. Talk to ConEd.”
I then noticed, some months later, that these bizarre camera/lights had been installed not only all around Union Square but also around Washington Square Park. I posted a photo I took of them, and asked: “What is this?” Commentators who had lived in China said that they were the same camera/streetlight combinations that are mounted around public places in China. These are enabled for facial recognition technology, which allows police to watch video that is tagged to individuals, in real time. When too many people congregate, they can be dispersed and intimidated simply by the risk of being identified—before dissent can coalesce. (Another of my Facebook commentators said that such lamppost cameras had been installed in Michigan, and that they barked “Obey” at pedestrians. This, too, sounded highly implausible—until this week in Richmond, British Columbia, near the Vancouver airport, when I was startled as the lamppost in the intersection started talking to me—in this case, instructing me on how to cross, as though I were blind or partially sighted.)
Finally, last week, New York Mayor Michael Bloomberg joined NYPD Commissioner Ray Kelly to unveil a major new police surveillance infrastructure, developed by Microsoft. The Domain Awareness System links existing police databases with live video feeds, including cameras using vehicle license plate recognition software. No mention was made of whether the system plans to use—or already uses—facial recognition software. But, at present, there is no law to prevent US government and law enforcement agencies from building facial recognition databases.
And we know from industry newsletters that the US military, law enforcement, and the department of homeland security are betting heavily on facial recognition technology. As PC World notes, Facebook itself is a market leader in the technology—but military and security agencies are close behind.
According to Homeland Security Newswire, billions of dollars are being invested in the development and manufacture of various biometric technologies capable of detecting and identifying anyone, anywhere in the world—via iris-scanning systems, already in use; foot-scanning technology (really); voice pattern ID software, and so on.
What is very obvious is that this technology will not be applied merely to people under arrest, or to people under surveillance in accordance with the fourth amendment (suspects in possible terrorist plots or other potential crimes, after law enforcement agents have already obtained a warrant from a magistrate). No, the “targets” here are me and you: everyone, all of the time. In the name of “national security”, the capacity is being built to identify, track and document any citizen constantly and continuously.
The revealing boosterism of a trade magazine like Homeland Security Newswire envisions endless profits for the surveillance industry, in a society where your TV is spying on you, a billboard you drive by recognizes you, Minority Report style, and the FBI knows where to find your tattoo—before you have committed any crime: “FBI on Track to Book Faces, Scars, Tattoos”, it notes; “Billboards, TVs Detect your Faces; Advertisers Salivate”, it gloats; “Biometric Companies See Government as the Driver of Future Market Growth”, it announces. Indeed, the article admits without a blush that all the growth is expected to be in government consumption, with “no real expectation” of private-sector growth at all. So much for smaller government!
To acclimate their populations to this brave new world of invasive surveillance technologies, UK Prime Minister David Cameron and his Canadian counterpart, Stephen Harper, both recently introduced “snoop” bills. Meanwhile, in the US—“the land of the free”—the onward march of the surveillers continues apace, without check or consultation.

Phone and email records to be stored in new spy plan

By David Barrett, Daily Telegraph
Details of every phone call and text message, email traffic and websites visited online are to be stored in a series of vast databases under new Government anti-terror plans.
Landline and mobile phone companies and broadband providers will be ordered to store the data for a year and make it available to the security services under the scheme.
The databases would not record the contents of calls, texts or emails, but would log the numbers or email addresses of senders and recipients.
For the first time, the security services will have widespread access to information about who has been communicating with each other on social networking sites such as Facebook.
Direct messages between subscribers to websites such as Twitter would also be stored, as well as communications between players in online video games.
The Home Office is understood to have begun negotiations with internet companies in the last two months over the plan, which could be officially announced as early as May.
It is certain to cause controversy over civil liberties—and will also raise concerns over the security of the records.
Access to such information would be highly prized by hackers and could be exploited to send spam email and texts. Details of which websites people visit could also be exploited for commercial gain.
The plan has been drawn up on the advice of MI5, the home security service, MI6, which operates abroad, and GCHQ, the Government’s “listening post” responsible for monitoring communications.
Rather than the Government holding the information centrally, companies including BT, Sky, Virgin Media, Vodafone and O2 would have to keep the records themselves.
Under the scheme the security services would be granted “real time” access to phone and internet records of people they want to put under surveillance, as well as the ability to reconstruct their movements through the information stored in the databases.
The system would track the “who, when and where” of each message, allowing extremely close surveillance.
Mobile phone records of calls and texts show to within yards where a call was made or a message was sent, while emails and internet browsing histories can be matched to a computer’s “IP address”, which can be used to locate where it was sent from.
Jim Killock, executive director of the Open Rights Group, a civil liberties campaign organisation, said: “This would be a systematic effort to spy on all of our digital communications.
“The Conservatives and Liberal Democrats started their government with a big pledge to roll back the surveillance state.
“No state in history has been able to gather the level of information proposed—it’s a way of collecting everything about who we talk to just in case something turns up.”
Gus Hosein, of Privacy International, said: “This will be ripe for hacking. Every hacker, every malicious threat, every foreign government is going to want access to this.
“And if communications providers have a government mandate to start collecting this information they will be incredibly tempted to start monitoring this data themselves so they can compete with Google and Facebook.”
He added: “The internet companies will be told to store who you are friends with and interact with. While this may appear innocuous it requires the active interception of every single communication you make, and this has never been done in a democratic society.”

Here you can buy all the tools you need to spy on, oppress and control your citizens, enemies or other victims


Trade in surveillance technology raises worries
By Sari Horwitz, Shyamantha Asokan and Julie Tate, Washington Post, December 1, 2011

Northern Virginia technology entrepreneur Jerry Lucas hosted his first trade show for makers of surveillance gear at the McLean Hilton in May 2002. Thirty-five people attended.
Nine years later, Lucas holds five events annually across the world, drawing hundreds of vendors and thousands of potential buyers for an industry that he estimates sells $5 billion of the latest tracking, monitoring and eavesdropping technology each year. Along the way these events have earned an evocative nickname: The Wiretappers’ Ball.
The products of what Lucas calls the “lawful intercept” industry are developed mainly in Western nations such as the United States but are sold throughout the world with few restrictions. This burgeoning trade has alarmed human rights activists and privacy advocates, who call for greater regulation because the technology has ended up in the hands of repressive governments such as those of Syria, Iran and China.
“You need two things for a dictatorship to survive—propaganda and secret police,” said Rep. Christopher H. Smith (R-N.J.), who has proposed bills to restrict the sale of surveillance technology overseas. “Both of those are enabled in a huge way by the high-tech companies involved.”
But the overwhelming U.S. government response has been to engage in the event not as a potential regulator, but as a customer.
The list of attendees for this year’s U.S. Wiretappers’ Ball, held in October at the North Bethesda Marriott Hotel and Conference Center, included more than 20 federal agencies, Lucas said. Representatives of 43 countries also were there, he said, as were many people from state and local law enforcement agencies. Journalists and members of the public were excluded.
On offer were products that allow users to track hundreds of cell phones at once, read e-mails by the tens of thousands, even get a computer to snap a picture of its owner and send the image to police—or anyone else who buys the software. One product uses phony updates for iTunes and other popular programs to take control of personal computers.
Industry officials say their products are designed for legitimate purposes, such as tracking terrorists, investigating crimes and allowing employers to block pornographic and other restricted Web sites at their offices.
U.S. law generally requires law enforcement agencies to obtain court orders when intercepting domestic Internet or phone communications. But such restrictions do not follow products when they are sold overseas.
“This technology is absolutely vital for civilization,” said Lucas, president of TeleStrategies, which hosts the events, officially called Intelligent Support Systems World Conferences. “You can’t have a situation where bad guys can communicate and you bar interception.”
But the surveillance products themselves make no distinction between bad guys and good guys, only users and targets. Several years of industry sales brochures provided to The Washington Post by the anti-secrecy group WikiLeaks, and released publicly Thursday, reveal that many companies are selling sophisticated tools capable of going far beyond conventional investigative techniques.
“People are morally outraged by the traditional arms trade, but they don’t realize that the sale of software and equipment that allows oppressive regimes to monitor the movements, communications and Internet activity of entire populations is just as dangerous,” said Eric King of Privacy International, a London-based group that seeks to limit government surveillance. Sophisticated surveillance technology “is facilitating detention, torture and execution,” he said, “and potentially smothering the flames of another Arab Spring.”
Demand for surveillance tools surged after the Sept. 11, 2001, attacks, as rising security concerns coincided with the spread of cellphones, Skype, social media and other technologies that made it easier for people to communicate—and easier for governments and companies to eavesdrop on a mass scale.
The surveillance industry conferences are in Prague, Dubai, Brasilia, the Washington area and Kuala Lumpur, whose event starts Tuesday. The most popular conference, with about 1,300 attendees, was in Dubai this year. Middle Eastern governments, for whom the Arab Spring was “a wake-up call,” are the most avid buyers of surveillance software and equipment, Lucas said. Any customers who come to the event are free to buy the products there.
“When you’re selling to a government, you lose control of what the government is going to do with it,” Lucas said. “It’s like selling guns to people. Some are going to defend themselves. Some are going to commit crimes.”
The brochures collected by WikiLeaks make clear that few forms of electronic communication are beyond the reach of available surveillance tools. Although some simple products cost just a few hundred dollars and can be purchased on eBay or Amazon, the technology sold at the trade shows often costs hundreds of thousands or millions of dollars. Customization and on-site training can provide years of revenue for companies.
One German company, DigiTask, offers a suitcase-sized device capable of monitoring the Web traffic of users at public WiFi hotspots such as cafes, airports and hotel lobbies. A lawyer representing the company, Winfried Seibert, declined to elaborate on its products.
The FinFisher program, which creates fake updates for iTunes, Adobe Acrobat and other programs, was produced by a British company, Gamma International. The Wall Street Journal reported on this product, and several other surveillance tools described in sales brochures, in an article last month. Apple said it altered iTunes to block FinFisher intrusions Nov. 14.
A Gamma spokesman, Peter Lloyd, said that FinFisher is a vital investigative tool for law enforcement agencies and that the company complies with British law. “Gamma does not approve or encourage any misuse of its products and is not aware of any such misuse,” he said.
The WikiLeaks documents, which the group also provided to several European news organizations and one in India, do not reveal the names of buyers. But when “Arab Spring” revolutionaries took control of state security agencies in Tunisia, Egypt and Libya, they found that Western surveillance technology had been used to monitor political activists.
“We are seeing a growing number of repressive regimes get hold of the latest, greatest Western technologies and use them to spy on their own citizens for the purpose of quashing peaceful political dissent or even information that would allow citizens to know what is happening in their communities,” said Michael Posner, assistant secretary of state for human rights, in a speech last month in California.
The spread of such technology is not limited to the Middle East. A federal lawsuit filed in May accuses Cisco Systems, a Silicon Valley company, of helping China monitor the Falun Gong spiritual group.
The lawsuit, filed by the U.S.-based Human Rights Law Foundation, alleges that Cisco helped design and provide equipment for China’s “Golden Shield,” a firewall that censors the Internet and tracks government opponents. Cisco has acknowledged that it sells routers, which are standard building blocks for any Internet connection, to China. But it denies the allegations in the suit.
A State Department official was pessimistic that government regulation could curb a fast-changing technology sector. “We’ve lost,” said the official, who spoke on the condition of anonymity. “If the technology people are selling at these conferences gets into the hands of bad people, all we can do is raise the costs. We can’t completely protect activists or anyone from this.”

Remote-control warfare kills “only” nine innocents for every enemy


Unmanned drone attacks and shape-shifting robots: War’s remote-control future
By Anna Mulrine, Christian Science Monitor

The development of a new generation of military robots, including armed drones, may eventually mark one of the biggest revolutions in warfare in generations. Throughout history, from the crossbow to the cannon to the aircraft carrier, one weapon has supplanted another as nations have strived to create increasingly lethal means of allowing armies to project power from afar.
But many of the new emerging technologies promise not only firepower but also the ability to do something else: reduce the number of soldiers needed in war. While few are suggesting armies made up exclusively of automated machines (yet), the increased use of drones in Afghanistan and Pakistan has already reinforced the view among many policymakers and Pentagon planners that the United States can carry out effective military operations by relying largely on UAVs, targeted cruise missile strikes, and a relatively small number of special operations forces.
At the least, many enthusiasts see the new high-tech tools helping to save American lives. At the most, they see them changing the nature of war—how it’s fought and how much it might cost—as well as helping America maintain its military preeminence.
Yet the prospect of a military less reliant on soldiers and more on “push button” technologies also raises profound ethical and moral questions. Will drones controlled by pilots thousands of miles away, as many of them are now, reduce war to an antiseptic video game? Will the US be more likely to wage war if doing so does not risk American lives? And what of the oversight role of Congress in a world of more remote-control weapons? Already, when lawmakers on Capitol Hill accused the Obama administration of circumventing their authority in waging war in Libya, White House lawyers argued in essence that an operation can’t be considered war if there are no troops on the ground—and, as a result, does not require the permission of Congress.
“If the military continues to reduce the human cost of waging war,” says Lt. Col. Edward Barrett, an ethicist at the US Naval Academy in Annapolis, Md., “there’s a possibility that you’re not going to try hard enough to avoid it.”
Beneath a new moon, a crew pushes the 2,500-pound Predator drone toward a blacked-out flight line and prepares it for takeoff. The soldiers wheel over a pallet of Hellfire missiles and load them onto the plane’s undercarriage. The Predator pilot walks around the aircraft, conducting his preflight check. He then returns to a nearby trailer, sits down at a console with joysticks and monitors, and guides the snub-nosed plane down the runway and into the night air—unmanned and fully armed.
The metronome-regular takeoffs of Predators here at Kandahar Air Field, in southern Afghanistan, have helped turn this strip of asphalt into what the Pentagon calls the single busiest runway in the world. An aircraft lifts off or lands every two minutes. It’s a reminder of how integral drones have become to the war in Afghanistan and the broader war on terror.
Initially, of course, the plan was not to put weapons on Predator drones at all. Like the first military airplanes, they were to be used just for surveillance. As the war in Iraq progressed, however, US service members jury-rigged the drones with weapons. Today, armed Predators and their larger offspring, Reapers, fly over America’s battlefields, equipped with both missiles and powerful cameras, becoming the most widely used and, arguably, most important tools in the US arsenal.
Since first being introduced in Iraq and Afghanistan, their numbers have grown from 167 in 2002 to more than 7,000 today. The US Air Force is now recruiting more UAV pilots than traditional ones.
“The demand has just absolutely skyrocketed,” says the commander of the Air Force’s 451st Operations Group, which runs Predator and Reaper operations in Kandahar.
As their numbers have grown, so has the sophistication with which the military uses them. The earliest drones operated more as independent assets—as aerial eyes that sent back intelligence and dropped their bombs. But today the unmanned aircraft are integrated into almost every operation on the ground, acting as advanced scouts and omniscient surveyors of battle zones. They monitor the precise movements of insurgents and kill enemy leaders. They conduct “virtual lineups,” zooming in powerful cameras to help determine whether a suspected insurgent may have carried out a particular attack.
“A lot of the ground commanders won’t execute a mission without us,” says the Air Force’s commander of the 62nd Expeditionary Reconnaissance Squadron in Afghanistan.
Robots, too, have become a far more pervasive presence on America’s fields of battle. Remote-control machines that move about on wheels and tracks search for roadside bombs in Iraq and Afghanistan. Soldiers in the mountains of eastern Afghanistan carry hand-held drones in backpacks, which they assemble and throw into the air to scope out terrain and check for enemy fighters. In the past 10 years, the Pentagon’s use of robots has grown from zero to some 12,000 in war zones today.
Part of the exponential rise in the use of UAVs and robots stems from a confluence of events: improvements in technology and America’s prolonged involvement in two simultaneous wars.
There is, too, the prospect of more money for military contractors eyeing a downturn in future defense budgets. Today, the amount of money being spent on research for military robotics surpasses the budget of the National Science Foundation, which, at $6.9 billion a year, funds nearly one-quarter of all federally supported scientific research at the nation’s universities.
Military officials also see in the new technologies the possibility of savings in an era of shrinking budgets. Deploying forces overseas can now cost as much as $1 million a year per soldier.
Yet the biggest allure of the new high-tech armaments may be something as old as conflict itself: the desire to reduce the number of casualties on the battlefield and gain a strategic advantage over the enemy. As Lt. Gen. Richard Lynch, a commander in Iraq, observed at a conference on military robotics in Washington earlier this year: “When I look at the 153 soldiers who paid the ultimate sacrifice [under my command], I know that 80 percent of them were put in a situation where we could have placed an unmanned system in the same job.”
Drones, in particular, seem the epitome of risk-free warfare for the nation using them—there are, after all, no pilots to shoot down. Moreover, the people who run them are often nowhere near the field of battle. Some 90 percent of the UAV operations over Afghanistan are flown by people in trailers in the deserts of Nevada. In Kandahar, soldiers help the planes take off and land and then hand over controls to the airmen in the US.
“We want to minimize the [human] footprint as much as possible,” says the 451st Operations Group commander at the Kandahar airfield, where the effects of being close to the war are clearly visible: The plywood walls of the tactical operations center are lined with framed bits of jagged metal from mortars that have fallen on the airfield over the years.
While the distant control of drones may well protect American lives, it raises questions about what it means to have people so far removed from the field of conflict. “Sometimes you felt like God hurling thunderbolts from afar,” says Lt. Col. Matt Martin, who was among the first generation of US soldiers to work with drones to wage war and who has written a book—“Predator: The Remote-Control Air War Over Iraq and Afghanistan: A Pilot’s Story.”
Martin agrees that the unmanned aircraft no doubt reduce American casualties, but wonders if it makes killing “too easy, too tempting, too much like simulated combat, like the computer game Civilization.”
It probably doesn’t reassure critics that the flight controls for drones over the years have come to resemble video-game controllers, which the military has done to make them more intuitive for a generation of young soldiers raised on games like Gears of War and Killzone.
Martin knows what it’s like to confront the dark side of war, even as he fought it from afar. During one operation, he was piloting a drone that was tracking an insurgent. Just after he fired one of the aircraft’s missiles, two children rode their bicycles into range. They were both killed. “You get good at compartmentalizing,” says Martin.
What worries critics is those who are too good at it—and the impact in general of waging war at a distance. Some fret about the mechanics of the decisionmaking process: Who ultimately makes the decision to pull the trigger? And how do you decide whom to put on the hit list—a top Al Qaeda official, yes, but is some petty but persistent insurgent a matter of national security?
As the US increasingly uses drones in its secret campaigns, questions arise about how much to inform America’s allies about UAV attacks and whether the strikes alienate local populations more than they help subdue the enemy—a dilemma the US has confronted starkly, and almost weekly, in its drone campaign in Pakistan.
From the US military’s viewpoint, the drone war has been fantastically successful, helping to kill key Al Qaeda operatives and Taliban insurgents with a minimum of civilian casualties and almost no US troops put at risk.
Some even believe that the ethical oversight of drones is far more rigorous than that of manned aircraft, since at least 150 people—ground crews, engineers, pilots, intelligence analysts—are typically involved in each UAV mission.
The issue of what’s a minimum of civilian losses is, of course, subjective. In 2009, the Brookings Institution, a Washington think tank, estimated that the US drone war was killing about 10 civilians for every 1 insurgent in Pakistan. That may be far fewer casualties than would be killed with traditional airstrikes. But it is hardly comforting to the Pakistanis.
Moreover, the very practice of taking out enemy leaders or sympathizers could at some point, according to detractors, devolve into an aerial assassination campaign. When the US used a drone strike last month to kill jihadist cleric and American-born Anwar al-Awlaki in Yemen, President Obama hailed it as a “major blow” to Al Qaeda in the Arabian Peninsula. But some critics decried the killing of a US citizen with no public scrutiny.
Barrett, who is the director of research at the Naval Academy’s Stockdale Center for Ethical Leadership, discusses with his students whether UAVs make it easier to wage war if the government doesn’t have to worry about a public outcry. “There are not the mass numbers of troops moving around and visible, so it could be easier to circumvent the oversight of Congress and, therefore, legitimate authority,” he notes.
Others ask a simpler but practical question: What about the troops who conduct the UAV strikes from the Nevada desert—could they become legitimate targets of America’s enemies at, say, a local mall, bringing the war on terror to the suburbs?
Some worry that the US is, in fact, placing too heavy a burden on its UAV troops. Despite warnings that “video-game warfare” might make them callous to killing, new studies suggest that the stress levels drone operators face are higher than those for infantry forces on the ground.
“Having this idea of a ‘surgical war’ where you can really just pinpoint the bad guys with the least amount of damage to our own force, there’s a bit of naiveté in all that,” says Maryann Cusimano Love, an associate professor at Catholic University of America in Washington, D.C.
She says the powerful cameras on the drones allow pilots to see in “great vivid detail the real-time results of their actions. That is an incredible stress on them.”
It is also, she argues, a “ghettoization of the killing function in war.” However justified the military mission may be, she says, “You are still giving the most stressful job of war disproportionately to this one subset of people.”
With the advent of the US wars in Iraq and Afghanistan, however, technology once again rendezvoused with military necessity. A company called iRobot in Bedford, Mass., sent out a prototype of its PackBot, which soldiers began using to clear caves and bunkers suspected of being mined. When the testing period was over, “The Army unit didn’t want to give the test robot back,” Mr. Singer notes.
While the use of robots that can detect and defuse explosives is growing exponentially, the next big frontier for America’s military R2-D2s may parallel what happened to drones: They may be fitted with weapons—offering new fighting capabilities as well as raising new concerns.
Already, researchers are experimenting with attaching machine guns to robots that can be triggered remotely. Field tests in Iraq for one of the first weaponized robots, dubbed SWORDS, didn’t go well.
“There were several instances of noncommanded firing of the system during testing,” says Jeffrey Jaczkowski, deputy manager of the US Army’s Robotic Systems Joint Project Office.
Though US military officials tend to emphasize that troops must remain “in the loop” as robots or drones are weaponized, there remains a strong push for automation coming from the Pentagon. In 2007, the US Army sent out a request for proposals calling for robots with “fully autonomous engagement without human intervention.” In other words, the ability to shoot on their own.
At the Georgia Institute of Technology in Atlanta, Ronald Arkin is researching a stunning premise: whether robots can be created that treat humans on the battlefield better than human soldiers treat each other. He has pored over the first study of US soldiers returning from the Iraq war, a 2006 US Surgeon General’s report that asked troops to evaluate their own ethical behavior and that of their comrades.
He was struck by “the incredibly high level of atrocities that are witnessed, committed, or abetted by soldiers.” Modern warfare has not lessened the impact on soldiers. It is as stressful as ancient hand-to-hand combat with axes, he argues, because of the sorts of quick decisions that fighting with modern technology requires.
“Human beings have never been designed to operate under the combat conditions of today,” he says. “There are many, many problems with the speed with which we are killing right now—and that exacerbates the potential for violation of laws of war.”
With Pentagon funding, Dr. Arkin is looking at whether it is possible to build robots that behave more ethically than humans—to not be tempted to shoot someone, for instance, out of fear or revenge.
The key, he says, is that the robot should “first do no harm, rather than ‘shoot first, ask questions later.’”
Other research into armed robots centers not so much on outperforming humans as on being able to work with them. In the not-too-distant future, military officials envision soldiers and robots teaming up in the field, with the troops able to communicate with machines the way they would with a human squad member. Eventually, says Thompson, the robot-soldier relationship could become even more collaborative, with one human soldier leading many armed robots.
After that, the scenarios start to become something more out of the realm of film studios. For instance, retired Navy Capt. Robert Moses, president of iRobot’s government and industrial relations division, can envision the day of humanless battlefields.
“I think the first thing to do is to go ahead and have the Army get comfortable with the robot,” he says. One day, though, “you could write a scenario where you have an unmanned battle space—a ‘Star Wars’ approach.”
These developments raise questions that ethicists are just beginning to unravel. This includes Peter Asaro, who last year formed the International Committee for Robot Arms Control. He’s grappling with conundrums like: What, to a machine, counts as “about to shoot me?” How does a robot make a distinction between a dog, a man, and a child? How does it tell an enemy from a friend?
Such things are not entirely abstract. An automated “sentry robot” now stands guard in the demilitarized zone between North and South Korea, equipped with heat, voice, and motion sensors, as well as a 5.56 mm machine gun. What if it starts firing, accidentally or otherwise?
In the end, the emerging era of remote-control warfare—like evolutions in warfare throughout history—will likely create profound new capabilities as well as profound new problems for the US. The key will be to minimize the one over the other.