Drones Are as Good—or Bad—as the People Who Control Them

Question Everything - Can drones be used for good?

Their behaviors should be as transparent as possible


Thousands of “drones,” small unmanned aircraft that are popular with hobbyists, are used every day to provide new perspectives on our world. The best way to think about these vehicles is as a new form of photography, providing points of view on our neighborhoods, our news events, our cities and our natural environment. They will be able to make new kinds of maps and may even deliver packages. A great deal of hope surrounds their potential economic value.

Of course, when they look into our neighbors’ windows, or interfere with airliners, the effects are not beneficial. But we should remember these vehicles are human products. When we dislike their behaviors, they are human behaviors we dislike, and we should address them at that level. Even a pre-programmed or supposedly “autonomous” drone is carrying out instructions prepared by a person somewhere up the line.

The use of drones in warfare epitomizes this point. These “drones” are actually remotely piloted aircraft, however far away their pilots may be. On the one hand, they can make remote observations and take remote actions that can keep people out of harm’s way. On the other hand, they can be employed by political leaders for targeted assassinations that are at odds with traditional military morality.

Moreover, they have implications for the age-old profession of fighting. Is one a true warrior if one never visits a battlefield? Is it honorable to fight from thousands of miles away? Drone operators do not put their bodies at risk, but they do suffer the stresses of combat. These are effects felt and policies created by people, and we should debate them at that level. The “technology” itself does not do anything.

We also might say that modern airliners are a kind of drone. For much of a typical flight, during the en route portion (and even some landings), the autopilot is in control and the pilots are monitoring. These drones are certainly good in that they do improve safety and relieve the pilots of tedious tasks. Yet they are not perfect, and people must be kept in the loop to monitor them.

Even in some imagined future when “drones” are driving down our highways (hopefully still with attentive drivers at the wheel), their behaviors are developed, programmed and set by human beings. Those behaviors, and the policies they enact, should be as transparent as possible, not hidden behind corporate firewalls. Only then can drivers develop the trust required to make them human-aided cars and not drones that we sit in.

David A. Mindell is the Dibner Professor of the History of Engineering and Manufacturing and Professor of Aeronautics and Astronautics at MIT. He has 25 years of experience as an engineer in the field of undersea robotic exploration, as a veteran of more than 30 oceanographic expeditions, and more recently as an airplane pilot and engineer of autonomous aircraft. He is the author of OUR ROBOTS, OURSELVES: Robotics and the Myths of Autonomy.

