TIME technology

Google Glass Doesn’t Have a Privacy Problem. You Do.

2013 Google Developer Conference, May 17, 2013, in San Francisco, California. Justin Sullivan—Getty Images

Blaming a particular device is a lot easier than confronting, for example, America's cultural fault lines and problems of power and privilege.

Brace yourself for more Glassholes. Last week, Google began making a limited supply of Google Glass available to anyone with $1,500 to spare. To be sure, Glass still affords—or makes possible—a whole range of problematic behaviors. There are concerns about distracted driving; about people (probably men) taking photographs of other people (probably women) without consent; about Glassholes serving as foot soldiers in Google’s data-gobbling army, expanding the corporation’s ongoing assault on what we used to call privacy.

These are real issues but not new ones; rather, they are the newest manifestations of much larger long-standing problems. While Glass may make those problems more visible than they were before, hating Glass (or even Glassholes) won’t make the problems go away.

The backlash against Glass is a result of what I call foregrounding—when a new technology makes some pre-existing aspect of society more visible and, in so doing, is mistaken for having caused the phenomenon in question rather than having brought it to increased attention. Social scientists have long been aware that this happens and have described it with many different names. I like to use foregrounding, however, because it emphasizes that while what we’re seeing may be more salient than it was before, the phenomenon has been there all along.

Contentious or unsettling new technologies like Glass are therefore important signals and can draw our attention toward issues that we as a society need to address. Consider “creepshots”—suggestive photographs taken without the knowledge or consent of the people pictured (who are almost always women). It may well be easier to take a creepshot with Glass than with a smartphone, but Glass doesn’t cause creepshots. Creepshots were a problem before the advent of face-mounted computers, and they will continue to be a problem until we deal with the much bigger problem of why anyone would take one in the first place.

That “why” is an incredibly complicated tangle of issues, and blaming one particular device is a lot easier than confronting, for example, American culture’s deeply entrenched sexism and misogyny. But banning Glass will not stop creepshots, because creeps will take creepshots even if they can’t be Glassholes. Even smashing every camera on earth won’t solve problems that, at their core, are problems of power and privilege.

Similarly, banning Glass will not change the fact that Google—and Facebook, the National Security Agency and a whole range of other actors—are collecting vast troves of information about nearly all of us, nearly all of the time. Moreover, Glass is far from the only product or service to make information about people other than its users available to a corporation. That your privacy is affected by the products other people use may be more “in your face” with Glass, but again: this is already happening, and it will continue to happen until we address the much larger issues that surround data production, access, ownership and use.

As I’ve written elsewhere, we are caught between two conflicting privacy paradigms—but neither paradigm takes into account the fact that our privacy, or lack thereof, is not a product of our individual choices. Until we reconceptualize privacy on both the social and policy levels, the “privacy problem” will persist.

It’s doubtful, of course, that Glass will catch on in any meaningful way; as ever and always, “Google doesn’t get social.” But Glass is neither the first wearable computer or camera out there, nor will it be the last. Samsung is rumored to be working on something like Glass; Apple insiders predict a smartwatch; the Narrative Clip, a wearable “lifelogging” camera, is already in production; and MIT has been working on wearable computing for close to 20 years.

We shouldn’t accept these devices as inevitable, but neither should we assume that by eliminating the devices, we can eliminate the cultural fault lines that the devices foreground. Instead, we need to ask hard questions about what kind of a society we want to live in—and then demand that new technologies move us closer to that goal, not further away from it.

Whitney Erin Boesel is a fellow at the Berkman Center for Internet & Society at Harvard University, a visiting scholar at the MIT Center for Civic Media and a Ph.D. student in sociology at the University of California at Santa Cruz. She’s active on Twitter as @weboesel.
