
‘Cognitive Liberty’ Is the Human Right We Need to Talk About

Farahany is the Robinson O. Everett Professor of Law and Philosophy at Duke Law School. She is the author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.

Imagine yourself in this scenario, in a future that is closer than many of us realize:

You glance at the program running in the background on your computer screen, mentally move the cursor to the left, and scroll through your brain data over the past few hours. You can see your stress levels rising as the deadline to finish your memo approached, causing your beta brain wave activity to peak right before an alert popped up, telling you to take a brain break.

What’s that unusual change in your brain activity while you’re asleep? It actually started earlier in the month. You compose a text to your doctor in your mind and send it with a mental swipe of your cursor: “Could you take a quick look at my brain data? Anything to worry about?”

Your mind starts to wander to the new colleague on your team, whom you know you shouldn’t be daydreaming about, given the company’s policy on intra-office romance. Then you start to worry that your boss will notice your amorous feelings when she checks your brain activity, so you shift your attention back to the present. You breathe a sigh of relief when the email she sends you later that day congratulates you on your brain metrics from the past quarter, which have earned you another performance bonus.


When you arrive at work the next day, however, a somber cloud has fallen over the office. Along with emails, text messages, and GPS location data, the government has subpoenaed every employee’s brain wave data from the past year. They have compelling evidence that one of your coworkers has committed massive wire fraud; now they are looking for his co-conspirators. Your boss tells you they are looking for synchronized rhythms of brain activity between him and the people he has been working with.

Advances in neuroscience are taking us closer to a reality like this one, where individuals, companies, and governments will be able to hack and track our brains in ways that fundamentally affect our freedom to understand, shape, and define ourselves. It’s not going to happen tomorrow, but we are rapidly heading toward a world of brain transparency, in which scientists, doctors, governments, and companies may peer into our brains and minds at will. Whether we are meditating, doing a math calculation, recalling a phone number, or browsing through our mental thesaurus for just the right word, neurons are firing in our brains, creating minuscule electrical discharges.


When a mental state like relaxation or stress is dominant, hundreds of thousands of neurons are firing in characteristic patterns that can be measured with an electroencephalogram (EEG). Scientists used to have to place electrodes directly on the periosteum—the inner layer of the scalp—to pick up brain waves. The procedure required surgery under anesthesia and carried risks, including fever, infection, and leaking brain fluid. Today, the electrodes can be placed externally, on the forehead or the surface of the scalp. Electromyography (EMG) detects the electrical activity, measured in millivolts, that a nerve generates when it stimulates a muscle. Together, EEG and EMG give us a window onto what our brain is up to at any given moment, including the instructions it is sending to the rest of the body.
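For readers curious how those “characteristic patterns” are pulled out of a raw signal, the sketch below is a simplified, hypothetical illustration (not any particular device’s method): it uses a Fourier transform to estimate how much of a signal’s power falls into the conventional EEG frequency bands, including the beta band mentioned in the opening scenario.

```python
import numpy as np

# Conventional (approximate) EEG frequency bands in Hz.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs):
    """Estimate the share of signal power in each EEG band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)  # frequency of each FFT bin
    power = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    total = power.sum()
    return {name: power[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic "EEG": a dominant 20 Hz (beta-band) oscillation buried in noise.
fs = 256                                  # samples per second
t = np.arange(0, 4, 1.0 / fs)             # 4 seconds of samples
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 20 * t) + 0.3 * rng.standard_normal(len(t))

powers = band_powers(eeg, fs)
dominant = max(powers, key=powers.get)    # "beta" for this synthetic signal
```

Real consumer devices layer far more sophisticated machine learning on top of spectral features like these, but the basic idea—mapping frequency content to mental states—is the same.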

Technological leaps in neuroscience and artificial intelligence have converged to give us consumer neurotech devices—a catch-all term for gadgets that connect human brains to computers, and the ever more sophisticated algorithms that allow those computers to analyze the data they receive. At first, neuroscientists rightly dismissed all these consumer devices as inaccurate and unvalidated, little better than toys. But as both the hardware and software improved, consumer neurotech became more accurate and harder to overlook.

Neurotech companies like SmartCap are already marketing technology to detect drivers who may be drowsy and prevent them from falling asleep at the wheel. A simple, wearable device that measures EEG can alert individuals who have epilepsy to oncoming seizures, while those with quadriplegia can type on computers using just their thoughts. Soon, smart football helmets may diagnose concussions immediately after they occur.

Neurotech devices can also track changes in our brains over time, such as the slowing down of activities in certain brain regions associated with the onset of conditions like Alzheimer’s disease, schizophrenia, and dementia. Not everyone wants to know if one of those conditions is in the cards for them, but those who do may benefit from having time to prepare.

This technology has enormous potential to help people lead better lives by taking charge of their own health and well-being. But the same neuroscience that gives us intimate access to ourselves can also give access to companies, governments, and all kinds of actors who don’t necessarily have our best interests in mind. Soon, we may be trading access to our brain activity to commercial entities for rebates, discounts on insurance, free access to social media accounts, or even as a condition for keeping our jobs.

In fact, it’s already begun. In China, train drivers on the Beijing–Shanghai high-speed rail line, the busiest in the world, must wear EEG devices while they are behind the controls, to ensure that they are focused and alert. Workers in some government-controlled factories are reportedly required to wear EEG sensors to monitor their productivity and their emotional states, and they can be sent home based on what their brains reveal.

Here in the U.S., we are into the second decade of a large-scale, federally funded project called Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) “aimed at revolutionizing our understanding of the human brain.” It supports work by private firms, universities, and government departments, including the U.S. Defense Department’s Defense Advanced Research Projects Agency (DARPA). DARPA’s neurotech research includes the Neural Engineering System Design (NESD) program, which “aims to develop an implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the brain and the digital world.” What’s more, its Next-Generation Nonsurgical Neurotechnology (N3) program seeks to develop high-resolution brain wearables “capable of reading from and writing to multiple points in the brain at once.”

Corporations are getting in on the act, too. Tech titans including Meta, Snap, Microsoft, and Apple are investing heavily in technology that can track and decode our brains and mental experiences. Sterling Crispin, a former Apple neurotechnology prototyping researcher who contributed to the development of the Apple Vision Pro, described how AI is used to analyze that data “to predict if you are feeling curious, mind wandering, scared, paying attention, remembering a past experience, or some other cognitive state.” The data these sensors generate are at risk of being commodified, just as the rest of our personal data has been. That fear is already materializing.

In China, Hangzhou Enter Electronic Technology (Entertech) offers a suite of consumer and enterprise neurotechnology devices with applications for education, psychological health, VR, and the military. Entertech has accumulated millions of raw EEG recordings from individuals engaged in all sorts of activities, from mind-controlled video game racing to working and sleeping, and it has already entered into partnerships with other companies to share that data. In November 2018, SingularityNET announced that it had entered a partnership with Entertech to analyze data gathered from Entertech’s EEG measurement products using its AI platform.

L’Oréal, the French beauty and fragrance world leader, has even launched a strategic partnership with Emotiv, a leading neurotech company, to target fragrance selection to individual brains. It now offers in-store consultations to help consumers find the “perfect scent suited to their emotions”: the customer wears a multi-sensor EEG headset that detects their brain activity, which machine-learning algorithms then decode.

When scholar Shoshana Zuboff coined the concept of surveillance capitalism, our personal data had already been widely commodified and our ability to claw it back largely gone. It’s not too late to protect against that same fate for our brains. We stand at a fork in the road—where rapid advances in AI and neurotechnology could change our lives for the better or lead us to a more dystopian future where even our brains are hacked and tracked.

How do we choose the right path? By recognizing a new human right to cognitive liberty. Anyone who values their ability to have private thoughts and ruminations—an “inner world”—should care about cognitive liberty. But as with privacy protections in general, trying to restrict the flow of information generated by new technologies is pragmatically impossible, and potentially limits the insights we can gain into disability and human suffering. Instead of prohibiting neurotech developments, we should focus on securing rights and remedies against the misuse and manipulation of our brains and mental experiences. If people have the right to decide how and whether their brains are accessed and changed, and more importantly, have legal redress if their brain data is misused (say, to discriminate against them in an employment setting, health care, or education), that will go a long way toward building trust.

Several organizations are already assessing the risks of neurotechnology. The Organization for Economic Cooperation and Development (OECD), the United Nations, the Council of Europe, UNESCO, the IEEE Brain initiative, and the NeuroRights Foundation, to name a few, have held meetings to discuss ethical progress in neurotechnology. A group of corporate executives and scholars has called for a White House task force to “craft a roadmap for the effective governance of applied neuroscience technologies,” similar to the Presidential Commission for the Study of Bioethical Issues, to which I was appointed in 2010.

Other disruptive technologies have developed effective models and best practices that can serve as a roadmap. CRISPR, a technology that allows scientists to make precision edits to our DNA, has prompted international attention and dialogue aimed at developing ethical norms that maximize its benefits for society. In 2015, the National Academies of Sciences, Engineering, and Medicine created a Human Gene-Editing Initiative to guide future conversations. The time is right to establish an international body to provide the same kind of oversight for neurotechnology.

Above all, we must establish the right to cognitive liberty—as an update to liberty in the digital age—to give us self-determination over our brains and mental experiences and protect our mental privacy and freedom of thought. Currently, nothing in the U.S. Constitution, state and federal laws, or international treaties gives us true sovereignty over our own brains. A right to cognitive liberty would empower us to access information about our brains and change them if we choose to do so, while shielding the identifying information, automatic processes, memories, and silent images and utterances in our minds from others. We must act quickly to secure our brains and mental processes—while the choice is still ours to make.

From The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita A. Farahany. Copyright © 2023 by the author. Reprinted by permission of St. Martin’s Publishing Group.



TIME Ideas hosts the world's leading voices, providing commentary on events in news, society, and culture. We welcome outside contributions. Opinions expressed do not necessarily reflect the views of TIME editors.