
Four ethical dilemmas raised by the brain science of the future

Getting a fake memory in the 2012 version of Total Recall
Sony Pictures

We're living in the age of neuroscience. New technologies are enabling scientists to peer ever more deeply into the human brain and understand how it works.

But those technologies can also raise some surprising new ethical dilemmas — from questions about brain privacy to inequality issues around cognitive enhancements.

A new report from the Presidential Commission for the Study of Bioethical Issues tries to get out in front of these problems by presenting some of the ethical questions that neuroscience technologies will raise — as well as some recommendations for how to integrate ethics into science.

The report highlighted four questions in particular that could become more important as neuroscience advances:

1) How private are our thoughts?

Neuroimaging technologies such as fMRI, EEG, PET, and CT are powerful because they can observe brain activity as people think and act. They are also unsettling because they can reveal the specific patterns of activity associated with particular thoughts. And that raises all sorts of potential issues.

Several groups, for example, have been exploring the use of fMRI for lie detection. The report specifically brings up an example from the lab of Mark George at the Medical University of South Carolina. In a study published in 2005, the lab built a computer model from fMRI data that was able to predict whether a research subject was telling the truth about stealing something with about 90 percent accuracy.

We already have a precedent for using lie detection technology in our legal system, as imperfect as that technology is. But the report notes that neuroimaging also has the potential to someday be used for crime prevention or to determine criminal intent. That's pretty much the Minority Report scenario, where people get punished for predicted future crimes.

Another example that the Presidential Commission's report didn't bring up: other researchers have had some success recreating an image that someone is seeing by scanning her brain. As this technology improves, will judges start issuing search warrants for brains to pull memories out of them? What happens to the right to remain silent? (Or, perhaps even worse, what if the technology isn't perfect, and they grab a dream instead of a real memory?)

2) How should we plan for diseases like dementia?

Dementia — which is a severe decline in mental ability — currently affects 2 to 6 million Americans and will likely affect more and more people as our society ages.

As new technology improves our ability to predict diseases such as Alzheimer's, many people will receive diagnoses far in advance and will want to plan for their future selves. (This planning might include advance directives, legal documents that spell out preferences for future care if the time comes when you are unable to speak for yourself.)

But that prospect raises its own dilemmas. Consider a patient who decides in advance that he doesn't want invasive treatments if his cancer should return. Then, years later, the patient develops dementia, and the cancer is back. He then changes his mind and decides he wants the treatment, after all.

Which "person" should caregivers follow: the sharper one from the past or the one who exists today? Which mindset takes precedence?

The person today still could have certain rights of autonomy, yet doesn't have the same decision-making capabilities as in the past. How do you balance what seems best for the person with what that person wants, if the two differ?

We're already facing these exact questions with aging members of our communities, yet there's no consensus on what the right answers are. The Presidential Commission's report notes that, as we confront dilemmas like these, our culture's very concept of selfhood may change.

3) Who will benefit from cognitive enhancements?

The future could well see a wide array of cognitive enhancements that make people smarter, from drugs to various kinds of brain-machine technology. And even if you think it's fine to push the brain beyond its normal capabilities, cognitive enhancement still raises ethical questions about health risks and equal access.

Here's a present-day example from the Presidential Commission's report: many students engage in the non-medical use of prescription stimulants like Adderall to try to do better in school. There are possible side effects — not to mention the possibility of getting kicked out of school — but some people genuinely believe it helps them study. One 2013 survey reported that about 10 percent of college undergrads had used such stimulants without a prescription in the past year.

But this behavior is unequally distributed: the users of these drugs are more likely to be white, male, members of fraternal organizations, and attending more selective colleges.

The Presidential Commission's report concludes: "These data could raise concerns about justice and equity, insofar as using stimulants in this way might be viewed as conferring or reinforcing advantage and exacerbating existing educational, economic, and other disparities."

4) Should we use deep-brain stimulation to treat mental disorders?

Deep-brain stimulation is a technique that uses electrodes implanted in the brain to produce an electrical current. The technology is already FDA-approved for treating Parkinson's disease and a movement disorder called dystonia. It's also been tested in people for severe psychiatric illnesses, including obsessive-compulsive disorder and depression that doesn't respond to other forms of treatment.

But invasive brain surgeries like these also have a checkered past — particularly in treating mental health disorders.

Back in the 1940s and 1950s, doctors in the US performed more than 40,000 lobotomies. These procedures, which severed connections in the brain, were oversold as miracle cures even though there was a lack of scientific evidence that they could actually help patients. Doctors were aware of their terrible side effects, such as the blunting of emotions — but these downsides rarely made it into published scientific papers. There were also issues with "dubious consent" from patients who may not have had the ability to fully advocate for themselves.

The Presidential Commission's report points out that researchers using deep-brain stimulation have been paying very close attention to these sorts of ethical questions. So far, researchers have worked with ethicists and patient advocates to create standards that include fully informed consent and the management of expectations (so people don't jump into treatment out of blind hope and desperation).

Further reading: Why are scientists trying to map every neuron in the human brain?