Proposed Human Rights Would Protect Your Mind From the Likes of Facebook

“It sounds impossible but it’s closer than you may realize,” Facebook’s Regina Dugan recently told audience members at the F8 developer conference. Dugan was referring to the social network’s plans to read users’ thoughts. Just in time to inject some practical considerations into that terrifying scenario, researchers have proposed four new human rights to protect our minds from those who might have the worst intentions.

Remarkable advances are being made every day in technologies that can analyze brain activity, and we're reaching the point where it may be time to clearly define the ethics of peeking into other people's thoughts. Marcello Ienca, a neuroethicist at the University of Basel, and Roberto Andorno, a human rights lawyer at the University of Zurich, have some initial ideas for how we might approach these new human rights. Their proposals were published today in the journal Life Sciences, Society and Policy. Much as with Asimov's Three Laws of Robotics, we'll likely be debating what should and shouldn't be allowed in the expanding fields of tech that make reading and altering brain activity possible.

The four proposed human rights break down like this:

Cognitive Liberty

The right to cognitive liberty would ensure that a person is free "to alter one's mental states with the help of neurotools as well as to refuse to do so." For instance, one could choose to opt in or out of techniques like the military's experimental brain-stimulation program that allegedly enhances job skills, or new methods for improving and recovering memory. If adopted, this right would protect citizens and workers from being forced into such treatments by a government or an employer.

Mental Privacy

This one's fairly self-explanatory: it would make it illegal to read someone's mind without their consent. There's still a lot of work to do before scientists will be able to read anything they want from a person's mind, but it's worth defining protections early. Even in the experimental stages of development, test subjects should be able to define which areas of brain information are open for viewing and how that information can be used later. Of course, when Facebook finally gets its mind-reading program off the ground, you'd have to dig through the massive terms-of-service agreement to find out which areas of your mind belong to the company. Still, it's good to have legal protections in place so that the terms of service are required to tell you what is and isn't off limits.

Mental Integrity

The right to mental integrity would shield people from malicious brain-hacking. Brain-chip implants are getting closer to mainstream reality, and influential entrepreneurs like Elon Musk are jumping into the business. But if our brain chips turn out to be as insecure as our internet-of-things devices, we could be in for a world of pain. In theory, the sky is the limit for malicious brain hacking, since the damage would depend on what functions the implant performs: a hacker could take control of a person's voluntary functions or even brick their brain entirely.

Psychological Continuity

Protecting psychological continuity is similar to ensuring mental integrity, but it may be more urgent. Deep brain stimulation (DBS) is already being tested as a treatment for mental health issues like depression. In one study of patients with Parkinson's disease, the treatment produced dramatic improvements in motor disability, but the subjects also reported feeling that they had lost part of their personality. More than half of the patients said things like "I do not feel like myself anymore," "I feel like a robot," or "I have not found myself again after the surgery" following the treatment. The right to psychological continuity would protect people from treatments or enhancements that could harm their sense of identity.

It may seem a little early to be defining these rules when all of the technologies involved are still at an early, experimental stage. But remember that we didn't set up basic guidelines for net neutrality until corporations were already deeply invested, and that doesn't appear to be working out so well. Better sooner than never.

[The Guardian]

Gizmodo
