It’s not just online activities that need protection – our brains are next

It won’t surprise anyone who knows I’m in the IT business to discover I’m concerned about cybersecurity. It’s a constant battle to educate people and help them stay a step ahead of those who are targeting their activities, data and wealth. Globally, we’ve rushed to adopt new technologies, and we’ve enjoyed the benefits while often failing to realise the potential consequences of a data breach. Every week there’s a story in the news and an apology from a big firm that’s messed up and allowed its systems to be hacked. Sometimes, it’s a case of negligence. Often, it’s a case of ignorance – if you don’t treat data as valuable, why would you bother to protect it?

Let me ask you a slightly different question: how do you feel about mind-reading technology? Before we get into a discussion of whether such technology is viable – and it is – let’s stick with the basics. Instead of thinking about data security, think about whether you’d want to protect your thoughts, your ideas, your memories, your political views and your secret desires. I’m guessing that while some of you may say you have nothing to hide, you’d also consider your brain’s activity to be yours and yours alone. You would want to be able to protect its privacy.

Technology is advancing at speeds we could not have comprehended even a few years ago. I love tech, so I’m not complaining, but we mustn’t forget that every advance could have applications that aren’t quite what we expect. Consider wearable technology. If your watch monitors your heart rate and tracks your steps, you could use that information to improve your health. If you share that data, healthcare providers could use it to scale their activities or promote healthier lifestyles. But such data could also become a stick, not a carrot. Imagine your employer using it to ensure you’re not slacking on the job.

That’s not science fiction. It’s happening now, with the aim of improving productivity and efficiency. There are multiple applications of wearable tech already in use: exoskeletons that help workers avoid fatigue, goggles that overlay task instructions in an operative’s field of vision, headphones that translate speech and convert it straight to data. There is, however, one area I find more than a little disturbing, and that’s the rise of wearable EEG devices.

For just a couple of hundred pounds, you can buy headsets that monitor your brain activity. They are already used on China’s railways to detect lapses in concentration amongst train drivers. They can let gamers control games with the power of their thoughts. They can allow those with physical impairments to communicate by converting their brain activity into signals a computer can interpret.

There are obvious benefits, but the brain has until now always been our last refuge. One day, probably in the not-too-distant future, scientists will become far more sophisticated in their ability to understand what we are thinking. Already they can interpret our brain waves when we think of simple things, including numbers, shapes and basic images. The race to understand what our grey matter is doing is on, and it’s supported by the ever-advancing capabilities of artificial intelligence.

This is where I believe we have an ethical issue. Are our thoughts sacrosanct? Should they be? We stumbled into a cybersecurity crisis because we didn’t value data highly enough, but we’re embracing wearable tech without examining very similar issues. In the rush to understand the secrets of the brain, we may lose the last place we can enjoy real privacy.