Can AI read your mind? Yes, it can.
Researchers have invented an AI decoder that translates brain activity into language. Ain't science grand?
Agents K and J, saving the planet once again. Source: Queens Chronicle.
Virtually everyone I know has had this experience: you have a conversation about some completely random topic, and shortly thereafter an ad for it shows up online.
The actual topic doesn't matter; it could be gooseberries, or the 1964 World's Fair, or Sammy Davis Jr.'s glass eye. The next day an ad for gooseberry pie shows up on your Instagram feed. Netflix recommends you rewatch "Men in Black." Rat Pack videos start appearing in your YouTube recommendations.
And you think, holy shit — my computer/phone is reading my mind. Or, at the very least, Alexa or Siri has been a real Chatty Cathy in need of a stern talking-to. [1]
At this moment in time, I can say with some confidence that your computer is not reading your mind. [2] Yet. But that day is coming, and it's closer than you might think.
AI knows what you're thinking
Earlier this month, researchers at the University of Texas at Austin published a paper detailing their success at using AI to read individuals' private thoughts and translate them into text.
I'm just gonna pause here for a moment and let that sink in.
Last summer, three plucky volunteers each spent a total of about 16 hours inside a functional magnetic resonance imaging (fMRI) machine, listening to podcasts and watching silent videos while it recorded which parts of their brains lit up (i.e., required more blood) when certain words or phrases were said.
The researchers then used a large language model (a cousin of the one behind ChatGPT) to map the recorded brain patterns to the spoken words.
I am pretty sure at least one of these brains is thinking about pizza. Source: The-Scientist.
Then the researchers played the test subjects new recordings, and asked the AI to predict what they were hearing, based entirely on their brain activity. It wasn't an exact match, but it was close enough to make you want to hide under your covers for the next 20 years.
Per the New York Times report:
Almost every word was out of place in the decoded script, but the meaning of the passage was regularly preserved. Essentially, the decoders were paraphrasing.
Original transcript: “I got up from the air mattress and pressed my face against the glass of the bedroom window expecting to see eyes staring back at me but instead only finding darkness.”
Decoded from brain activity: “I just continued to walk up to the window and open the glass I stood on my toes and peered out I didn’t see anything and looked up again I saw nothing.”
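(For the nerds in the audience: the team's actual pipeline is far more sophisticated, but the core trick can be sketched in a few lines. You fit a model that predicts brain responses from language, then at decode time you ask, "which candidate sentence would have produced the activity we just recorded?" Everything below is invented for illustration: embed() is a stand-in for a real language-model encoder, the "fMRI" data is random numbers, and the sizes are arbitrary.)

```python
# Toy sketch of the decoding idea (not the researchers' code): learn a linear
# map from text embeddings to simulated voxel responses, then pick whichever
# candidate sentence's *predicted* brain response best matches what we observe.
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS, EMB_DIM = 200, 64

def embed(text: str) -> np.ndarray:
    """Hypothetical stand-in for a language-model sentence embedding."""
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).standard_normal(EMB_DIM)

# "Training": sentences the volunteer heard, paired with recorded voxel activity.
train_sentences = [f"training sentence {i}" for i in range(500)]
X = np.stack([embed(s) for s in train_sentences])            # (500, EMB_DIM)
true_W = rng.standard_normal((EMB_DIM, N_VOXELS))            # pretend physiology
Y = X @ true_W + 0.1 * rng.standard_normal((500, N_VOXELS))  # simulated fMRI

# Ridge regression: W maps an embedding to a predicted voxel response.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(EMB_DIM), X.T @ Y)

def decode(observed: np.ndarray, candidates: list[str]) -> str:
    """Return the candidate whose predicted response best matches the scan."""
    def score(sentence: str) -> float:
        predicted = embed(sentence) @ W
        return np.corrcoef(predicted, observed)[0, 1]
    return max(candidates, key=score)

heard = "training sentence 42"                # what the volunteer actually heard
observed = embed(heard) @ true_W              # pretend this came off the scanner
candidates = [heard, "something about gooseberries", "men in black reruns"]
print(decode(observed, candidates))           # recovers the heard sentence
```

As I understand the real system, the candidate sentences come from the language model itself, which proposes plausible continuations that are then scored against the scan. That helps explain why the output reads like a paraphrase rather than a transcript.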
Of course, there are a few caveats. Performing this magic trick required a $500K+ apparatus the size of an industrial washing machine. Right now, the decoder is specific to each individual: you can't use Tom's decoder to read the mind of Harry or Dick. And you can fool the machine by thinking about something else instead (say, Sammy Davis Jr. eating a gooseberry pie on top of the Queens Space Towers).
But that's today. A couple of decades ago, speech-to-text technology required powerful processors, perfect conditions, and hours of training for each individual voice. Even then it was at best 95 percent accurate, which sounds fine until you realize a 5 percent word error rate works out to a couple dozen mistakes on every single-spaced page.
Now we can talk to virtually any device, in any environment, without any prior training, and we get a little pissy if the machine doesn't immediately understand us. Mind-reading AI technology is likely to follow a similar curve.
Imagine an AI brain decoder/lie detector as an app on your phone or an Alexa "skill." Is that person really telling me the truth? Let's find out, shall we? [3]
This is your brain on Elon Musk
Brain-computer interfaces are a hot topic of research, and rudimentary examples have existed for some time. Usually they involve a skullcap studded with electrodes that pick up the brain's faint electrical signals through the scalp [4] and let you control simple binary devices by shifting your brain state from, say, Alpha to Theta. [5]
A Clockwork Orange… you glad you’re not this guy? Source: Far Out Magazine.
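(How does a skullcap turn brain waves into a switch? Here's a back-of-the-napkin sketch, my own guess at a toy implementation rather than any vendor's code: estimate how much alpha versus theta power is in a short EEG window and trigger the device when the relaxed alpha state wins. The function names and the 256 Hz sample rate are made up for illustration; the beer tap in footnote [4] was presumably wired to something in this spirit.)

```python
# Minimal sketch of an EEG band-power "switch": compare alpha (8-12 Hz) and
# theta (4-8 Hz) power in a short window and flip a binary output accordingly.
import numpy as np

FS = 256  # assumed sample rate, in Hz

def band_power(signal: np.ndarray, lo: float, hi: float) -> float:
    """Average spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return float(power[mask].mean())

def dispense_beer(window: np.ndarray) -> bool:
    """Pour when relaxed alpha activity beats drowsy theta activity."""
    return band_power(window, 8, 12) > band_power(window, 4, 8)

# Fake one second of "EEG": a strong 10 Hz alpha rhythm plus noise.
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(1).standard_normal(FS)
print(dispense_beer(eeg))  # True -- the relaxed brain gets the beer
```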
But there are also people who are keen on inserting devices directly into your brain that let you communicate with, and control, your computer. One of those people is Elon Musk.
A few years ago he launched a company called Neuralink to develop implantable brain-computer interfaces, and just this week it won FDA approval to begin testing them in humans.
I'm going to pause here again, mostly to keep from throwing up.
That is a bad idea for at least 4,267 reasons, but here's just one:
Neuralink employees told Reuters last year that the company was rushing and botching surgeries on monkeys, pigs and sheep, resulting in more animal deaths than necessary, as Musk pressured staff to receive FDA approval. The animal experiments produced data intended to support the company's application for human trials, the sources said.
In one instance in 2021, the company implanted 25 out of 60 pigs with the wrong-sized devices. All the pigs were subsequently killed — an error that employees said could have been easily avoided with more preparation.
That anyone would willingly allow a Bond-villain wannabe to insert a chip inside their brain is beyond my comprehension. Then again, Musk has 140 million slavish followers on Twitter, who probably don't have all that much to lose.
Would you let a machine read your mind? Post your thoughts below, and feel free to share this post with any friends and relatives who still have 2+ brain cells to rub together.
[1] I've had the same experience, and I cannot explain it either. My best guess is that the algorithms that predict who we are and what we might be interested in at any given moment have gotten really, really good.
[2] I cannot say the same about Alexa or Siri. Of course, if Apple or Amazon really were secretly eavesdropping on our conversations and using them to serve ads, the FTC, FCC, DOJ, and nearly every other three-letter agency would be conducting a thorough legal colonoscopy of Tim Cook and Jeff Bezos. One would hope, at least.
[3] Thus striking terror in the heart of every husband who's ever been asked, "Does this dress make me look fat?"
[4] Wearing one of these skullcaps at a trade show I was once able to tell a machine to dispense a beer just by using my mind. I have since been unable to repeat this feat without the skullcap, though lord knows I've tried.
[5] There are five commonly cited brain-wave bands, each operating at a different frequency: Gamma (30+ Hz, concentrating), Beta (13–30 Hz, distracted), Alpha (8–12 Hz, relaxed), Theta (4–8 Hz, falling asleep while trying to meditate), and Delta (0.5–4 Hz, sleeping). There is also a newly discovered sixth state, Zeta, which occurs when your brain is consumed by images of Mrs. Michael Douglas.
I’ll show myself out now.