First ever mental images extracted from human brain using AI have been shown

  • Researchers in Japan have developed a “brain decoding” technology
  • It leverages artificial intelligence to translate human brain activity into mental images
  • The approach produced vivid depictions of objects and landscapes

Published on Dec 20, 2023 at 5:11 PM (UTC+4)
by Adam Gray

Last updated on Dec 21, 2023 at 4:47 PM (UTC+4)
Edited by Amelia Jean Hershman-Jones

There’s a famous saying: “a penny for your thoughts”.

It’s something you say when you want to know what another person is thinking, usually because they’ve been quiet for a while.

Well, you no longer have to part with your hard-earned cash, as Japanese researchers have developed a “brain decoding technology”.

The tech, which leverages artificial intelligence (AI), translates human brain activity into mental images.

And now the first ever mental images extracted from a human brain have been shown.

The team, made up of researchers from the National Institutes for Quantum Science and Technology (QST) and Osaka University, has translated human brain activity into mental images of objects and landscapes.

The approach has produced vivid depictions, such as a distinct leopard with discernible features like ears, mouth, and spots, and objects such as an airplane with red wing lights.

While past research has managed to recreate images people have actually seen by analyzing their brain activity, making purely imagined mental images visible to others has proved far harder.

Only a few studies have successfully shown mental images, and these images were usually limited to certain categories like human faces, letters, or shapes.

“Therefore, visualizing mental imagery for arbitrary natural images stands as a significant milestone,” the study’s researchers said.

So, how did they manage to crack it?

The researchers exposed participants to around 1,200 images and then meticulously analyzed and quantified the correlation between their brain signals and the visual stimuli using functional magnetic resonance imaging (fMRI).

The mapping was then used to train a generative AI to decipher and replicate the mental imagery derived from brain activity.
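The two steps above — learn a mapping from brain responses to image features, then use decoded features to drive a generative model — can be sketched with toy data. Everything in this example (the array shapes, the linear readout assumption, and the use of ridge regression as the mapping) is an illustrative assumption for the sketch, not the study’s actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the real experiment: 1,200 viewed images (as in the
# study), each represented by a feature vector, and a simulated fMRI
# response that is a noisy linear readout of those features.
n_images, n_voxels, n_features = 1200, 500, 64
true_map = rng.normal(size=(n_features, n_voxels))          # hypothetical brain encoding
image_features = rng.normal(size=(n_images, n_features))    # stand-in for vision-model features
brain_activity = image_features @ true_map + 0.1 * rng.normal(size=(n_images, n_voxels))

# Step 1: learn the mapping from brain activity back to image features.
# Ridge regression is a common choice in fMRI decoding work, used here
# purely as an illustrative assumption.
lam = 1.0
A = brain_activity
W = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ image_features)

# Step 2: decode features from held-out brain activity. In the real
# pipeline, decoded features like these would condition a generative
# image model to render the mental image.
test_features = rng.normal(size=(1, n_features))
test_activity = test_features @ true_map
decoded = test_activity @ W

correlation = np.corrcoef(decoded.ravel(), test_features.ravel())[0, 1]
print(f"decoded-feature correlation: {correlation:.2f}")
```

With this much clean training data the decoded features correlate strongly with the originals; the hard part of the real study is that actual fMRI signals are far noisier and the mapping is nowhere near this simple.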

But what are the implications?

According to the researchers, this “brain decoding” could have potential applications in medicine and welfare.

Essentially, it could be used to help create new communication devices, while also allowing scientists to explore and understand how hallucinations and dreams work in the brain.

The study was published in the scientific journal, Neural Networks.

# Tags - Lifestyle, Tech
