MIT’s AlterEgo Headset Can Read Your Thoughts Using Internal Verbalisation: Hello everyone, today I am going to share some exciting facts about MIT’s AlterEgo headset, which can read your thoughts using internal verbalisation.
The holy grail of brain-computer interfaces is the ability to read the thoughts of the user, leading to a host of applications that could use the mind to control devices and information directly.
A team of researchers at the Massachusetts Institute of Technology’s Media Lab has taken a step along that path by unveiling a prototype headset.
It can pick up subvocalization: the natural process of internal verbalization, or inner speech, which usually occurs when reading to aid cognition and is characterized by minuscule movements in the larynx and other muscles associated with speech.
Called AlterEgo, the wearable uses electrodes to detect neuromuscular signals in the jaw and face, transcribing words the user does not speak aloud but reads or otherwise subvocalizes.
The prototype AlterEgo headset, developed by a team at MIT’s Media Lab led by Arnav Kapur, uses machine learning to correlate neuromuscular signals with the words it has been trained on in order to identify them. After training, the ML system has an average transcription accuracy of about 92 percent.
Kapur says this number will improve with additional training. Also known as a wearable silent-speech interface, the AlterEgo headset can use as few as four electrodes on either side of the mouth and jaw to consistently pick up the neuromuscular signals needed to distinguish subvocalized words.
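To make the idea concrete, here is a minimal sketch of how signal-to-word classification of this kind can work in principle. The vocabulary, the 8-dimensional feature vectors, and the nearest-centroid approach are all illustrative assumptions; the article does not describe AlterEgo’s actual feature pipeline or model, and the signals here are synthetic.

```python
import numpy as np

# Hypothetical small command vocabulary (a stand-in; AlterEgo's real
# per-task vocabularies contain about 20 words each).
VOCAB = ["up", "down", "select", "left", "right"]

rng = np.random.default_rng(0)

def make_samples(word_idx, n=50):
    """Simulated per-electrode feature vectors for one word: a noisy
    cluster around a word-specific center (purely synthetic data)."""
    center = np.zeros(8)
    center[word_idx] = 3.0  # well-separated synthetic clusters
    return center + rng.normal(scale=0.5, size=(n, 8))

# "Training": collect samples and store the mean feature vector
# (centroid) for each vocabulary word.
X = np.vstack([make_samples(i) for i in range(len(VOCAB))])
y = np.repeat(np.arange(len(VOCAB)), 50)
centroids = np.vstack([X[y == i].mean(axis=0) for i in range(len(VOCAB))])

def predict(sample):
    """Classify a feature vector as the word with the nearest centroid."""
    dists = np.linalg.norm(centroids - sample, axis=1)
    return VOCAB[int(np.argmin(dists))]

# Transcription accuracy on held-out synthetic samples.
test = np.vstack([make_samples(i, n=20) for i in range(len(VOCAB))])
labels = np.repeat(np.arange(len(VOCAB)), 20)
acc = np.mean([predict(s) == VOCAB[l] for s, l in zip(test, labels)])
```

With clusters this cleanly separated, the toy classifier scores near-perfect accuracy; real electromyographic signals are far noisier, which is why per-user calibration and continued training matter for the reported ~92 percent figure.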
The electrodes are combined with a pair of bone-conduction headphones that convey information to the user through vibrations without “interrupting a conversation or otherwise interfering with the user’s auditory experience.”
The team has thus far used the AlterEgo headset to assist with several tasks, such as transmitting an opponent’s chess moves and receiving computer-recommended responses, or providing answers to large addition or multiplication problems.
This makes it a sort of intelligence-augmentation device, unobtrusively providing solutions to computationally complex problems. Other applications, of course, include the ability to control interfaces, with commands such as “up,” “down,” or “select” being picked up.
Current training models are meant for tasks that require identification of limited vocabularies of about 20 words each.
In their usability study of the AlterEgo headset, the team had ten subjects spend about 15 minutes each customizing the arithmetic application to their own neurophysiology, then spend another 90 minutes using it to execute computations.
“The motivation for this was to build an IA device: an intelligence-augmentation device.”
“Our idea was to have a computing platform that is more internal, that melds human and machine in some ways, and that feels like an internal extension of our own cognition.”
“We basically cannot live without our cell phones, our digital devices,” adds Pattie Maes, a professor of media arts and sciences and Kapur’s thesis advisor. “But at the moment, the use of those devices is very disruptive.
If I want to look something up that is relevant to a conversation I am having, I have to find the phone, type in the passcode, open an app, and type in some search keywords, and the whole process requires that I shift my attention entirely from my environment and the people I am with to the phone itself.
So my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the excellent knowledge and services that these devices give us, but do it in a way that lets them remain in the present.”
So, that is MIT’s AlterEgo headset, which can read your thoughts using internal verbalisation.
If any questions persist, please feel free to comment with your viewpoints.