The device reads and translates brain signals into computer commands – but also raises questions about what data it will be collecting.
Facebook is working on a neural wristband that uses a person’s motor signals to help them control the company’s upcoming augmented reality glasses.
The tech giant revealed its research into a wristband that uses neural technology to detect the electrical signals sent from the brain to the hand and translate them into commands on an augmented reality interface.
The result is something like a virtual mouse or keyboard: imagine swiping and tapping at a distance, with that gesture carried out as a command on the AR display or screen. The tech could also be applied to gaming.
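To make the virtual-input idea concrete, here is a minimal sketch of how decoded wrist gestures might be dispatched as interface commands. Everything in it, the gesture names, the ARDisplay class and its methods, is a hypothetical illustration rather than Facebook's actual API.

```python
# Illustrative only: a toy dispatcher that maps already-decoded wrist
# gestures to cursor events on a hypothetical AR interface. The gesture
# names and the ARDisplay class are assumptions, not Facebook's API.
from dataclasses import dataclass

@dataclass
class ARDisplay:
    cursor_x: float = 0.0
    cursor_y: float = 0.0

    def move_cursor(self, dx: float, dy: float) -> None:
        self.cursor_x += dx
        self.cursor_y += dy

    def click(self) -> None:
        print(f"click at ({self.cursor_x:.1f}, {self.cursor_y:.1f})")

def handle_gesture(display: ARDisplay, gesture: str) -> None:
    # A pinch acts like a mouse click; swipes move the cursor.
    if gesture == "pinch":
        display.click()
    elif gesture == "swipe_left":
        display.move_cursor(-10, 0)
    elif gesture == "swipe_right":
        display.move_cursor(10, 0)

display = ARDisplay()
for g in ["swipe_right", "swipe_right", "pinch"]:
    handle_gesture(display, g)
```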
The product is being developed by Facebook’s Reality Labs division and builds upon technology from CTRL-Labs, a start-up it acquired in 2019.
“What we’re trying to do with neural interfaces is to let you control the machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate your hand and finger muscles,” said Thomas Reardon, director of neuromotor interfaces at Facebook Reality Labs.
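The signals Reardon describes are read at the wrist through surface electromyography (EMG), which picks up the electrical activity of muscles as motor nerves fire. As a rough sketch of that general technique, and not Facebook's actual pipeline, a decoder might rectify the raw signal, smooth it into an envelope, and threshold the envelope to detect an intended muscle activation; the window size and threshold below are arbitrary assumptions for the toy example.

```python
# Illustrative sketch of the general idea behind wrist-based neural input:
# surface EMG decoding. This is NOT Facebook's pipeline; the window size
# and threshold are arbitrary assumptions for the toy example.
import random

def emg_envelope(samples, window=50):
    """Rectify the signal and smooth it with a moving average."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1): i + 1]) / min(window, i + 1)
        for i in range(len(rectified))
    ]

def detect_activation(envelope, threshold=0.5):
    """Flag samples where muscle activity exceeds a fixed threshold."""
    return [e > threshold for e in envelope]

# Synthetic "EMG": quiet baseline noise, then a burst of muscle activity.
signal = [random.gauss(0, 0.05) for _ in range(200)]
signal += [random.gauss(0, 1.0) for _ in range(100)]

env = emg_envelope(signal)
active = detect_activation(env)
print("activation detected:", any(active))
```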
However, this tech has raised eyebrows among privacy advocates. Ray Walsh, a digital privacy expert at ProPrivacy, said technology like AR wristbands and glasses raises questions over what data will be collected.
“Serving people information via AR glasses will give Facebook a whole new medium with which to track people, and having a neural band that is able to track nerve impulses could allow Facebook to track involuntary impulses,” Walsh said.
A neural wristband would change the way users interact with Facebook’s services and how they could be tracked.
“Up until now, users have had to click like or follow the link from an advert to give Facebook marketing information about themselves. With the rise of AR devices, eye movements and wrist neuron impulses will suddenly become trackable – giving Facebook a record of the things you like even when you’re not actively or even consciously engaging with them,” Walsh added.
According to Facebook, this technology is far from mind reading; rather, translating neural signals into commands can make computing more “human centric” by enabling personalised features designed to respond specifically to the user’s motions.
The company said it will be adding haptic features, which would allow the user to ‘feel’ interactions on the display. Haptics in AR and VR has also been explored by several other companies.
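As a loose illustration of how haptic feedback might pair with this kind of input, a system could play a short vibration pattern on the band whenever a virtual control is activated. The HapticBand class and its pulse method below are invented for the example; Facebook has not published such an API.

```python
# Illustrative only: a toy haptic cue paired with a virtual button press.
# HapticBand and its pulse() method are hypothetical, not a real API.
import time

class HapticBand:
    def pulse(self, duration_ms: int, intensity: float) -> None:
        # A real device would drive an actuator; here we just log and wait.
        print(f"vibrate {duration_ms} ms at intensity {intensity:.1f}")
        time.sleep(duration_ms / 1000)

def press_virtual_button(band: HapticBand, label: str) -> None:
    print(f"button '{label}' pressed")
    band.pulse(duration_ms=30, intensity=0.6)  # brief 'click' sensation

press_virtual_button(HapticBand(), "send")
```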