On Thursday, Facebook unveiled a prototype wristband that uses a combination of artificial intelligence and input from the nervous system to let the wearer interact with virtual reality and augmented reality environments.

The prototype AR controller supports simple gesture-based interaction rather than button clicks.

This enables applications such as firing a virtual bow and arrow; with wrist-based haptics, the device can even approximate the sensation of drawing the bowstring.

Facebook researchers say the wearable augmented reality controller will someday offer more advanced capabilities, such as touching and manipulating virtual interfaces and objects, and grabbing virtual objects at a distance.

Ultimately, the technology could let you type on a virtual keyboard on a tabletop or in your lap at speeds faster than are possible with a physical keyboard.

"When neural interfaces work properly, they feel like magic," says Thomas Reardon, director of motor neuron interfaces at Facebook's Reality Labs.

Facebook is investing massive sums in research and development for the next generation of human-machine interfaces.

The social media giant still sees virtual reality and augmented reality as key areas of growth, and wants to be at the forefront of creating enabling technology for the way people use computing platforms for the next decade and beyond.

The Reality Labs team is also developing an AI-powered, context-aware interface for augmented reality glasses.

And at the Facebook Connect conference last year, the company announced a new line of smart augmented reality glasses, starting with Ray-Ban models due to launch sometime in 2021.

"AR glasses enable us to exist and connect, and how we communicate with this new device will be critical," Andrew Bosworth, president of Realty Labs, said in a tweet.

Facebook says it plans later this year to unveil its soft-robotics work on building comfortable, all-day wearable devices, as well as provide an update on its haptic glove research.

Facebook confirmed that the approach used in the wrist-based augmented reality controller is not mind reading. Instead, the controller relies on electromyography, or EMG, which uses sensors to interpret the electrical motor-nerve signals transmitted from the wrist to the hand, translating them into digital commands that can be used to control the device's functions.
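To make the EMG idea concrete, here is a minimal, purely illustrative sketch of the general principle: electrical activity sampled at the wrist is reduced to an amplitude envelope and mapped to a discrete input event. All names and the simple threshold rule are assumptions for illustration; Facebook's actual pipeline is not public and would involve trained models over many sensor channels.

```python
# Illustrative sketch only: hypothetical names, not Facebook's actual pipeline.
# Idea from the article: sensors read motor-nerve electrical activity at the
# wrist, and software translates it into digital commands.
import math

def rms_envelope(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def to_command(samples, threshold=0.5):
    """Map a window of raw EMG samples to a discrete input event.

    A real system would use a trained classifier over many channels;
    a simple amplitude threshold stands in for that here.
    """
    return "pinch" if rms_envelope(samples) > threshold else "rest"

# Simulated windows: low-amplitude noise (muscle at rest) vs. a strong burst.
rest_window = [0.05, -0.04, 0.06, -0.05, 0.03, -0.02]
pinch_window = [0.9, -1.1, 1.2, -0.8, 1.0, -0.95]

print(to_command(rest_window))   # rest
print(to_command(pinch_window))  # pinch
```

The design point the sketch captures is that no brain signal is read at all: only peripheral nerve activity at the wrist is measured and then interpreted in software.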

"What we are trying to do by using neural interfaces is to allow you to directly control the device, using the output of the peripheral nervous system ... specifically the nerves outside the brain that move the hand muscles and fingers," Facebook said.