Digital enhancement of the real world through augmented reality (AR) and virtual reality (VR) is becoming a more common sight, with new applications appearing all the time. The latest development comes from Meta, which has just revealed a new wearable device that lets you control AR experiences by tracking your brain activity.
Meta’s system is called Meta 2, and it uses artificial intelligence to analyze brain activity in order to understand what the user is looking at or focusing on. This information can be used to control an AR experience in various ways; for example, the user might be able to double-click their forehead to activate an AR selfie mode or initiate an AR overlay of helpful information about objects they are viewing.
The company has also announced that it will begin selling its first developer kit for Meta 2 next month, with pre-orders available from today at a cost of $949 and up. These kits will also include tutorials, documentation, and software so developers can get started creating apps right away. In addition to this news, Meta has revealed details of several forthcoming products - including its first consumer headset.
What Is Meta 2 And How Does It Work?
Meta 2 is a next-generation augmented reality platform that lets you view and interact with digital content naturally. Using an AR headset and a hand tracking device, the system allows you to see and manipulate holograms that blend seamlessly with the real world.
The system uses a combination of computer vision, artificial intelligence, and sensors to map the environment in real time, recognize what is being looked at, and generate a responsive AR image. It features a binocular, see-through display to provide a high-resolution AR experience, as well as hand tracking technology that can recognize finger movements and gestures.
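To make that pipeline a little more concrete, here is a minimal, purely hypothetical sketch (not Meta's actual code) of a gaze-driven AR loop: sense the scene, work out what the wearer is looking at, and render an overlay for that object. Every class and function name here is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    label: str         # e.g. "apple", "car"
    distance_m: float  # estimated distance from the wearer

def recognize_gazed_object(gaze_direction, scene):
    """Pick what the wearer is focusing on. In this stub the gaze direction is
    ignored and we simply return the nearest object, standing in for a real
    gaze-ray intersection test backed by eye tracking and computer vision."""
    return min(scene, key=lambda obj: obj.distance_m) if scene else None

def render_overlay(obj):
    """Stand-in for drawing a high-resolution overlay on the see-through display."""
    print(f"[AR overlay] {obj.label} ({obj.distance_m:.1f} m away)")

def ar_frame(gaze_direction, scene):
    """One iteration of the hypothetical real-time loop: sense, recognize, render."""
    target = recognize_gazed_object(gaze_direction, scene)
    if target is not None:
        render_overlay(target)

# Example frame: the wearer is looking toward a nearby apple.
ar_frame(gaze_direction=(0.0, 0.0, 1.0),
         scene=[SceneObject("apple", 0.6), SceneObject("car", 8.5)])
```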
The company recently revealed a new version of the system, Meta 2, with an updated design and improved capabilities. This model is lighter and more comfortable, with improved optics and a wider field of view. It also features a replaceable and rechargeable battery, longer battery life, and a magnetic clip for attaching the headset to a desk or other surface.
With a “gentle flick of the thumb”, you can access your messages on the glasses, and a “quick movement” lets you reply with a pre-programmed answer. Another gesture gives you zoom-slider control of the camera, allowing you to capture what you see through the glasses and send it to someone immediately.
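As a rough illustration of how those gestures might map to actions, here is a small, hypothetical dispatcher. The gesture names and action strings are assumptions based only on the behaviour described above, not Meta's real API.

```python
# Hypothetical mapping from detected wrist/finger gestures to glasses actions.
GESTURE_ACTIONS = {
    "thumb_flick": "open_messages",         # "gentle flick of the thumb"
    "quick_movement": "send_preset_reply",  # reply with a pre-programmed answer
    "zoom_slide": "camera_zoom",            # zoom-slider camera control
}

def handle_gesture(gesture: str) -> str:
    """Translate a recognized gesture into a device action (illustrative only)."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is None:
        return "ignored"
    print(f"gesture '{gesture}' -> action '{action}'")
    return action

handle_gesture("thumb_flick")  # open_messages
handle_gesture("zoom_slide")   # camera_zoom
```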
How Can We Control AR With Our Brains?
For most AR applications, users need to manually select content, click controls, and type text. Meta 2 uses artificial intelligence (AI) and computer vision to recognize what is being looked at and determine the appropriate action. For example, if the user is looking at an apple, the apple might become clickable, with options to add it to a shopping list or pull up its nutrition facts.
Or if someone is looking at a car, it might become clickable to show information about the engine, cost, and features. The headset also uses brain activity to control AR. For example, the user might double-click their forehead to initiate an AR selfie mode, or double-click their temples to initiate an AR overlay of helpful information about objects they are viewing.
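One way to picture those gaze-driven contextual menus is a simple lookup from a recognized object to the options it exposes. This is a hedged sketch: the objects and options come straight from the apple and car examples above, and nothing here reflects Meta's actual implementation.

```python
# Hypothetical contextual options keyed by the object the wearer is looking at.
CONTEXT_MENUS = {
    "apple": ["Add to shopping list", "Show nutrition facts"],
    "car":   ["Show engine specs", "Show price", "Show features"],
}

def options_for(gazed_object: str) -> list[str]:
    """Return the clickable options for whatever the user is focusing on."""
    return CONTEXT_MENUS.get(gazed_object, [])

print(options_for("apple"))  # ['Add to shopping list', 'Show nutrition facts']
print(options_for("car"))    # ['Show engine specs', 'Show price', 'Show features']
```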
Using Brain Activity To Control AR
“This prototype shows how we can make these interfaces faster, higher bandwidth, and more natural,” Mark Zuckerberg said during the demonstration. He also emphasized that this work is still at the research stage, mentioning that the company had previously studied electromyography (EMG) as an input method. Even so, the demo itself was “pretty mind-blowing.”
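Since the article mentions electromyography (EMG) as one of the input methods Meta has studied, here is a deliberately simplified, hypothetical sketch of how an EMG “click” could be detected: watch the signal amplitude from a wrist sensor and fire an event when it crosses a threshold. The window size and threshold are made-up values for illustration; real EMG decoding involves filtering, feature extraction, and learned classifiers.

```python
# Very simplified, hypothetical EMG "click" detector: fire when the mean absolute
# amplitude in a short window exceeds a threshold (both values are assumptions).
WINDOW = 8        # samples per window
THRESHOLD = 0.5   # normalized amplitude threshold

def detect_clicks(samples):
    """Yield the start index of each window whose mean |amplitude| crosses THRESHOLD."""
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        window = samples[start:start + WINDOW]
        if sum(abs(s) for s in window) / WINDOW > THRESHOLD:
            yield start

# A quiet stretch followed by a burst of muscle activity (a simulated pinch).
signal = [0.05] * 8 + [0.9, 0.8, 0.85, 0.7, 0.95, 0.8, 0.75, 0.9]
print(list(detect_clicks(signal)))  # [8] -> one "click" detected in the second window
```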
Conclusion
Augmented reality is expected to become a $108 billion industry by 2024, and with new developments like Meta 2, it’s not difficult to see why. A device like this is a huge leap forward for AR technology in general, and we can expect significantly more advanced AR experiences as a result.
The product is certainly not without its flaws: it’s expensive, heavy, and clunky. But it nonetheless represents a significant next step in the development of AR technology, one that is sure to transform the way we interact with digital content. Despite Google Glass’s failure years ago, the concept of AR glasses excites me.
Seeing Meta’s technology in action is cool. Apple has long been rumored to be working on its own VR headset or AR glasses, and I hope it takes some pointers from this Connect and builds something slicker.