Mobile Computing


Crunchfish launches new product specially designed for interaction with AR smart glasses

31-Oct-2018
Crunchfish is launching a new generation of its gesture interaction software. This high-precision, one-gesture solution is designed specifically for interaction with any standard AR smart glasses and enables a new, intuitive interaction method. It lets the user control every function in the glasses with a single gesture, much like the tap interaction on a touchscreen.

The new product release from Crunchfish is designed specifically for AR glasses and tackles one of the major challenges in the adoption of AR solutions: interaction, which is closely interlinked with usability.

One gesture controls all functions
A pointer tracks the hand, and a pinch gesture is used to activate the different functions in the AR glasses. Examples of functions to control are zooming, dragging and dropping objects, drawing in the AR view, and starting and stopping a video call.
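
The pattern described above resembles a touchscreen tap driven by a tracked hand: the pointer selects a target and the pinch acts as the press. The sketch below is purely illustrative; the names (HandState, ARView, handle_frame) are hypothetical and are not part of Crunchfish's software.

```python
from dataclasses import dataclass

# Hypothetical illustration of the pointer-plus-pinch pattern described above.
# None of these names come from Crunchfish's software; they only sketch how a
# single pinch gesture can trigger whatever function the pointer is over.

@dataclass
class HandState:
    x: float           # normalised pointer position in the AR view (0..1)
    y: float
    is_pinching: bool  # True while thumb and index finger are closed


class ARView:
    """Toy stand-in for an AR glasses UI with a few selectable targets."""

    def __init__(self):
        self.targets = {
            "zoom_control": (0.10, 0.10),
            "video_call_button": (0.90, 0.10),
        }

    def target_at(self, x, y, radius=0.05):
        """Return the name of the target under the pointer, if any."""
        for name, (tx, ty) in self.targets.items():
            if (tx - x) ** 2 + (ty - y) ** 2 <= radius ** 2:
                return name
        return None


def handle_frame(view, prev, cur):
    """Fire an action on the pinch 'down' edge, analogous to a touchscreen tap."""
    if cur.is_pinching and not prev.is_pinching:
        target = view.target_at(cur.x, cur.y)
        if target is not None:
            print(f"Activate {target}")


# Example: the hand moves over the zoom control and pinches.
view = ARView()
handle_frame(view, HandState(0.5, 0.5, False), HandState(0.11, 0.09, True))
```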

High similarity to today's touchscreen interaction
The new solution is intuitive and simple because it mirrors touchscreen interaction and relies on a single main gesture for all interaction. The pinch also provides important tactile feedback with every gesture.

Focus on user experience in AR is key to success
So far, AR projects have focused on the technology that delivers the AR experience rather than on methods that let users engage more effectively with the content. Addressing the user experience is crucial to building confidence in AR and avoiding project failure. The industry organisation AREA lists user experience as one of the top five challenges in the roll-out of AR solutions.

Neural networks and AI are important parts of the solution
The new product release was made possible by significant technological advances by Crunchfish's development team in the field of neural networks and AI. In the new product, the software and algorithms are substantially faster and more accurate at detecting hand movements and finger positions, which opens up a number of new use cases.
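
The release does not detail Crunchfish's models, so the following is only a rough sketch of how a neural-network hand-landmark detector can be turned into a pinch trigger. It uses the open-source MediaPipe Hands model as a stand-in (landmark 4 is the thumb tip, landmark 8 is the index fingertip); the threshold value is an assumption for illustration.

```python
import cv2
import mediapipe as mp

# Rough sketch only: MediaPipe Hands is used here as a stand-in hand-landmark
# detector; it is not Crunchfish's technology. A pinch is approximated as a
# small distance between the thumb tip (landmark 4) and index tip (landmark 8).

PINCH_THRESHOLD = 0.05  # normalised image coordinates; tune per setup


def is_pinching(hand_landmarks, threshold=PINCH_THRESHOLD):
    thumb = hand_landmarks.landmark[4]
    index = hand_landmarks.landmark[8]
    dist = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
    return dist < threshold


cap = cv2.VideoCapture(0)  # any camera feed, e.g. a glasses-mounted camera
with mp.solutions.hands.Hands(max_num_hands=1) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks and is_pinching(results.multi_hand_landmarks[0]):
            print("pinch detected")  # would trigger the currently selected AR function
cap.release()
```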

The first release to customers will be available during November 2018.

See the demo film.