Recently, both Google and Samsung presented information about upcoming products: Google unveiled its upcoming smartphone, the Pixel 4, and Samsung showed, among other things, its new S Pen for the Galaxy Note series. It is exciting to see that the two tech giants have chosen to include touchless gesture control in their new devices, as it will give them important feedback on one of the most intuitive interaction solutions. Crunchfish is a market-proven provider of gesture control software, with its solution already installed in tens of millions of devices.
First smartphone with Project Soli
Google Pixel 4 is the first smartphone to include Project Soli, a motion-sensing radar developed by Google. Located at the top of the Pixel 4, the radar senses small motions around the phone and can thereby recognise gestures. Google calls its gesture software Motion Sense; in the Pixel 4, Motion Sense lets users skip songs, snooze alarms and silence phone calls just by waving a hand. Samsung's new S Pen for its Galaxy Note smartphones works essentially like a magic Bluetooth wand. The stylus has a motion sensor that can determine how it is being held and in which direction it is being moved, so the user can switch between functions, such as different camera modes, just by waving the S Pen in the air.
The usage of gesture control will increase globally
The installation of touchless gesture control in Google and Samsung devices will provide important feedback and increase the global usage of gesture control. That feedback will be valuable input for future screen-less devices such as smart AR glasses, where gesture control and voice control are the two intuitive interaction methods. Samsung recently filed a patent application for a pair of smart glasses, and Google recently announced the second generation of Google Glass.
Crunchfish’s gesture control provides the user a full interaction experience
What Google needs a radar for, and Samsung a stylus, Crunchfish achieves with software alone, using the camera sensors already present in mobile and wearable devices. Using advanced image recognition and deep-learning algorithms, our software detects and tracks hand gestures in three dimensions. Crunchfish's gesture control gives the user a full interaction experience that is not limited to simple air swipes. Furthermore, because Google's gesture control relies on radar, legislation means it will not be available in all countries; Crunchfish's gesture control can be used worldwide.
Our latest product, XR Tracking 2.0, offers the unique experience of a mouse-like UI paradigm using just a hand. The position of the hand is tracked with very high precision for responsive UI feedback.
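To illustrate what a mouse-like UI paradigm driven by a tracked hand involves, here is a minimal, hypothetical sketch (not Crunchfish's actual implementation): a normalised hand position is mapped to screen pixels and passed through exponential smoothing, the kind of filtering a responsive hand-driven cursor typically needs to avoid jitter.

```python
class HandCursor:
    """Illustrative hand-to-cursor mapping with exponential smoothing."""

    def __init__(self, screen_w, screen_h, alpha=0.4):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.alpha = alpha      # smoothing factor: higher = more responsive, less smooth
        self.x = None
        self.y = None

    def update(self, norm_x, norm_y):
        """Feed a normalised hand position in [0, 1]; return smoothed pixel coordinates."""
        target_x = norm_x * self.screen_w
        target_y = norm_y * self.screen_h
        if self.x is None:
            # First sample: no history to smooth against, jump straight there.
            self.x, self.y = target_x, target_y
        else:
            # Move only part of the way toward the new sample each frame.
            self.x += self.alpha * (target_x - self.x)
            self.y += self.alpha * (target_y - self.y)
        return round(self.x), round(self.y)

cursor = HandCursor(1920, 1080)
print(cursor.update(0.5, 0.5))   # first sample lands exactly at (960, 540)
print(cursor.update(1.0, 0.5))   # later samples move only part-way: (1344, 540)
```

The trade-off in `alpha` mirrors the balance the text describes: precision of the tracked position versus responsiveness of the on-screen feedback.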
Check out our XR Tracking 2.0 in action in this video:
Want to try our solution for yourself?
To show the potential of our powerful XR Tracking 2.0, we have made a demo application publicly available. It showcases Crunchfish's solution for efficient touchless interaction with most standard AR smart glasses and Android devices. Using pinch, open-hand and closed-hand gestures, interactions such as navigation, item selection, annotation and zooming can be performed. The demo is available on:
- Epson BT3xx
- Lenovo New C200
- RealWear HMT-1
- Vuzix M-series
- Smartphones (mobile AR)*
*Compatible with the majority of smartphones running Android 5.0 or later.
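A gesture vocabulary like the demo's (pinch, open hand, closed hand) is typically wired to UI actions through a small dispatch layer. The sketch below is purely illustrative, with hypothetical gesture and action names; it is not Crunchfish's API.

```python
def dispatch(gesture, state):
    """Map a recognised gesture name to a UI action, tracking whether a pinch is held."""
    actions = {
        "open_hand": "move_cursor",    # open hand steers the cursor
        "closed_hand": "grab_and_pan", # closed hand grabs and pans the view
    }
    if gesture == "pinch":
        # A pinch selects; while held, it could also drive zooming or annotation.
        state["pinching"] = True
        return "select_item"
    state["pinching"] = False
    return actions.get(gesture, "idle")

state = {}
print(dispatch("open_hand", state))   # -> move_cursor
print(dispatch("pinch", state))       # -> select_item
print(dispatch("closed_hand", state)) # -> grab_and_pan
```

Keeping the mapping in one table makes it straightforward to adapt the same gesture set to different devices, from AR glasses to mobile AR.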
Interested in bringing gesture controls to your AR hardware or software? Please contact our sales team:
Henrik Winberg, Sales Director at Crunchfish
+46 (0)702 126 129
Joakim Nydemark, CEO at Crunchfish
+46 (0)706 351 609