ManoMotion is a company that specializes in hand-tracking and gesture-recognition technology. It has developed a software development kit (SDK) that enables developers to integrate hand-tracking and gesture-recognition capabilities into their applications.
The ManoMotion SDK is designed to work with various platforms, including mobile phones, augmented reality (AR) and virtual reality (VR) headsets, and other interactive systems. Using this SDK, we can create applications that recognize and interpret hand movements, allowing users to interact with digital content in a more intuitive manner.
The features in the ManoMotion SDK can primarily be divided into two broad categories:
Hand-tracking
Gesture recognition
The following table shows features categorized into these two categories:
| Hand-Tracking | Gesture Recognition |
| --- | --- |
| Skeleton tracking | ManoClass |
| Hand segmentation | Continuous gestures |
| Try-on features | Trigger gestures |
Let’s look at these features in more detail:
Skeleton tracking: This feature provides the joint positions of the hand skeleton along with confidence information for the detection. The SDK can report each joint's position in both 2D and 3D and can track both hands simultaneously.
Hand segmentation: The SDK can extract and place the hand texture in 3D. This is done using depth estimation, which allows developers to create experiences in which the hands can occlude virtual objects.
Try-on features: This feature enables AR developers to create projects to provide try-on solutions for clothes or accessories.
ManoClass: This feature represents the raw, per-frame detection of the hand. For each frame, the SDK classifies the current hand pose and returns the corresponding class.
Continuous gestures: This feature detects gestures that are held over time. A continuous gesture is reported only if the user has maintained the same pose for a set number of consecutive frames.
Trigger gestures: This feature fires an event when a specific gesture is performed.
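To make the relationship between per-frame classification and continuous gestures concrete, here is a minimal sketch of the frame-holding idea described above. The class names and gesture constants are hypothetical, invented for illustration; the real SDK is used from Unity/C# and defines its own classes and values.

```python
# Hypothetical per-frame hand classes (stand-ins for ManoClass-style results).
NO_HAND, OPEN_HAND, CLOSED_HAND = range(3)

class ContinuousGestureDetector:
    """Reports a continuous gesture only after the same raw per-frame
    class has been held for a minimum number of consecutive frames."""

    def __init__(self, hold_frames=10):
        self.hold_frames = hold_frames
        self.current = NO_HAND
        self.count = 0

    def update(self, frame_class):
        # Reset the counter whenever the raw per-frame class changes.
        if frame_class != self.current:
            self.current = frame_class
            self.count = 1
        else:
            self.count += 1
        # The continuous gesture fires only once the pose is stable.
        if self.count >= self.hold_frames and self.current != NO_HAND:
            return self.current
        return None

detector = ContinuousGestureDetector(hold_frames=5)
stream = [OPEN_HAND] * 4 + [CLOSED_HAND] * 6   # simulated per-frame classes
results = [detector.update(c) for c in stream]
# OPEN_HAND never reaches 5 consecutive frames, so it is never reported;
# CLOSED_HAND is reported from the 5th consecutive frame onward.
```

A trigger gesture works the other way around: instead of requiring a held pose, it fires a single event at the moment a specific gesture is detected.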
Now, let’s briefly look at how to incorporate this SDK into our project.
The following image shows the general steps that we need to follow to use the ManoMotion SDK:
Now, let’s look at these steps in detail:
Registration and setup: To use this SDK, we first have to register as a user on the ManoMotion website. Once registered, we can download and use the SDK.
Integration in application: We add the sample code and dependencies to the required files to integrate the SDK. We then initialize the SDK, setting the required parameters according to the features we use.
Hand-tracking and gesture recognition: We implement the features discussed earlier according to our project’s use case. We use the provided APIs and implement event handlers to respond to specific hand gestures or movements.
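The steps above follow a common initialize-then-handle pattern: set up the SDK with the features we need, register event handlers, and let the SDK invoke them as gestures are detected. The sketch below illustrates that pattern only; every name in it (`GestureSDK`, `on_gesture`, the `"grab"` gesture) is hypothetical and not part of the actual ManoMotion API.

```python
class GestureSDK:
    """Illustrative stand-in for a hand-tracking SDK's event interface."""

    def __init__(self, license_key, features):
        self.features = features   # e.g. ["skeleton", "trigger_gestures"]
        self.handlers = {}         # gesture name -> callback

    def on_gesture(self, gesture, callback):
        # Register an event handler for a specific trigger gesture.
        self.handlers[gesture] = callback

    def process_frame(self, detected_gesture):
        # In a real app this would run once per camera frame; here we
        # pass the detection result in directly for illustration.
        handler = self.handlers.get(detected_gesture)
        if handler:
            handler()

events = []
sdk = GestureSDK(license_key="YOUR-KEY", features=["trigger_gestures"])
sdk.on_gesture("grab", lambda: events.append("object grabbed"))

sdk.process_frame("grab")     # fires the registered handler
sdk.process_frame("release")  # no handler registered; nothing happens
```

Keeping gesture handling behind event callbacks like this lets application logic react to gestures without touching the per-frame tracking loop.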
Finally, let’s see the various use cases of this SDK.
The following image shows a few use cases of the ManoMotion SDK:
Let’s briefly look at these use cases:
Gaming: This SDK can be used to implement hand tracking and gestures to add interactivity in AR/VR games.
AR/VR applications: It can be used to control and manipulate virtual objects in AR/VR applications, as well as to power try-on experiences on fashion and accessories e-commerce sites.
Healthcare: It can be used to implement touchless interfaces with natural hand interactions for navigation, object manipulation, and interaction with virtual objects. It can also be used to track hand movements in surgery simulations to train new doctors.
Education and training: It can be used to develop interactive educational applications where the user can manipulate virtual objects using hand gestures.
Presentation and design: It can be used to create interactive presentations using various hand gestures. It can also be used to create interactive architectural designs where users can navigate and manipulate the 3D models using hand gestures.
The ManoMotion SDK is a valuable tool for developers who want to incorporate hand tracking and gesture recognition into their applications without having to train their own computer vision models.