Multi-touch is a highly sought-after feature on today's touch-based devices. The larger the screen, the higher the demand, as users look to take advantage of the expanded display real estate with as many gestures as possible. Multi-touch technology began in 1982, when the University of Toronto’s Input Research Group developed the first human-input multi-touch system.
When it comes to Android-based modding and enhancement, there isn’t a more enthusiastic and knowledgeable place on the web than XDA Developers, and they’ve come through once again with another gem.
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques.
Gesture recognition enables humans to interface with a machine (human-machine interface, HMI) and interact naturally without any mechanical devices. Using the concept of gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly. This could potentially make conventional input devices such as mice, keyboards and even touch-screens redundant. In computing, multi-touch refers to a touch-sensing surface’s (trackpad or touchscreen) ability to recognize the presence of two or more points of contact with the surface. This plural-point awareness is often used to implement advanced functionality such as pinch-to-zoom or activating predefined programs.
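As a rough illustration of how pinch-to-zoom falls out of plural-point awareness, the sketch below (an assumption for illustration, not code from Gesture Control or the Android SDK) derives a scale factor from the distance between two contact points:

```java
// Hypothetical sketch: pinch-to-zoom derived from two contact points.
// The scale factor is the ratio of the current finger spread to the initial one.
final class PinchZoom {

    // Euclidean distance between two contact points (x1, y1) and (x2, y2).
    static double distance(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    // Scale factor: > 1 means the fingers moved apart (zoom in),
    // < 1 means they moved together (zoom out / pinch).
    static double scaleFactor(double initialDistance, double currentDistance) {
        return currentDistance / initialDistance;
    }
}
```

In practice a touchscreen driver reports each contact's coordinates on every frame, so recomputing this ratio per frame yields a continuous zoom level.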
Multi-touch gestures can already be found within several custom launchers – .APK files which allow Android users to change the way their device behaves once the menu button is tapped. Gesture Control is another root-level application, which also installs as an .APK file. Once installed on your Honeycomb device, it lets you perform basic tasks with more multi-touch gestures than ever previously possible.
Supported gestures, as per the XDA forums, include:

* Swipe 2 fingers from top downward
* Pinch 3-4 fingers
* Show/Hide status bar: swipe 2 fingers from bottom upward
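The two-finger vertical swipes listed above could be distinguished roughly as sketched below. This is an illustrative assumption, not Gesture Control's actual logic; the threshold value is made up, and y is taken to increase downward, as in Android screen coordinates:

```java
// Hypothetical sketch: classifying a two-finger vertical swipe from the
// averaged y-coordinate of the two fingers at gesture start and end.
final class SwipeClassifier {

    enum Direction { DOWN, UP, NONE }

    // threshold (in pixels) filters out small jitter; its value is an
    // assumption for illustration, not taken from the app.
    static Direction classify(double startY, double endY, double threshold) {
        double dy = endY - startY;
        if (dy > threshold)  return Direction.DOWN; // top toward bottom
        if (dy < -threshold) return Direction.UP;   // bottom toward top
        return Direction.NONE;
    }
}
```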
There are two types of gestures in computing interfaces:

* Offline gestures: gestures that are processed after the user's interaction with the object. An example is the gesture to activate a menu.
* Online gestures: direct-manipulation gestures. They are used to scale or rotate a tangible object.
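An online gesture is processed continuously while the fingers are still down. As an illustrative assumption (not code from any specific library), a two-finger rotate gesture can be tracked by recomputing the angle between the fingers on every frame:

```java
// Hypothetical sketch of an "online" gesture: continuously computing how far
// two fingers have rotated around each other, usable to rotate an object.
final class RotateGesture {

    // Angle, in degrees, of the vector from finger 1 to finger 2.
    static double angleDeg(double x1, double y1, double x2, double y2) {
        return Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
    }

    // Rotation delta between the initial and current finger angles.
    static double rotationDelta(double startAngle, double currentAngle) {
        double d = currentAngle - startAngle;
        // Normalize into (-180, 180] so a small twist never reads as a full turn.
        while (d > 180)   d -= 360;
        while (d <= -180) d += 360;
        return d;
    }
}
```

Because the delta is recomputed on every touch event rather than once at gesture end, the on-screen object can follow the fingers in real time, which is what distinguishes online from offline gestures.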
Gesture Control is free of charge, and your device will need to be rooted in order to install it. Head over to the original thread for help, discussion, and suggestions regarding Gesture Control.