Project Definition:
This project aims to implement real-time gesture recognition. The primary goal is to create a system that can identify human-generated gestures and use this information for device control.
The user performs a gesture in front of a camera linked to the computer. The image of the gesture is then processed to identify which gesture the user made. Once the gesture is identified, the control action assigned to it is actuated.
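The capture, recognize, actuate loop described above could be sketched roughly as follows. Everything here is an illustrative assumption: `classify_gesture`, the gesture names, and the action table are stand-ins, not the project's actual components.

```python
def classify_gesture(frame):
    """Placeholder recognizer: the real system would analyze the camera
    frame; here we just read a label planted in a dict for testing."""
    return frame.get("label", "none")

# Each recognized gesture is bound to a device-control action.
ACTIONS = {
    "swipe_left":  lambda: "previous",
    "swipe_right": lambda: "next",
    "open_palm":   lambda: "stop",
}

def handle_frame(frame):
    """One pass of the loop: identify the gesture, then actuate the
    control action assigned to it (None if unrecognized)."""
    action = ACTIONS.get(classify_gesture(frame))
    return action() if action else None

print(handle_frame({"label": "open_palm"}))  # prints: stop
```

In a real deployment the frames would come from the camera and the lambdas would drive the device, but the dispatch-table structure stays the same.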
Scope:
The system recognizes hand gestures in software, and these gestures are used to control both software applications and hardware devices.
Objective:
• Provide a more natural human-computer interface.
• Give physically challenged users a better way to interact with computers.
• Interface the user directly with software.
• Let the user give input without a keyboard or mouse, i.e. in the form of different body gestures.
• Find the directional gesture vector.
• Use both the direction and the count of the gesture for hardware control.
E.g. advanced hardware control, robot control.
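One plausible reading of the last two objectives is sketched below: the directional gesture vector is the motion of the hand centroid between frames, quantized to a direction and combined with a finger count to select a hardware command. The function names and command format are hypothetical, not taken from the project.

```python
def direction_of(p0, p1):
    """Quantize the motion vector from hand centroid p0 to p1 into one of
    four directional gestures (image coordinates: x right, y down)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def command(p0, p1, finger_count):
    """Direction and count together select the hardware command."""
    return f"{direction_of(p0, p1)}:{finger_count}"

print(command((100, 200), (180, 210), 2))  # prints: right:2
```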
Relevant theory:
We propose a fast algorithm for automatically recognizing a limited set
of gestures from hand images for a robot control application. Hand gesture
recognition is a challenging problem in its general form. The algorithm is
invariant to translation, rotation, and scale of the hand.
We demonstrate the effectiveness of the technique on
real imagery. Vision-based automatic hand gesture recognition has been a very
active research topic in recent years, with motivating applications such as human-computer interaction (HCI), robot control, and sign language interpretation. The general problem is quite challenging due to a number of issues, including the complicated nature of static and dynamic hand gestures, complex backgrounds, and occlusions. Attacking the problem in full generality demands elaborate algorithms and intensive computational resources. Our motivation for this work is a robot navigation problem, in which we are interested in controlling a robot by hand-pose signs given by a human. Due to real-time operational requirements, we need a computationally efficient algorithm.
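As a rough illustration of the invariances claimed above (not the actual algorithm of this work), a hand contour can be normalized so that translation, scale, and rotation are factored out: subtract the centroid, divide by the Frobenius norm, and rotate into the principal axes. Note the principal-axis alignment is only invariant up to a 180-degree/reflection ambiguity.

```python
import numpy as np

def normalize_contour(points):
    """Normalize an (N, 2) point set: translation removed by centroid
    subtraction, scale by norm division, rotation by principal-axis
    alignment via SVD (up to an axis-sign ambiguity)."""
    pts = np.asarray(points, dtype=float)
    pts -= pts.mean(axis=0)            # translation invariance
    pts /= np.linalg.norm(pts) or 1.0  # scale invariance
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    return pts @ vt.T                  # express in principal axes
```

Two contours that differ only by translation and uniform scale map to the same normalized point set, which is what makes simple template matching feasible afterward.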
Early approaches to hand gesture recognition in a robot control context involved markers on the fingertips. An associated algorithm detects the presence and color of the markers, through which one can identify which fingers are active in the gesture.
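A minimal sketch of such marker detection by color thresholding, assuming RGB images stored as NumPy arrays; the function name, ranges, and pixel threshold are illustrative, not drawn from any specific marker-based system.

```python
import numpy as np

def marker_present(img, lo, hi, min_pixels=20):
    """Return True if enough pixels fall inside the inclusive RGB range
    [lo, hi]; a crude stand-in for fingertip-marker color detection."""
    img = np.asarray(img)
    mask = np.all((img >= np.asarray(lo)) & (img <= np.asarray(hi)), axis=-1)
    return int(mask.sum()) >= min_pixels
```

Real systems typically threshold in HSV rather than RGB to tolerate lighting changes, but the counting logic is the same.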
Device Control Using Hand-Gesture Recognition