Manjusha Choorakuzil, Unnikrishnan Parameswaran, Michael Buckley
Dept. of Computer Science and Engg.
University at Buffalo, SUNY
Amherst, Buffalo, New York 14226
Email: mchoorak@buffalo.edu, unnikris@buffalo.edu, mikeb@buffalo.edu
Abstract—Sign language is the primary medium of communication for the hearing-impaired and mute members of our society. Like any well-developed sign language, American Sign Language (ASL) expresses a flow of thoughts and emotions as effectively as any spoken language. It involves the motion of the hands and body posture as well as facial expressions. The primary aim of this paper is to recognize dynamic gestures in American Sign Language.
Of late, a great deal of research has focused on the 3D geometric processing of image sequences. In our project, we captured sequences of 3D ASL gestures using the Prime Sense sensor and propose a novel trajectory-based method for classifying ASL gestures using the Axis of Least Inertia (ALI). Feature extraction draws on both local and global features of a gesture: the ALI method is applied to the global features, while local feature extraction uses the distance from each fingertip to the centroid. Integrating the results of local and global recognition improved classification accuracy and system performance. Other applications include using the system as a sign-language tutor, or at cafeterias, ticket counters, and other public places, where systems could accept gesture input to serve the needs of a person with a hearing disability.
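The two feature-extraction steps above can be sketched in code. The following is a minimal illustration, not the paper's implementation: the function names are hypothetical, and it assumes hand points and fingertip positions are given as 2D coordinate arrays. The axis of least inertia of a point set is its principal axis, obtainable from the eigenvector of the covariance matrix with the largest eigenvalue; the local features are simply fingertip-to-centroid distances.

```python
import numpy as np

def axis_of_least_inertia(points):
    """Global feature: centroid and unit direction of the axis of least
    inertia of a 2-D point set. This is the principal axis, i.e. the
    eigenvector of the covariance matrix with the largest eigenvalue."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    direction = eigvecs[:, -1]              # axis of largest spread
    return centroid, direction

def fingertip_distances(fingertips, centroid):
    """Local features: Euclidean distance from each fingertip to the centroid."""
    tips = np.asarray(fingertips, dtype=float)
    return np.linalg.norm(tips - centroid, axis=1)
```

For a gesture trajectory, such features would be computed per frame and the resulting sequences fed to the classifier; the details of that stage follow the paper's method, not this sketch.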
Keywords—Gesture Recognition, American Sign Language, Axis of Least Inertia, Prime Sense sensor, Principal Component Analysis, Perceptual