Document Type
Thesis
Date of Award
Summer 8-31-2008
Degree Name
Master of Science in Biomedical Engineering - (M.S.)
Department
Biomedical Engineering
First Advisor
Richard A. Foulds
Second Advisor
Sergei Adamovich
Third Advisor
Max Roman
Abstract
Sign language for the deaf and hearing impaired replaces speech with manually produced signs. Each sign is characterized as a combination of handshape, movement, orientation, location, and facial expression. Of these five sign parameters, this thesis focuses on the classification of two of the main parameters, hand shapes and locations, in continuous signing.
Because hand shapes are transient rather than static, a neural network was used as the classifier for hand shapes. Because locations in sign language are defined by linguistic variables rather than by exact position values, fuzzy logic was used as the classifier for locations. Two models were developed using the neural network and fuzzy logic toolboxes in Matlab, demonstrating the feasibility of classifying hand shapes and locations in continuous signing.
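To illustrate the fuzzy-logic idea of mapping a position to a linguistic location label, the following is a minimal Python sketch, not the thesis's Matlab toolbox code. The location names, the normalized vertical coordinate, and the membership function ranges are assumptions chosen for the example.

```python
# Illustrative sketch: classify a hand's normalized vertical position
# (0 = waist, 1 = top of head) into a linguistic location label using
# triangular fuzzy membership functions, then pick the label with the
# highest membership degree. All labels and ranges are hypothetical.

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic locations with (a, b, c) parameters.
LOCATION_MFS = {
    "waist":    (-0.10, 0.00, 0.35),
    "chest":    (0.20, 0.45, 0.70),
    "chin":     (0.55, 0.75, 0.90),
    "forehead": (0.75, 1.00, 1.10),
}

def classify_location(y):
    """Return the location label with the largest membership degree."""
    degrees = {name: triangular(y, *p) for name, p in LOCATION_MFS.items()}
    return max(degrees, key=degrees.get), degrees

if __name__ == "__main__":
    label, degrees = classify_location(0.68)
    print(label, degrees)   # e.g. "chin" with a partial degree for "chest"
```

A hand position near a boundary receives nonzero membership in two neighboring locations, which is the motivation for using fuzzy sets rather than hard position thresholds.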
Results show that the neural network classified hand shapes accurately at every instant when tested on the training data and reasonably well on the testing data. The proposed model for location classification was also able to classify locations accurately.
Recommended Citation
Karri, Swetha, "Classification of hand held shapes and locations in continuous signing" (2008). Theses. 364.
https://digitalcommons.njit.edu/theses/364