Date of Award

Fall 2006

Document Type

Thesis

Degree Name

Master of Science in Biomedical Engineering - (M.S.)

Department

Biomedical Engineering

First Advisor

Richard A. Foulds

Second Advisor

Sergei Adamovich

Third Advisor

Bruno A. Mantilla

Abstract

Signed languages develop among deaf populations and employ manual rather than voiced communication. Individual signs in American Sign Language (ASL) are classified by Stokoe's attributes: handshape, hand location, movement, orientation, and facial expression. Signed and oral languages are not mutually intelligible, and many deaf individuals live in linguistic isolation. This research addresses computer translation between signing and speech by investigating sign duration in sentence context versus in isolation and by identifying kinematic sign markers. To date, there has been little study of the kinematics of continuous signing, and it was previously unknown whether kinematic markers existed.

Kinematic data were collected from a proficient signer using electromagnetic Flock of Birds® sensors (position/orientation of both wrists) and CyberGloves® (18 joint angles per hand). Data were collected for each sign both in isolation and in sentences.

Mean sign duration decreased in sentence context due to coarticulation. There was evidence of finger joint and wrist velocity coordination, synchrony, and hand preshaping. Angular velocity maxima and minima differentiated between handshapes. Minima in the wrists' tangential velocity signified Stokoe locations, and maxima indicated movement (sign midpoints or transition midpoints); these extrema can serve as anchors in the segmentation process. The segmented locations and movements can be combined with handshape and wrist orientation to identify likely signs from the kinematic database developed at NJIT.
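The segmentation idea described above can be illustrated with a minimal sketch. This is not the thesis's actual pipeline; it assumes wrist position samples as an (N, 3) array at a known sampling rate, computes tangential speed by finite differences, and takes strict local minima as candidate Stokoe locations and strict local maxima as candidate movement midpoints. The function names and the synthetic trajectory are hypothetical.

```python
import numpy as np

def tangential_speed(positions, fs):
    """Speed |dp/dt| from an (N, 3) array of position samples at rate fs (Hz)."""
    velocity = np.gradient(positions, 1.0 / fs, axis=0)  # finite-difference velocity
    return np.linalg.norm(velocity, axis=1)

def local_extrema(speed):
    """Indices of strict local minima (candidate sign locations) and
    strict local maxima (candidate movement midpoints) of a 1-D speed signal."""
    interior = np.arange(1, len(speed) - 1)
    inner, left, right = speed[1:-1], speed[:-2], speed[2:]
    minima = interior[(inner < left) & (inner < right)]
    maxima = interior[(inner > left) & (inner > right)]
    return minima, maxima

# Hypothetical trajectory: the hand sweeps out, pauses at t = 0.5 s, sweeps back.
t = np.linspace(0.0, 1.0, 101)                 # 1 s at 100 Hz
x = -np.cos(2.0 * np.pi * t)                   # speed ~ |sin(2*pi*t)|
positions = np.column_stack([x, np.zeros_like(t), np.zeros_like(t)])

speed = tangential_speed(positions, fs=100.0)
minima, maxima = local_extrema(speed)
# Speed minimum at the pause (index 50); maxima mid-movement (indices 25, 75).
```

Real glove and sensor data would need low-pass filtering before extrema detection, since measurement noise produces spurious local extrema in the raw speed signal.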
