GESTE: Sign Language Translator

Project Team
Sameen Ahmad
Theresa Le
Matias Liu Schmid
Jake Ulbrich
Anthony Yalong
Project Mentor(s)
Sarah Morin
Instructor(s)
Dr. Amir Aslani, ECE, GW Engineering
Dr. Xiaodong Qu, CS, GW Engineering
Dr. Tim Wood, CS, GW Engineering
GESTE is an interdisciplinary ECE and CS project that tackles the accessibility gap between hard-of-hearing individuals and the hearing population by developing a sign language translation system. It converts German Sign Language (DGS) gestures, captured with Meta Aria Glasses, into coherent written English text using machine learning, thereby reducing dependency on human interpreters and overcoming limitations in current translation technologies.
Who experiences the problem?
The problem predominantly affects hard-of-hearing communities, as well as anyone who interacts with them in educational, professional, or social settings. Families, educators, healthcare providers, and employers often encounter communication barriers due to the lack of readily available, accurate, and real-time sign language translation services.
Why is it important?
Addressing this problem is crucial because effective communication is a fundamental human right and a cornerstone of inclusion. By bridging the communication gap, the project not only enhances social participation and educational opportunities for hard-of-hearing individuals but also fosters a more equitable society where language barriers do not impede access to essential services and interpersonal connections.
What is the coolest thing about your project?
One of the coolest aspects of the project is its integration of cutting-edge machine learning with wearable technology, notably the use of Meta Aria Glasses. These glasses serve as an innovative interface, capturing sign language inputs that are then processed to deliver seamless translations. This blend of advanced deep learning, video processing, and wearable tech truly sets the project apart, paving the way for more intuitive and accessible communication solutions.
What sustainable design considerations drove your solution?
Sustainable design considerations are integral to the project, particularly in terms of energy efficiency and hardware optimization. The system is designed to operate on energy-efficient embedded devices like the NVIDIA Jetson, ensuring minimal power consumption while maintaining high performance. This focus on optimizing energy usage not only extends battery life for wearable components but also contributes to a reduced environmental footprint in large-scale deployments.
What were some technical challenges?
The project faces several technical challenges, including accurately recognizing and translating continuous sign language gestures in real time. Handling the high-dimensional data from multi-modal inputs such as facial expressions, hand movements, and body posture requires sophisticated deep learning architectures.
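To illustrate the multi-modal challenge described above, the sketch below fuses per-frame hand, face, and body-pose keypoints into a single feature vector and slices the continuous stream into overlapping windows for a downstream sequence model. The keypoint counts, window size, and stride here are illustrative assumptions, not the project's actual pipeline parameters.

```python
import numpy as np

# Hypothetical per-frame feature dimensions (assumptions for illustration):
HAND_DIM = 21 * 3 * 2   # 21 3-D keypoints per hand, two hands -> 126
FACE_DIM = 68 * 2       # 68 2-D facial landmarks -> 136
POSE_DIM = 17 * 2       # 17 2-D body-pose keypoints -> 34

def fuse_frame(hands, face, pose):
    """Concatenate the three modalities into one flat feature vector."""
    return np.concatenate([hands.ravel(), face.ravel(), pose.ravel()])

def sliding_windows(frames, window=16, stride=8):
    """Split a continuous stream of fused frames into overlapping windows;
    a downstream sequence model would map each window to a sign gloss."""
    return [frames[i:i + window]
            for i in range(0, len(frames) - window + 1, stride)]

# Toy stream of 64 frames of zeroed keypoints
stream = np.stack([
    fuse_frame(np.zeros((2, 21, 3)), np.zeros((68, 2)), np.zeros((17, 2)))
    for _ in range(64)
])
windows = sliding_windows(stream)
print(len(windows), windows[0].shape)  # 7 windows, each of shape (16, 296)
```

The 296-dimensional per-frame vector makes the dimensionality problem concrete: even this toy configuration yields thousands of values per one-second window before any temporal modeling is applied.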