Smart Gloves for Sign Language Communication
We aim to make communication easier for deaf and mute people, and hence we are developing a sign interpreter that automatically converts sign language into audio output. For deaf and mute people, sign language is the primary means of communication; with it, they can express their thoughts to other people. However, the majority of the population cannot understand sign language, which creates difficulties for anyone who relies on it, as they are unable to convey their message or communicate plainly.
Therefore, research into sign language interpretation using gestures has been explored steadily over recent decades to serve as an auxiliary tool that helps deaf and mute people blend into society without barriers. Sign language consists of various movements and gestures of the hands, so achieving good recognition accuracy at low cost is a mammoth task. In this project we propose a smart sign-language interpretation system built around a wearable, sensor-based hand glove. The wearable system will use 10 bend sensors, inertial motion sensors, several contact sensors, a Bluetooth transmitter/receiver module, a microcontroller unit, and a smartphone. The glove-based device reads the movements of a gesture performed by a person; based on the combined readings of these sensors, it can identify particular gestures that correspond to words and phrases in American Sign Language (ASL) and translate them into speech via a speaker and into text displayed on screen.
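To make the sensor suite concrete, the sketch below shows one way a single snapshot of the glove's readings could be assembled into a feature frame. This is a hedged illustration only: the sensor names, ADC range, and example values are assumptions for the sketch, not the project's actual firmware.

```python
# Illustrative sketch: one snapshot of the glove's sensor suite.
# Sensor names, ADC range, and values below are assumptions, not real firmware.

from dataclasses import dataclass, field
from typing import List

@dataclass
class GloveFrame:
    """One snapshot of the glove's sensors."""
    flex: List[float]       # 10 bend-sensor readings (2 per finger), 0 = straight, 1 = fully bent
    accel: List[float]      # accelerometer x, y, z (in g) -> hand orientation
    gyro: List[float]       # gyroscope x, y, z (deg/s) -> hand movement
    contacts: List[bool] = field(default_factory=list)  # finger-contact switches

def normalize_flex(raw: int, raw_min: int = 200, raw_max: int = 800) -> float:
    """Map a raw ADC value from a bend sensor onto [0, 1], clamping out-of-range values."""
    raw = max(raw_min, min(raw, raw_max))
    return (raw - raw_min) / (raw_max - raw_min)

# Example: raw ADC values from the 10 bend sensors
raw_adc = [210, 790, 500, 500, 650, 300, 400, 780, 220, 510]
frame = GloveFrame(
    flex=[normalize_flex(v) for v in raw_adc],
    accel=[0.02, -0.98, 0.10],
    gyro=[1.5, -0.3, 0.0],
    contacts=[False, True, False, False, False],
)
print(frame.flex[0], frame.flex[1])  # 0.016..., 0.983...
```

Normalizing each sensor into a common range like this makes later gesture matching independent of the exact resistance curve of each individual bend sensor.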
The device works by measuring the amount of deflection, or bending, produced by each bend sensor attached to each finger, while the orientation and movement of the hand are detected by accelerometer/gyroscope modules. When a sign-language gesture is performed, the microcontroller reads the sensor values and, using conditional algorithms, identifies which gesture has been performed. After identification, the Bluetooth module sends a signal to a connected smartphone, where a dedicated app plays the corresponding audio and displays the corresponding text. The proposed sign interpretation system is divided into three distinct modules: a wearable-sensor module, a processing module, and an application module, as shown in the figure below. The sensor and processing modules are implemented in the smart wearable hand device, whereas the application module runs on an Android-based mobile device.
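The "conditional algorithm" stage described above can be sketched as matching the normalized sensor values against per-gesture templates and rejecting ambiguous readings. The gesture names and template values below are invented for illustration; a real vocabulary would be calibrated per user.

```python
# Hypothetical sketch of template-based gesture identification.
# Gesture templates are invented for illustration, not real ASL calibration data.

GESTURES = {
    # gesture name -> expected bend per finger (thumb..pinky), 0 = straight, 1 = bent
    "HELLO":     [0.0, 0.0, 0.0, 0.0, 0.0],
    "YES":       [1.0, 1.0, 1.0, 1.0, 1.0],
    "THANK_YOU": [0.0, 0.0, 1.0, 1.0, 1.0],
}

def classify(flex, tolerance=0.25):
    """Return the best-matching gesture, or None if no template is close enough."""
    best_name, best_err = None, float("inf")
    for name, template in GESTURES.items():
        # worst-finger error: every finger must be near the template
        err = max(abs(f - t) for f, t in zip(flex, template))
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= tolerance else None

print(classify([0.1, 0.05, 0.9, 0.95, 1.0]))  # THANK_YOU
print(classify([0.5, 0.5, 0.5, 0.5, 0.5]))    # None (ambiguous hand shape)
```

Returning None for ambiguous readings matters in practice: it is better for the app to stay silent than to speak the wrong word while the hand is mid-transition between two signs.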

The importance of such a device is that it can help deaf and mute people with verbal disabilities become active and productive members of society. It can also benefit any other group with speech disorders who rely on sign language, increasing their chances of obtaining a good education and being hired by organizations, or simply easing their disability and letting them communicate like everyone else. Other applications of this design include virtual reality interfaces, precise control of robotics, telesurgery, the teaching of sign language, and much more.
The final deliverables of the device are driven by a low-cost design, so that the system can be promoted and deployed widely while offering the same features as costly after-market gesture recognition systems (such as the DG5 or the Nintendo Power Glove). The technical details of our system rest on sensor values, taking into account finger flexion, hand orientation, and hand velocity, together with signal processing and pattern recognition under the influence of machine learning.
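The pattern-recognition idea mentioned above can be illustrated with a minimal nearest-neighbour classifier over combined flexion-and-orientation feature vectors: instead of hand-written thresholds, labeled example readings are stored and each new gesture is assigned the label of the closest stored vector. The training vectors below are invented placeholders, not measured data.

```python
# Hedged sketch of machine-learning-style gesture recognition via 1-nearest-neighbour.
# Training vectors are invented placeholders, not measured glove data.

import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbour(sample, training):
    """training: list of (feature_vector, label). Return the label of the closest vector."""
    return min(training, key=lambda item: distance(sample, item[0]))[1]

# Each feature vector: 5 bend values + 3 accelerometer axes (illustrative layout)
training = [
    ([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0, 0.0], "HELLO"),
    ([1.0, 1.0, 1.0, 1.0, 1.0, 0.0, -1.0, 0.0], "YES"),
    ([1.0, 0.0, 0.0, 0.0, 0.0, 1.0,  0.0, 0.0], "NO"),
]

sample = [0.9, 0.95, 1.0, 0.85, 0.9, 0.05, -0.9, 0.1]
print(nearest_neighbour(sample, training))  # YES
```

The advantage over fixed thresholds is that adding a new sign or recalibrating for a new user only means recording a few more labeled vectors, with no code changes to the matching logic.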