Real Time Sign Translator (Human Robot For Normal And Hearing In pained Person)
Humanoid robotics is a new and challenging field. To cooperate with human beings, humanoid robots not only have to feature a human-like form and structure but, more importantly, must exhibit human-like behavior in motion, communication, and intelligence. This project, to be built in Pakistan, is set to be one of the country's humanoid projects: it controls a pair of robotic arms that display sign gestures and produce audio on speakers. A network of sensors is used for self-balancing of the arms, obstacle avoidance, and following a specific path. A web camera provides the vision input for image processing. The other part of this project is real-time object detection using YOLO, whose architecture resembles a fully convolutional neural network (FCNN); through deep learning it can recognize objects and suspicious behavior in seconds, which can be used for high-level security. The design of this project serves functional purposes such as interacting with the humanoid robot, operating the robotic arms through sign language, and various experimental uses. These gesture-capable arms give the robot an easy way to communicate with special persons, and the hand gestures remove the communication barrier for patients or special people who cannot hear or speak.
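As a sketch of how YOLO-style detections are filtered, the standard non-maximum suppression step can be written in plain Python with NumPy. The boxes and scores below are illustrative values, not output from an actual trained model:

```python
import numpy as np

def iou(box, boxes):
    """Intersection-over-union of one box [x1, y1, x2, y2] against many."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Keep the highest-scoring box, drop boxes that overlap it, repeat."""
    order = np.argsort(scores)[::-1]  # indices sorted by descending score
    keep = []
    while order.size > 0:
        best = order[0]
        keep.append(best)
        rest = order[1:]
        overlaps = iou(boxes[best], boxes[rest])
        order = rest[overlaps < iou_thresh]
    return keep

# Two overlapping detections of the same object plus one separate object.
boxes = np.array([[10, 10, 50, 50], [12, 12, 52, 52], [100, 100, 140, 140]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
print(nms(boxes, scores))  # the second box overlaps the first and is suppressed
```

In a real pipeline these boxes and scores would come from the network's output layer; NMS is what turns many overlapping raw detections into one box per object.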
The main idea of the project is to provide an interface between normal persons and special persons.
Through this project a normal person can understand what the special person wants to say, and the special person can likewise understand the normal person with the help of gestures.
Basically, we are going to implement a system that recognizes gesture input using a webcam and performs the specified operation. The hand-gesture recognition system serves as the interface between special persons and normal persons. The application uses the webcam to detect a gesture made by the user and performs the corresponding operation: the user performs a particular gesture, the webcam captures it, the system recognizes it against a set of known gestures, and the action corresponding to it is carried out.
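The matching step ("recognizes it against a set of known gestures") can be sketched in plain Python. Here each gesture is reduced to a small binary grid; in the real system this grid would come from segmenting the webcam frame, and the gesture names, patterns, and phrases below are illustrative assumptions, not the project's actual vocabulary:

```python
# Minimal sketch of gesture matching: each known gesture is a flattened
# binary grid; an incoming frame is classified by nearest Hamming distance.
KNOWN_GESTURES = {
    "hello":     (1, 1, 1, 0, 1, 0, 0, 1, 0),
    "thank_you": (0, 1, 0, 1, 1, 1, 0, 1, 0),
    "yes":       (1, 0, 0, 0, 1, 0, 0, 0, 1),
}

GESTURE_TO_TEXT = {
    "hello": "Hello!",
    "thank_you": "Thank you.",
    "yes": "Yes.",
}

def hamming(a, b):
    """Number of positions where two equal-length binary grids differ."""
    return sum(x != y for x, y in zip(a, b))

def recognize(frame_grid, max_distance=2):
    """Return the text for the closest known gesture, or None when no
    known gesture is within max_distance (an unrecognized sign)."""
    best_name, best_dist = None, max_distance + 1
    for name, pattern in KNOWN_GESTURES.items():
        d = hamming(frame_grid, pattern)
        if d < best_dist:
            best_name, best_dist = name, d
    return GESTURE_TO_TEXT.get(best_name)

# A noisy capture of the "hello" grid (one cell flipped) still matches.
print(recognize((1, 1, 1, 0, 1, 0, 0, 0, 0)))  # Hello!
```

The nearest-match-with-threshold design means a slightly noisy capture still resolves to the right phrase, while a grid far from every known gesture is rejected rather than mistranslated.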
This project is beneficial for people who cannot hear (deaf persons), as they can easily understand with the help of gestures, and also for those who cannot speak: they show their gestures in front of the webcam, and after detection the corresponding text is generated.
The target market for this application is communication between normal and special persons who cannot afford sensor gloves. It eliminates the need for a glove, because no one can wear gloves all the time. Communication becomes a simple process: the user only has to stand in front of the robot, and the robot converts the gestures, which is helpful for deaf and mute people.
The basic idea of this robot is that users can operate it without physically touching it; only hand gestures are required from the special person.
Our commercial partners are those who are interested in implementing our project in hardware.
For example, our potential customers are special persons, because the project is specifically for people who are not able to speak or hear.
SPECIAL PERSON TO NORMAL PERSON:
(flow diagram)
NORMAL PERSON TO SPECIAL PERSON:
(flow diagram)
The word "special" here refers to a person who is either deaf or mute and uses sign language as a source of communication, or who is both deaf and unable to speak. The term continues to be used for a person who is deaf but has some degree of speaking ability [1]. Culturally, Deaf people use sign language for communication. According to the World Federation of the Deaf (WFD), over 5% of the world's population (≈360 million people) has disabling hearing loss, including 328 million adults and 32 million children [2]. A person with severe or profound hearing loss can have severe problems in speech development and usually relies on sign language as a source of communication [3].
Deaf people face many irritations and frustrations that limit their ability to do everyday tasks. Research indicates that Deaf people, especially Deaf children, have high rates of behavioral and emotional issues related to different methods of communication. Most people with such disabilities become introverted and resist social connectivity and face-to-face socialization. The inability to speak with family and friends can cause low self-esteem and may result in the social isolation of a Deaf person. Not only do they lack social interaction, but communication is also a major barrier to a Deaf person's healthcare. In such conditions, it becomes difficult for a caretaker to interact with the deaf person.
Many communication channels are available through which a Deaf person can deliver a message, e.g., notes, helper pages, sign language, books with letters, lip reading, and gestures. Despite these channels, many problems are encountered by Deaf persons and normal persons during communication. The problem is not confined to the Deaf person who is unable to hear or speak; another problem is the lack of awareness of Deaf culture among normal people. The majority of hearing people have little or no knowledge or experience of sign language. There are also more than 300 sign languages, and it is hard for a normal person to understand and become used to them. The above-mentioned problems can be solved with assistive technology, which can act as an interpreter converting sign language into text or speech for better communication between the Deaf community and hearing individuals.
Finally, a Python-based program is produced which can perform voice recognition and motion capture and convert each into the other. A deaf person can simply sign in front of the motion sensor, and the person behind the screen can understand easily without knowing sign language, and vice versa.
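The reverse direction (normal person to special person) reduces to mapping the recognized words onto a sequence of stored signs for the robot arms to display. A minimal sketch in plain Python, assuming a hypothetical sign dictionary (the sign names below are illustrative, not the project's actual vocabulary):

```python
# Minimal sketch: turn a recognized sentence into the sequence of signs
# the robotic arms (or an on-screen avatar) should play back.
SIGN_LIBRARY = {
    "hello": "SIGN_HELLO",
    "how": "SIGN_HOW",
    "are": "SIGN_ARE",
    "you": "SIGN_YOU",
}

def sentence_to_signs(sentence):
    """Map each known word to its stored sign; unknown words are
    fingerspelled letter by letter as a fallback."""
    signs = []
    for word in sentence.lower().split():
        if word in SIGN_LIBRARY:
            signs.append(SIGN_LIBRARY[word])
        else:
            signs.extend(f"FINGERSPELL_{ch.upper()}" for ch in word)
    return signs

print(sentence_to_signs("Hello you"))  # ['SIGN_HELLO', 'SIGN_YOU']
```

The fingerspelling fallback mirrors how human signers handle words with no dedicated sign, so the pipeline never silently drops a word it cannot translate.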
By using this technology we can also help teach special children who have problems with hearing and speaking.
| Elapsed time since start of project | Weeks | Milestone / Deliverable |
|---|---|---|
| Month 1 | Week 01 – 04 | Getting expert advice and discussing drawbacks. Finalizing the project idea, noting the benefits, and writing the proposal. |
| Month 2 | Week 05 – 08 | Setting up the team, assigning individual tasks relating to the software and hardware structure, and collating the results. |
| Month 3 | Week 09 – 13 | Analyzing the nature of the tasks and planning them accordingly. |
| Month 4 | Week 14 – 17 | Buying all the items and equipment, keeping in view the block diagram, models, and schematic diagram, and starting work on the project. |
| Month 5 | Week 18 – 22 | Working on the hardware structure, including the Core i7 board and the pair of robotic hands. |
| Month 6 | Week 23 – 26 | Working on the software and hardware structure, including YOLO, the hoverboard, and the pair of robot arms. |
| Month 7 | Week 27 – 30 | Making the project poster. Working on the software and hardware structure and preparing the skeleton of the robot, including the head mechanism and control. |
| Month 8 | Week 31 – 34 | Working on the software and hardware structure, and integrating all parts of the robot (skeleton, head, legs, etc.) with the controller (Raspberry Pi & Core i7 mini computer). |
| Month 9 | Week 35 – 38 | Writing the project report. |
| Month 10 | Week 39 – 43 | Working on the software and hardware structure, and integrating all parts of the robot with the skeleton, controller (Raspberry Pi & Core i7 mini computer), and the remote control unit. |
| Month 11 | Week 44 – 47 | Working on the software and hardware structure and making all required improvements to the system. |
| Month 12 | Week 48 – 52 | Finalizing the project. Trial run. |