Adil Khan 9 months ago
AdiKhanOfficial #FYP Ideas


Project Title

Facial Expression Detection of Handicapped Person for Controlling of Virtual Keyboard

Project Area of Specialization

Artificial Intelligence

Project Summary

According to one study, nearly 1 in 50 people, approximately 5.4 million, live with paralysis, and many can only change their facial expressions. These people have little direct interaction with computers and the internet and few activities to occupy their spare time. Hands-free text entry with an on-screen virtual keyboard has long been possible using eye-tracking technology, but typing this way is difficult because it demands constant focus and requires a high-quality camera or a dedicated eye-tracking device to detect eyeball movement.

Our project, Facial Expression Detection of Handicapped Person for Controlling of Virtual Keyboard (FEDCoVK), is a means of communication between a paralyzed person (one who can only change facial expressions) and a computer. Such a user can operate the computer through a virtual keyboard, playing games and writing documents simply by changing facial expressions. Since roughly 1 in 50 people are in a situation where they cannot use a computer normally, we provide a way for them to do so. The user can close or open their eyes, lift their eyebrows, expand their lips, open their mouth, or move their head to control the virtual keyboard for typing and playing games. The main concern of this project is to reduce inequality and provide a better quality of life to paralyzed persons.

For example, a person can play games by controlling the arrow keys with head movements (up, down, left, right), type a highlighted key by opening their mouth, move backward on the keyboard to reach another letter by closing their left eye, or press the spacebar by lifting their eyebrows.

Project Objectives

Our objective is to build a virtual keyboard that a handicapped person can control by changing their facial expressions, enabling them to play games and type documents. We map head movements to the arrow keys, a right-eye close to the Enter key, an open mouth to pressing the currently selected alphanumeric key, lifted eyebrows to the spacebar, expanded lips to Backspace, and a left-eye close to moving backward through the alphanumeric keys. These mappings are chosen for speed and accuracy so that the user does not tire quickly.
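The expression-to-key mappings above can be sketched as a simple dispatch table. This is only an illustrative sketch: the label strings and action names are assumptions, and a real system would feed the labels from the expression classifier and send the actions to the virtual keyboard.

```python
# Hypothetical mapping from classifier labels to keyboard actions,
# following the scheme described in the objectives. All names here
# are illustrative assumptions, not the project's actual identifiers.
EXPRESSION_TO_KEY = {
    "head_up": "up", "head_down": "down",
    "head_left": "left", "head_right": "right",
    "right_eye_close": "enter",
    "mouth_open": "select",      # press the currently highlighted key
    "eyebrows_lift": "space",
    "lips_expand": "backspace",
    "left_eye_close": "prev",    # step backward on the alphanumeric keys
}

def expression_to_action(label):
    """Translate a detected-expression label into a keyboard action.

    Returns None for expressions that are not bound to any key, so
    unrecognized or neutral frames simply do nothing.
    """
    return EXPRESSION_TO_KEY.get(label)
```

Keeping the bindings in one table makes them easy to re-tune per user, which matters here since different users may find different expressions easier to produce.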

Project Implementation Method

Recent advances in machine learning have yielded new techniques for training deep neural networks, resulting in highly successful applications in many pattern recognition tasks. We will build the dataset ourselves. First, image processing will be applied to the dataset to increase contrast and reduce noise. A CNN architecture will then be used to extract facial features for expression detection. Finally, we will build a virtual keyboard linked to each facial expression, capture video in real time, and overlay labels and landmark points on the user's face for ease of use.
Image Processing → Feature Extraction → Detection → CNN Training → Recognition → Perform Action on Keyboard
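The preprocessing step (contrast increase and noise reduction) could be sketched as follows. This is a minimal pure-Python illustration on grayscale pixel grids, assuming min-max contrast stretching and a 3x3 mean filter; the actual project would likely use a library such as OpenCV on real frames.

```python
def contrast_stretch(img):
    """Linearly rescale pixel intensities to the full 0-255 range."""
    lo = min(min(row) for row in img)
    hi = max(max(row) for row in img)
    if hi == lo:
        # Flat image: no contrast to stretch.
        return [[0 for _ in row] for row in img]
    return [[round((p - lo) * 255 / (hi - lo)) for p in row] for row in img]

def mean_filter(img):
    """3x3 mean filter for noise reduction (border pixels are clamped)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the 3x3 neighborhood, clamping indices at the edges.
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = round(sum(vals) / 9)
    return out
```

Running `mean_filter(contrast_stretch(frame))` on each frame would produce the normalized, denoised input that the CNN stage then consumes.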

Benefits of the Project

This project provides important benefits for paralyzed persons, letting them use technology with fewer resources, lower cost, and less time. The user only needs an ordinary camera for facial expression detection, which is inexpensive compared with the special eyeball-tracking cameras required by eye-tracking technologies. The system responds quickly, and the user does not get tired after using it.

Technical Details of Final Deliverable

The aim of this project is to build a virtual keyboard for handicapped persons. We will build a face-landmark dataset and train a multilayer CNN on it. The trained model will then detect facial expressions so that a handicapped person can easily type and play games using facial expressions alone.
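One common way to turn face landmarks into an expression cue is the eye aspect ratio (EAR), which drops sharply when an eye closes. The summary does not specify this exact method, so the sketch below is an assumption: it uses the six per-eye points of the widely used 68-point landmark layout to detect the eye-close gestures mentioned above.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(eye):
    """EAR for one eye given six landmarks p1..p6.

    p1/p4 are the horizontal corners; p2/p6 and p3/p5 are the
    vertical pairs on the upper and lower eyelid. The ratio is
    roughly constant while the eye is open and near zero when closed.
    """
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def eye_closed(eye, threshold=0.2):
    """Classify the eye as closed when EAR falls below the threshold.

    The 0.2 threshold is a typical starting value and would need
    per-user calibration in practice.
    """
    return eye_aspect_ratio(eye) < threshold
```

Feeding the left-eye and right-eye EAR values into the key mapping (left-eye close to step backward, right-eye close for Enter) gives a cheap, landmark-only detector for those two gestures.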

Final Deliverable of the Project

Software System

Core Industry

IT

Other Industries

Core Technology

Artificial Intelligence(AI)

Other Technologies

Sustainable Development Goals

Good Health and Well-Being for People, Reduced Inequality

Required Resources

Item Name                                          Type           No. of Units   Per Unit Cost (in Rs)   Total (in Rs)
GTX 1070Ti - 3D GeForce - 8GB GDDR5 - Graphic Card Equipment      1              70000                   70000
Logitech C310 HD 720p Webcam                       Miscellaneous  1              8000                    8000
Total (in Rs): 78000
If you need this project, please contact me on contact@adikhanofficial.com