Human Facial Action Unit Tracking
2025-06-28 16:27:43 - Adil Khan
Project Area of Specialization: Software Engineering

Project Summary
This project is aimed at disabled users and industry, enabling them to work together. Disabled users want to groom themselves professionally: some practice extensively in their specific fields, while others take on practical work in software houses. On the industrial side, companies hire professional developers or engineers to solve their problems, and sometimes bring in outside specialists to work on existing or upcoming projects. Such professionals are expensive and demand high wages for that collaborative work. We are trying to help both industry and disabled users to work together on real-life problems faced by industry.

Most current work on automated facial expression analysis attempts to recognize a small set of prototypic expressions, such as joy and fear. Such prototypic expressions, however, occur infrequently; human emotions and intentions are more often communicated by changes in one or two discrete facial features. Capturing the full range of facial expression therefore requires detection, tracking, and classification of fine-grained changes in facial features. The system includes two main modules to extract feature information: NODDING and SHAKING. These modules extract facial information, classify it into FACS (Facial Action Coding System) action units, and then perform actions based on the collected AU information.
Project Objectives
Nothing comes into being without purpose; some things are beneficial while others are speculative. Introducing real-time Human Facial Action Unit Tracking seems the best idea so far. This functionality adds an advancement to everyday devices and makes them easier to use, and it is especially helpful for handicapped people, who can trigger different functions through the motion of their facial action units. Software companies and operating-system vendors can add this feature to increase their market sales as the number of such users continues to grow.
Project Implementation Method
The major steps involved in an automatic facial movement system are discussed below. A general facial movement system consists of two basic steps: feature extraction and classification.
Feature extraction is a crucial step of a facial expression and gender recognition system. It can be divided into sub-steps such as dimensionality reduction, feature extraction, and feature selection. Facial feature extraction methods fall into two main categories according to how they extract features: deformation-based and motion-based. Some methods act locally and some holistically: local approaches focus on subparts of the face (lips, eyes, etc.) where changes occur, while holistic approaches treat the face as a whole. The extraction methods can also be categorized as geometric-based and appearance-based. Geometric-based methods detect facial features using shape information, i.e., facial points or the locations of the eyebrows, eyes, mouth, and nose; these methods ignore the texture information of the face, and the face geometry is represented in feature vectors extracted from that shape information. Appearance-based methods instead use the pixel values of the facial image, which reflect textural changes; with this approach, features can be extracted from a particular region of the face or from the entire face, according to need.
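The geometric-based approach above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the landmark names and coordinates are invented stand-ins for what a real landmark detector (e.g. dlib or MediaPipe) would return, and the three chosen distances are assumptions for demonstration.

```python
import math

# Hypothetical 2D landmark positions (pixel coordinates), as a landmark
# detector might return them. Names and values are illustrative only.
LANDMARKS = {
    "left_eye":    (120.0, 150.0),
    "right_eye":   (200.0, 150.0),
    "nose_tip":    (160.0, 200.0),
    "mouth_left":  (135.0, 250.0),
    "mouth_right": (185.0, 250.0),
}

def distance(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def geometric_features(lm):
    """Build a simple geometric feature vector from shape information,
    normalised by the inter-ocular distance so the features are
    invariant to face scale."""
    iod = distance(lm["left_eye"], lm["right_eye"])  # inter-ocular distance
    return [
        distance(lm["mouth_left"], lm["mouth_right"]) / iod,  # mouth width
        distance(lm["nose_tip"], lm["mouth_left"]) / iod,     # nose-to-mouth
        distance(lm["left_eye"], lm["nose_tip"]) / iod,       # eye-to-nose
    ]

features = geometric_features(LANDMARKS)
```

Note that texture (pixel) information is deliberately ignored here, matching the geometric-based category; an appearance-based method would instead operate on image patches.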
After extracting the features from the image, the next step is to classify them into different classes and recognize the facial expression and gender. Multiple methods have been proposed for this purpose, such as Artificial Neural Networks (ANN), Support Vector Machines (SVM), Hidden Markov Models (HMM), Partial Least Squares (PLS), and AdaBoost as classifiers.
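The classification step can be sketched with a single-layer perceptron, the simplest member of the ANN family named above. This is a toy illustration under invented assumptions: the 2D feature vectors and the "smile"/"neutral" labels are made up, and a real system would train one of the listed classifiers on labelled facial-feature data.

```python
# Minimal perceptron: learn weights w and bias b so that
# sign(w . x + b) separates two classes (labels must be +1 or -1).

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # update weights only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy 2D feature vectors: class +1 ("smile") vs class -1 ("neutral").
X = [(0.9, 0.8), (0.8, 0.9), (0.1, 0.2), (0.2, 0.1)]
y = [1, 1, -1, -1]
w, b = train_perceptron(X, y)
```

An SVM or AdaBoost classifier would slot into the same place: features in, class label out.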
Benefits of the Project
Impaired persons usually face many problems while operating a computer system. In particular, they have difficulty issuing commands to the system, since a user may be handicapped or have various physical limitations. To overcome this problem, their operating system should provide supporting functionality to aid them in their work.
Impaired users can benefit from this project. Only a few platforms in the world offer real-time facial tracking, and they do not offer NODDING and SHAKING functionality, which makes our project a unique and smart way to aid impaired users.
The system can be customized according to the functions the user wants performed on the collected action units.
Technical Details of Final Deliverable
The final deliverable will be able to detect the user's facial action units and will perform functionality based on the NODDING and SHAKING features extracted from those action units. The only technical requirement is a webcam installed on the system; the system will automatically detect the action units and act upon them.
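One plausible way the NODDING and SHAKING modules could distinguish the two gestures is sketched below. The rule used (dominant vertical displacement of a tracked landmark means a nod; dominant horizontal displacement means a shake) is an assumption for illustration, not the project's stated algorithm, and the example traces are invented.

```python
# Classify a head gesture from a tracked landmark (e.g. the nose tip)
# over a short window of video frames.

def classify_gesture(points, min_range=10.0):
    """points: list of (x, y) nose-tip positions in pixel coordinates.
    Returns 'NOD', 'SHAKE', or 'NONE'."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_range = max(xs) - min(xs)   # total horizontal excursion
    y_range = max(ys) - min(ys)   # total vertical excursion
    if max(x_range, y_range) < min_range:
        return "NONE"             # head essentially still
    return "NOD" if y_range > x_range else "SHAKE"

# Invented example traces: an up-down oscillation and a left-right one.
nod_trace = [(160, 200), (160, 215), (161, 198), (160, 216), (159, 201)]
shake_trace = [(160, 200), (145, 201), (176, 200), (144, 202), (161, 200)]
```

In the deliverable, the recognized gesture would then be mapped to a user-configured action, per the customization point above.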
Final Deliverable of the Project: HW/SW integrated system
Core Industry: IT
Other Industries: Others
Core Technology: Others
Other Technologies:
Sustainable Development Goals: Decent Work and Economic Growth; Industry, Innovation and Infrastructure

Required Resources

| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Webcam | Equipment | 1 | 3600 | 3600 |
| Laptop | Equipment | 1 | 47000 | 47000 |
| Others | Miscellaneous | 0 | 10000 | 0 |
| Total (in Rs) | | | | 50600 |