Adil Khan 9 months ago
AdiKhanOfficial #FYP Ideas

Lecture Evaluation in a Smart Classroom Using Gestures and Facial Expressions


Project Title

Lecture Evaluation in a Smart Classroom Using Gestures and Facial Expressions

Project Area of Specialization

Artificial Intelligence

Project Summary

Education is a vital aspect of our lives because it builds the foundation on which we progress as a society. The world is becoming increasingly complex, interconnected and dynamic, and education is the vehicle for ensuring that we can navigate this complexity with understanding, collaboration and problem-solving across cultures. Furthermore, language competencies and educators play a crucial role in ensuring that students are properly educated. According to Velasquez et al. (2013), numerous studies have indicated that a caring teacher can positively impact learning outcomes, motivation, and social and moral development. However, Chen et al. (2019) and Islam et al. (2016) note that classrooms, especially in universities, are often overcrowded with large numbers of students, which makes it strenuous for lecturers to monitor students' reactions to the lecture being delivered and to obtain immediate feedback on whether the students are able to follow it.


 

Therefore, in view of this issue, this project proposes an approach to the automatic estimation of students' attention during lectures in the classroom. Both facial and body properties of a student, including gaze point and body posture, are used to observe how students perceive the lecture being delivered. Machine learning algorithms are used to train classifiers that estimate the time-varying attention levels of individual students.

Project Objectives

  • Under the present system, lectures are delivered regularly, yet only 10-20% of students acquire about 80% of the knowledge, 20-60% acquire about 60%, and the remaining students do not grasp the concepts at all. To overcome this situation and provide honest feedback to the lecturer, lecture evaluation using face recognition is needed.

  • Instead of using conventional methods, the proposed system aims to develop an automated system that records students' attention using facial recognition technology. The main objective of this work is to make lectures more effective.

  • The main purpose of monitoring students' attention is to collect information that will inform and facilitate improvement in classroom learning.

  • Monitoring students' attention with cameras is a non-invasive approach to digitizing student behavior. Understanding students' attention spans and the types of behavior they exhibit will help make teaching methods more effective.

Project Implementation Method

Step 1: Facial Expression and Gesture Detection

To begin, the camera detects and recognizes the facial expression and gesture. Facial expressions and gestures are detected best when the person is looking directly at the camera, although technological advancements allow slight variations from this to work as well.
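As a rough illustration of this step, the sketch below uses OpenCV's bundled Haar cascade to find faces in a live video feed. The detector choice and the local webcam source are assumptions for the sketch; the actual system would read from the classroom IP camera listed in the required resources and may use a different detector.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade (assumed detector for this sketch).
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # the classroom IP camera stream could be passed here instead

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces; each detection is an (x, y, w, h) bounding box.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Detected faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```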

Step 2: Face Analysis

Next, a photo of the facial expression and gesture is captured and analyzed. Most facial recognition relies on 2D images rather than 3D. Each face is made up of distinguishable landmarks, or nodal points; each human face has 80 nodal points. The facial recognition software analyzes these nodal points, such as the distance between the eyes or the shape of the cheekbones.
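The landmark analysis could be prototyped with an off-the-shelf predictor. The sketch below assumes dlib's freely downloadable 68-point shape predictor (fewer than the 80 nodal points mentioned above) and a hypothetical captured frame student_face.jpg; it simply collects the landmark coordinates and prints one example distance.

```python
import cv2
import dlib

# Hypothetical model path; dlib's 68-point predictor must be downloaded separately.
PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor(PREDICTOR_PATH)

image = cv2.imread("student_face.jpg")          # hypothetical captured frame
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

for rect in detector(gray):
    shape = predictor(gray, rect)
    # Collect (x, y) pixel coordinates of each landmark ("nodal point").
    landmarks = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]
    # Example measurement: inter-ocular distance from the outer eye corners (points 36 and 45).
    left_eye, right_eye = landmarks[36], landmarks[45]
    inter_eye = ((left_eye[0] - right_eye[0]) ** 2 + (left_eye[1] - right_eye[1]) ** 2) ** 0.5
    print(f"Detected {len(landmarks)} landmarks, inter-eye distance of {inter_eye:.1f} px")
```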

Step 3: Converting an Image to Data

The analysis of the facial expression and gesture is then turned into a mathematical representation: the facial expression and gesture features become numbers in a code (a feature vector).
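A minimal sketch of this encoding step, under the assumption that the landmarks from Step 2 are reduced to a handful of pairwise distances normalized by the inter-ocular distance so the code is independent of image resolution; the specific landmark pairs chosen here are illustrative, not the project's actual feature set.

```python
import numpy as np

def landmarks_to_feature_vector(landmarks):
    """Turn (x, y) landmark coordinates into a scale-normalized feature vector.

    Illustrative encoding only: a few pairwise distances (eye openings, mouth
    width and opening, inter-eyebrow gap) divided by the inter-ocular distance.
    """
    pts = np.asarray(landmarks, dtype=float)
    inter_eye = np.linalg.norm(pts[36] - pts[45])   # reference scale (outer eye corners)
    chosen_pairs = [(37, 41), (43, 47), (48, 54), (51, 57), (21, 22)]
    features = [np.linalg.norm(pts[a] - pts[b]) / inter_eye for a, b in chosen_pairs]
    return np.array(features)

if __name__ == "__main__":
    # Demo with random stand-in landmarks; real input would come from Step 2.
    rng = np.random.default_rng(0)
    demo_landmarks = rng.integers(0, 640, size=(68, 2))
    print(landmarks_to_feature_vector(demo_landmarks))
```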

Step 4: Finding the Result

This code is then compared against a database of facial expressions and gestures. The breakdown of facial detections on the dashboard into the three most common facial expressions (bored, satisfied, confused) allows the lecturer to better observe the students' actual reaction to a lecture.
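The dashboard classification could be sketched as follows, assuming a small scikit-learn MLP trained on labelled feature vectors; the training data below is a synthetic placeholder, and the three labels are simply the dashboard categories named above.

```python
import numpy as np
from collections import Counter
from sklearn.neural_network import MLPClassifier

EXPRESSIONS = ["bored", "satisfied", "confused"]   # dashboard categories from the post

# Synthetic stand-in data: in the real system these would be Step-3 feature
# vectors labelled by human annotators during a pilot study.
rng = np.random.default_rng(0)
X_train = rng.random((300, 5))
y_train = rng.choice(EXPRESSIONS, size=300)

classifier = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
classifier.fit(X_train, y_train)

# Classify every face found in the current lecture frame and summarise for the dashboard.
current_frame_features = rng.random((25, 5))       # e.g. 25 students detected in the frame
summary = Counter(classifier.predict(current_frame_features))
print({label: summary.get(label, 0) for label in EXPRESSIONS})
```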

Benefits of the Project

Facial recognition technology can be programmed to recognize a wide range of nonverbal expressions and emotions. Through this, a professor can assess the emotional state of the class to determine which parts of the lecture are the most exciting and engaging, and where students' attention appears to diminish. In this way, every unique face can function like a uniquely identifiable thumbprint that also speaks, through verbal and nonverbal data.

As class-engagement data of this sort comes in, week to week and semester to semester, faculty and administrators can partner to build new data models that unlock powerful insights into how students learn, what methods are most effective, and what differentiates great classes (and great teachers) from less-effective learning experiences.

Furthermore, as a student matriculates toward graduation one semester at a time, aggregate data can perhaps be used to discover learning strengths and areas of concern, enabling more tailored learning experiences that can lead each student to better outcomes.

Technical Details of Final Deliverable

Methods

The database used in the study consisted of facial expression images from the Cohn-Kanade database [16]. Two types of parameters were extracted from each facial image: real-valued and binary. A total of 15 parameters, consisting of eight real-valued parameters and seven binary parameters, were extracted from each facial image. The real-valued parameters were normalized. Generalized neural networks were trained with all fifteen parameters as inputs and seven output nodes corresponding to the seven facial expressions (neutral, angry, disgust, fear, happy, sad and surprised).
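A sketch of one such generalized network is given below, assuming a Keras multilayer perceptron with a single hidden layer (the hidden-layer size and framework are assumptions; the source does not specify the architecture) and synthetic stand-ins for the Cohn-Kanade feature vectors.

```python
import numpy as np
from tensorflow import keras

EXPRESSIONS = ["neutral", "angry", "disgust", "fear", "happy", "sad", "surprised"]

# 15 inputs: 8 normalized real-valued parameters followed by 7 binary parameters.
model = keras.Sequential([
    keras.Input(shape=(15,)),
    keras.layers.Dense(32, activation="relu"),                     # assumed hidden-layer size
    keras.layers.Dense(len(EXPRESSIONS), activation="softmax"),    # 7 expression output nodes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Synthetic placeholder data in place of the Cohn-Kanade feature vectors.
rng = np.random.default_rng(0)
X = np.hstack([rng.random((200, 8)), rng.integers(0, 2, (200, 7))]).astype("float32")
y = rng.integers(0, len(EXPRESSIONS), 200)
model.fit(X, y, epochs=5, batch_size=16, verbose=0)
```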

Based on initial testing, the best-performing neural networks were recruited to form a generalized committee for expression classification. Because of a number of ambiguous and no-classification cases during initial testing, specialized neural networks were trained for the angry, disgust, fear and sad expressions, and the best-performing of these were recruited into a specialized committee to perform specialized classification. A final integrated committee neural network classification system was built using both the generalized and specialized committee networks, and was then evaluated on an independent expression dataset not used in training or initial testing. A generalized block diagram of the entire system is shown in Figure 1.

Figure 1: Generalized block diagram of the integrated committee neural network classification system.
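The committee idea can be approximated with a majority-voting ensemble. The sketch below, using scikit-learn's VotingClassifier over three differently configured MLPs on synthetic data, illustrates only the generalized committee; the member networks and their selection criteria are assumptions, and the specialized committees and the integrated system would be built analogously.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import VotingClassifier

# Committee of the best-performing networks: three MLPs with different seeds and
# architectures stand in for the selected members (an assumption for this sketch).
members = [
    ("mlp_a", MLPClassifier(hidden_layer_sizes=(20,), max_iter=800, random_state=1)),
    ("mlp_b", MLPClassifier(hidden_layer_sizes=(30,), max_iter=800, random_state=2)),
    ("mlp_c", MLPClassifier(hidden_layer_sizes=(20, 10), max_iter=800, random_state=3)),
]
committee = VotingClassifier(estimators=members, voting="hard")   # majority vote

# Synthetic placeholder data: 8 real-valued + 7 binary parameters, 7 expression classes.
rng = np.random.default_rng(0)
X = np.hstack([rng.random((200, 8)), rng.integers(0, 2, (200, 7))]).astype(float)
y = rng.integers(0, 7, 200)
committee.fit(X, y)
print(committee.predict(X[:5]))
```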

Image Processing and Feature Extraction

Two types of parameters were extracted from the facial images of 97 subjects: (1) real valued parameters and (2) binary parameters. The real valued parameters have a definite value depending upon the distance measured. This definite value was measured in number of pixels. The binary measures gave either a present (= 1) or an absent (= 0) value. In all, eight real valued measures and seven binary measures were obtained.

Real valued parameters

Eyebrow raise distance– The distance between the junction point of the upper and the lower eyelid and the lower central tip of the eyebrow.

Upper eyelid to eyebrow distance– The distance between the upper eyelid and eyebrow surface.

Inter-eyebrow distance– The distance between the lower central tips of both the eyebrows.

Upper eyelid to lower eyelid distance– The distance between the upper eyelid and lower eyelid.

Top lip thickness– The measure of the thickness of the top lip.

Lower lip thickness– The measure of the thickness of the lower lip.

Mouth width– The distance between the two lip corners.

Mouth opening– The distance between the lower surface of top lip and upper surface of lower lip.
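The eight real-valued measures above could be approximated from automatically detected landmarks. The sketch below assumes dlib's 68-point indexing (left side of the face only); the index choices are an assumed mapping onto that scheme, not the original annotation method, and all distances are returned in pixels.

```python
import numpy as np

def dist(pts, a, b):
    """Euclidean distance in pixels between two landmark indices."""
    return float(np.linalg.norm(np.asarray(pts[a], float) - np.asarray(pts[b], float)))

def real_valued_parameters(pts):
    """Approximate the eight real-valued measures from 68-point landmarks.

    The indices are an assumed mapping onto dlib's 68-point scheme; the
    original work may have used differently annotated points.
    """
    return {
        "eyebrow_raise":       dist(pts, 39, 21),  # inner eye corner to inner eyebrow tip
        "eyelid_to_eyebrow":   dist(pts, 37, 19),  # upper eyelid to eyebrow surface
        "inter_eyebrow":       dist(pts, 21, 22),  # gap between the two inner eyebrow tips
        "eyelid_opening":      dist(pts, 37, 41),  # upper eyelid to lower eyelid
        "top_lip_thickness":   dist(pts, 51, 62),  # outer to inner edge of the top lip
        "lower_lip_thickness": dist(pts, 66, 57),  # inner to outer edge of the lower lip
        "mouth_width":         dist(pts, 48, 54),  # lip corner to lip corner
        "mouth_opening":       dist(pts, 62, 66),  # inner top lip to inner bottom lip
    }
```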

Final Deliverable of the Project

HW/SW integrated system

Core Industry

Education

Other Industries

IT

Core Technology

Artificial Intelligence (AI)

Other Technologies

Others

Sustainable Development Goals

Quality Education

Required Resources

Item Name       | Type          | No. of Units | Per Unit Cost (in Rs) | Total (in Rs)
IP Camera 4MP   | Equipment     | 1            | 15000                 | 15000
IPTV DVR        | Equipment     | 1            | 15000                 | 15000
SSD 1TB         | Equipment     | 1            | 15000                 | 15000
Cables          | Equipment     | 1            | 8000                  | 8000
Adapters 12V DC | Equipment     | 2            | 1000                  | 2000
LED Screen      | Equipment     | 1            | 15000                 | 15000
Stationery      | Miscellaneous | 1            | 5000                  | 5000
Total (in Rs): 75000
If you need this project, please contact me at contact@adikhanofficial.com