Gaze Estimation and Plotting During Driving and Other Activities


2025-06-28 16:32:43 - Adil Khan

Project Title

Gaze Estimation and Plotting During Driving and Other Activities

Project Area of Specialization

Artificial Intelligence

Project Summary

Driver distraction and inattention are among the major factors that lead to vehicular accidents. The Global Status Report on Road Safety 2018, launched by the WHO in December 2018, highlights that the number of annual road traffic deaths has reached 1.35 million. Use of cellphones and other electronic devices, including in-vehicle infotainment systems, is among the most common causes of inattention. Distraction can also be caused by a momentary lapse of attention, e.g. looking in the wrong direction or checking the speedometer at the wrong time.

Driver gaze and head pose are linked to the driver's current focus of attention, so gaze direction is a critical piece of information for understanding driver state. Gaze direction alone is often sufficient for a number of applications; the challenge, however, is to estimate it robustly in naturalistic, real-world driving.

Project Objectives

Project Implementation Method

PROJECT METHODOLOGY

This project is divided into the following parts:

Eye-Gaze Tracking Mechanism:

In this proposed project we will study different eye-gaze tracking methods and identify the one with the most precise and reliable results, based on their performance across different test cases. This part includes detecting the face in the image, extracting the eyes, extracting the pupil coordinates, and estimating gaze direction robustly in naturalistic, real-world driving. The extracted coordinates will be plotted on a 2-D graph. The procedure will be implemented in Python using image-processing libraries such as OpenCV.

Training Phase:

The best procedure or algorithm will then be implemented. We will collect a real-world driving dataset from at least 30-60 drivers, including expert, novice and average drivers, all driving a specific route. We will then apply our eye-gaze tracking mechanism to this dataset to train our model.
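One hypothetical way the training phase could work is to summarise each driver's gaze trace into a fixed-length feature vector and fit a simple model. The sketch below (the 3x3 region grid, dwell-time histogram features, and nearest-centroid classifier are all illustrative assumptions, not the project's chosen method) shows the idea:

```python
import numpy as np

def gaze_histogram(points, width=640, height=480, grid=3):
    """Summarise a gaze trace as dwell-time fractions over a grid of regions.

    points: (N, 2) array of gaze/pupil coordinates in pixels.
    Returns a normalised histogram with grid*grid bins.
    """
    points = np.asarray(points, dtype=float)
    gx = np.clip((points[:, 0] * grid / width).astype(int), 0, grid - 1)
    gy = np.clip((points[:, 1] * grid / height).astype(int), 0, grid - 1)
    hist = np.bincount(gy * grid + gx, minlength=grid * grid).astype(float)
    return hist / hist.sum()

def nearest_centroid(train_feats, train_labels, query):
    """Classify a query histogram by the closest class-mean feature vector."""
    labels = sorted(set(train_labels))
    centroids = {l: np.mean([f for f, y in zip(train_feats, train_labels) if y == l], axis=0)
                 for l in labels}
    return min(labels, key=lambda l: float(np.linalg.norm(query - centroids[l])))

# Demo with synthetic sessions: the "expert" looks mostly straight ahead,
# the "novice" scatters gaze across the whole frame.
rng = np.random.default_rng(0)
expert_feats = [gaze_histogram(rng.normal([320, 120], 30, size=(200, 2))) for _ in range(3)]
novice_feats = [gaze_histogram(rng.uniform([0, 0], [640, 480], size=(200, 2))) for _ in range(3)]
feats = expert_feats + novice_feats
labels = ["expert"] * 3 + ["novice"] * 3
query = gaze_histogram(rng.normal([320, 120], 30, size=(200, 2)))
print(nearest_centroid(feats, labels, query))
```

With real data, the labels would come from the expert/novice/average groups of recorded drivers rather than synthetic traces.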

Testing Phase:

In this phase we will collect data from a previously unseen driver on the same route used for training, and use the model to assess that driver's maturity level.

Benefits of the Project

Technical Details of Final Deliverable

Dashboard Camera: The efficiency of the system will depend on the quality of the video. Since the system will operate outdoors, it is difficult to analyse images captured in direct sunlight. Eye-gaze tracking methods using corneal reflection with infrared illumination have primarily been used in indoor settings and are vulnerable to sunlight. The camera sensor should be capable of capturing images that are as sharp and noise-free as possible. Many existing methods use an active IR illumination technique to obtain bright- and dark-pupil images.

Laptop: A laptop placed inside the car will receive the data sent by the camera over Wi-Fi. Since the system does not send data over the internet, the laptop and camera must be placed in the same car.

TEST CASES:

To determine the optimal solution, each procedure will be evaluated against distinct test cases. Apart from the algorithm, the hardware of the system (camera, Wi-Fi and laptop) also needs to be optimized to provide a smooth experience. The performance and precision of each algorithm will be examined in different scenarios based on the following proposed factors.

Car Windscreen Sizes:

Bearing in mind the broad field of view of the human eye, it is essential to test with different windscreen sizes for different cars. These differ across car models such as sedans, microcars, minivans and hatchbacks.

Speedometer Position:

Given the diverse classes of cars, speedometer positions also differ. Results can vary with the speedometer's position relative to the driver's gaze, which introduces error into the output of the algorithms.

Tape Position:

Allowing for the wide range of car models, the tape position also differs from car to car. Results can vary based on these different tape positions.

Weather Conditions:

Suppose we have trained our model in sunny weather; the results will not be as accurate in cloudy weather. Results can vary across weather conditions such as rain, partial cloud and storms.

Route Conditions:

Our system's accuracy is highly dependent on road conditions. Rough roads cause driver distraction and inattention, which will introduce error into the estimated maturity level of the driver.

Final Deliverable of the Project

HW/SW integrated system

Type of Industry

IT, Transportation Technologies, Artificial Intelligence (AI)

Sustainable Development Goals

Good Health and Well-Being for People, Life on Land

Required Resources
Item Name            Type           No. of Units  Per Unit Cost (in Rs)  Total (in Rs)
Dashboard Camera     Equipment      1             25000                  25000
SSD                  Equipment      2             4900                   9800
Arduino Nano         Equipment      1             15000                  15000
Printing and Banner  Miscellaneous  2             5000                   10000
Total (in Rs)                                                            59800
