Hand gesture controlled drone using machine learning


2025-06-28 16:32:49 - Adil Khan

Project Title

Hand gesture controlled drone using machine learning

Project Area of Specialization: Robotics

Project Summary

Drones are conventionally controlled using joysticks, remote controllers, mobile applications, and embedded computers. A few significant issues with these approaches are that drone control is limited by the range of electromagnetic radiation and is susceptible to interference noise. In this proposed project, hand gestures will be used as the method to control the drone.

Computer vision-based methods rely on the ability of a drone’s camera to capture surrounding images and use pattern recognition to translate images to meaningful and actionable information.


The proposed framework involves a few key parts toward an ultimate action to be taken. They are: image segregation from the video streams of front camera, creating robust and reliable image recognition based on segregated images, and finally conversion of classified gestures into actionable drone movement, such as takeoff, landing, hovering and so forth.
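The stages above can be sketched end to end as a simple pipeline. Everything below (function names, gesture labels, the command table) is an illustrative assumption, not part of the original design; frames are stubbed as dictionaries so the flow is visible without camera hardware:

```python
# Minimal sketch of the proposed pipeline: frames in, drone commands out.
# All names and gesture labels here are illustrative assumptions.

GESTURE_COMMANDS = {
    "open_palm": "takeoff",
    "fist": "land",
    "flat_hand": "hover",
}

def recognize_gesture(frame):
    """Placeholder recognizer: a real system would run a trained model here."""
    return frame.get("gesture")  # frames are stubbed as dicts in this sketch

def process_stream(frames):
    """Translate a sequence of frames into a sequence of drone commands."""
    commands = []
    for frame in frames:
        gesture = recognize_gesture(frame)
        if gesture in GESTURE_COMMANDS:
            commands.append(GESTURE_COMMANDS[gesture])
    return commands

# Example: two recognized gestures, one frame with no detected hand.
stream = [{"gesture": "open_palm"}, {"gesture": None}, {"gesture": "fist"}]
print(process_stream(stream))  # -> ['takeoff', 'land']
```

In the real framework each placeholder would be replaced by the corresponding component described below.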


Project Objectives

Project Implementation Method


VIDEO STREAM IMAGE SEGREGATOR:  

The video stream image segregator continuously records through the drone's onboard camera and segments the stream into a sequence of still images.
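A minimal sketch of this step, assuming frames arrive as an iterable and are sampled at a fixed stride (a real implementation would read from the drone's camera, for example via OpenCV, rather than from an in-memory list):

```python
def segregate(frames, stride=5):
    """Keep every `stride`-th frame from the video stream,
    turning a continuous stream into a sequence of still images."""
    return [f for i, f in enumerate(frames) if i % stride == 0]

# 30 frames sampled at stride 5 -> 6 still images (indices 0, 5, 10, 15, 20, 25)
stills = segregate(list(range(30)), stride=5)
print(stills)  # -> [0, 5, 10, 15, 20, 25]
```

Sampling at a stride rather than classifying every frame keeps the downstream recognition load manageable on modest onboard hardware.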


HAND GESTURE RECOGNITION: 

Each image is then analyzed through the hand gesture recognition process, which includes three main steps:
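The proposal does not enumerate the three steps; a common decomposition in vision-based gesture recognition is hand segmentation, feature extraction, and classification. The sketch below uses that assumed decomposition with deliberately simple stand-in logic (thresholding, pixel-fill ratio, a fixed rule) in place of a trained model:

```python
# Hypothetical three-step recognizer: segment -> extract features -> classify.
# The concrete steps and thresholds are assumptions, not from the proposal.

def segment_hand(image, threshold=128):
    """Step 1: binarize the grayscale image so bright (hand) pixels become 1."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def extract_features(mask):
    """Step 2: reduce the mask to a feature, here the hand-pixel fraction."""
    total = sum(len(row) for row in mask)
    return sum(map(sum, mask)) / total

def classify(fill_ratio):
    """Step 3: map the feature to a gesture label (stand-in rule, not a model)."""
    return "open_palm" if fill_ratio > 0.5 else "fist"

image = [[200, 200], [200, 50]]  # 3 of 4 pixels bright
mask = segment_hand(image)
print(classify(extract_features(mask)))  # -> open_palm
```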


COMMAND MAPPER: 

A command mapper translates the detected gesture into a command, such as take-off, land, or back off.
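At its core this can be a lookup table, but because single-frame classifications are noisy, one plausible design (an assumption, not stated in the proposal) is to debounce: only emit a command after the same gesture has been seen in several consecutive frames:

```python
from collections import deque

# Illustrative gesture-to-command table; the actual mapping is a design choice.
COMMAND_MAP = {"open_palm": "takeoff", "fist": "land", "swipe": "back_off"}

def make_mapper(stable_frames=3):
    """Return a mapper that emits a command only after the same gesture
    has been detected in `stable_frames` consecutive frames."""
    recent = deque(maxlen=stable_frames)

    def mapper(gesture):
        recent.append(gesture)
        if len(recent) == stable_frames and len(set(recent)) == 1:
            return COMMAND_MAP.get(gesture)
        return None  # not yet stable, or unknown gesture

    return mapper

mapper = make_mapper()
results = [mapper(g) for g in ["fist", "fist", "fist", "open_palm"]]
print(results)  # -> [None, None, 'land', None]
```

Debouncing trades a small amount of latency for robustness against one-frame misclassifications, which matters when a spurious "land" could abort a flight.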


ACTION PLANNER: 

An action planner takes the command as its input and computes the corresponding course of primitive actions to satisfy the command.
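The planner's job is to expand each high-level command into an ordered sequence of primitives. The primitive names below are assumptions for illustration; a real planner would emit whatever motion primitives the flight controller exposes:

```python
# Hypothetical expansion of high-level commands into primitive actions.
ACTION_PLANS = {
    "takeoff": ["arm_motors", "ascend_to_altitude", "hover"],
    "land": ["descend", "touch_down", "disarm_motors"],
    "back_off": ["pitch_backward", "hover"],
}

def plan(command):
    """Return the primitive-action sequence that satisfies a command."""
    if command not in ACTION_PLANS:
        raise ValueError(f"no plan for command: {command!r}")
    return list(ACTION_PLANS[command])

print(plan("takeoff"))  # -> ['arm_motors', 'ascend_to_altitude', 'hover']
```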

Benefits of the Project

The goal of this proposed project is to enable hand-gesture-based control with the highest possible accuracy even on modest drones, which are otherwise easily outperformed by state-of-the-art drones with their high-resolution onboard cameras.

A set of experiments will be conducted to measure gesture recognition accuracy under the major sources of scene variability: illumination, background, and distance.

Classification accuracies of different machine learning classifiers will be analyzed for gesture recognition in well-lit scenes, against clear backgrounds, and within 3 ft.
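The comparison could follow the evaluation loop sketched below. The synthetic one-dimensional "features" and the two toy classifiers (nearest-neighbour and nearest-centroid) are assumptions chosen purely to show the loop; the actual study would use real extracted hand features and standard library classifiers:

```python
# Toy accuracy comparison on synthetic 1-D features, illustrating only the
# evaluation loop. Data, features, and classifiers are stand-in assumptions.

train = [(0.1, "fist"), (0.2, "fist"), (0.8, "open_palm"), (0.9, "open_palm")]
test = [(0.15, "fist"), (0.85, "open_palm"), (0.4, "fist")]

def nn_predict(x):
    """1-nearest-neighbour: label of the closest training point."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

def centroid_predict(x):
    """Nearest-centroid: label whose class mean is closest."""
    groups = {}
    for value, label in train:
        groups.setdefault(label, []).append(value)
    centroids = {lbl: sum(vs) / len(vs) for lbl, vs in groups.items()}
    return min(centroids, key=lambda lbl: abs(centroids[lbl] - x))

def accuracy(predict):
    """Fraction of test samples the classifier labels correctly."""
    return sum(predict(x) == y for x, y in test) / len(test)

for name, clf in [("1-NN", nn_predict), ("centroid", centroid_predict)]:
    print(f"{name}: {accuracy(clf):.2f}")
```

The same loop, applied per condition (lighting, background, distance), yields the per-classifier accuracy tables the experiments call for.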

Limitations of the framework, and feasible solutions for better gesture recognition, will be studied.

Technical Details of Final Deliverable

Final Deliverable of the Project: HW/SW integrated system
Core Industry: IT
Other Industries: Others
Core Technology: Robotics
Other Technologies: (none)
Sustainable Development Goals: Industry, Innovation and Infrastructure

Required Resources
| Item Name                      | Type          | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|--------------------------------|---------------|--------------|-----------------------|---------------|
| Quadcopter (Opensource UAV)    | Equipment     | 1            | 35000                 | 35000         |
| Raspberry Pi Module            | Equipment     | 3            | 6500                  | 19500         |
| Raspberry Pi Casing            | Equipment     | 3            | 500                   | 1500          |
| Raspberry Pi Power Adapter     | Equipment     | 3            | 600                   | 1800          |
| OverHead Expenditure           | Miscellaneous | 1            | 5000                  | 5000          |
| Stationery                     | Miscellaneous | 1            | 3000                  | 3000          |
| Raspberry Pi Compatible LCD    | Equipment     | 1            | 6000                  | 6000          |
| Sensors Modules                | Equipment     | 3            | 1000                  | 3000          |
| Raspberry Pi Compatible Camera | Equipment     | 1            | 3000                  | 3000          |
| **Total (in Rs)**              |               |              |                       | **77800**     |
