Hand gesture controlled drone using machine learning
2025-06-28 16:32:49 - Adil Khan
Project Area of Specialization: Robotics

Project Summary:
Drones are conventionally controlled using joysticks, remote controllers, mobile applications, and embedded computers. Significant issues with these approaches are that drone control is limited by the range of electromagnetic radiation and is susceptible to interference noise. In this proposed project, hand gestures will be used as the method of controlling the drone.
Computer vision-based methods rely on the ability of a drone’s camera to capture surrounding images and use pattern recognition to translate images to meaningful and actionable information.

The proposed framework involves a few key parts leading to an ultimate action: image segregation from the front camera's video stream, robust and reliable image recognition on the segregated images, and finally conversion of the classified gestures into actionable drone movements such as take-off, landing, and hovering.

- To develop a hand gesture controlled drone using machine learning.
- To investigate computer vision methods that enable an intuitive, agent-less means of communication between a drone and its operator.
- To identify all gestures that the gesture-capturing device can detect, and assign each of them to one of the drone's functions.
- To enable future applications such as surveillance of sporting events, filming, and other activities.

VIDEO STREAM IMAGE SEGREGATOR:
The video stream image segregator continuously records through the drone's on-board camera and segments the stream into sequences of still images.
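As a rough illustration, segregation might keep every Nth frame of the stream as a still image; the function name and sampling rate below are assumptions for the sketch, not details fixed by this proposal:

```python
# Minimal sketch of frame segregation: given the number of frames
# captured so far, return the indices of frames kept as still images.
# The sample rate would in practice depend on the camera's frame rate.

def segregate_frames(frame_count: int, sample_every: int = 5) -> list:
    """Keep every `sample_every`-th frame from a stream of frames."""
    return [i for i in range(frame_count) if i % sample_every == 0]
```

In a real implementation, each kept index would correspond to a decoded camera frame handed to the recognition stage.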
HAND GESTURE RECOGNITION:
Each image is then analyzed by the hand gesture recognition process, which includes three main steps:
- Hand region identification
- Feature extraction
- Gesture classification
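The three steps can be sketched on a toy binary image (nested lists of 0/1 pixels). The logic here is purely illustrative placeholder code, not the proposed recognition model:

```python
# Step 1 (assumed): bounding box of nonzero pixels as the hand region.
def identify_hand_region(img):
    rows = [r for r, row in enumerate(img) if any(row)]
    cols = [c for c in range(len(img[0])) if any(row[c] for row in img)]
    return rows[0], rows[-1], cols[0], cols[-1]

# Step 2 (assumed): simple shape features from the hand region.
def extract_features(img, box):
    rmin, rmax, cmin, cmax = box
    area = sum(sum(row[cmin:cmax + 1]) for row in img[rmin:rmax + 1])
    height, width = rmax - rmin + 1, cmax - cmin + 1
    return {"area": area, "aspect": width / height}

# Step 3 (assumed): toy rule in place of a trained classifier.
def classify_gesture(features):
    return "open_palm" if features["aspect"] >= 1 else "fist"
```

In the actual system, step 3 would be a machine learning classifier trained on labelled gesture images rather than a hand-written rule.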
COMMAND MAPPER:
A command mapper translates the detected gesture into a drone command, such as take-off, land, or back off.
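A minimal sketch of such a mapper is a lookup table; the gesture names, commands, and the hover-by-default fallback are all assumptions for illustration:

```python
# Hypothetical gesture-to-command table (names are placeholders).
GESTURE_COMMANDS = {
    "thumbs_up": "take_off",
    "fist": "land",
    "open_palm": "hover",
    "swipe_back": "back_off",
}

def map_gesture(gesture: str) -> str:
    # Unrecognized gestures fall back to hovering as a safe default
    # (an assumption, not a requirement stated in the proposal).
    return GESTURE_COMMANDS.get(gesture, "hover")
```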
ACTION PLANNER:
An action planner takes the command as its input and computes the corresponding course of primitive actions to satisfy the command.
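The planning step might be sketched as expanding each command into an ordered list of primitive actions; the primitive names below are illustrative assumptions:

```python
# Hypothetical command -> primitive-action plans.
PLANS = {
    "take_off": ["arm_motors", "ascend_to_altitude", "stabilize"],
    "land": ["descend", "touch_down", "disarm_motors"],
    "back_off": ["pitch_backward", "hold_distance", "stabilize"],
    "hover": ["hold_position"],
}

def plan_actions(command: str) -> list:
    # Unknown commands resolve to holding position (an assumed default).
    return PLANS.get(command, ["hold_position"])
```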
Benefits of the Project:
The goal of this proposed project is to enable hand gesture based control with the highest possible accuracy even on modest drones, which state-of-the-art drones would otherwise outperform thanks to their higher built-in camera resolutions.
A set of experiments will be conducted to measure gesture recognition accuracy under the major sources of scene variability: illumination, background, and distance.
Classification accuracies of different machine learning classifiers will be analyzed for gesture recognition in well-lit conditions, against a clear background, and within 3 ft.
Limitations of framework and feasible solutions for better gesture recognition will be studied.
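The per-classifier comparison reduces to computing accuracy over a labelled test set; a minimal sketch (with dummy predictions standing in for real classifier output) is:

```python
# Accuracy = fraction of test gestures whose predicted label matches
# the ground-truth label. Labels here are placeholder strings.

def accuracy(predicted, actual):
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)
```

Running this per classifier and per condition (illumination, background, distance) would produce the comparison table planned above.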
Technical Details of Final Deliverable:
- Software simulation results
- Comparative study
- HW/SW integrated system
| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Quadcopter (Opensource UAV) | Equipment | 1 | 35000 | 35000 |
| Raspberry Pi Module | Equipment | 3 | 6500 | 19500 |
| Raspberry Pi Casing | Equipment | 3 | 500 | 1500 |
| Raspberry Pi Power Adapter | Equipment | 3 | 600 | 1800 |
| Raspberry Pi Compatible LCD | Equipment | 1 | 6000 | 6000 |
| Raspberry Pi Compatible Camera | Equipment | 1 | 3000 | 3000 |
| Sensor Modules | Equipment | 3 | 1000 | 3000 |
| Overhead Expenditure | Miscellaneous | 1 | 5000 | 5000 |
| Stationery | Miscellaneous | 1 | 3000 | 3000 |
| Total (in Rs) | | | | 77800 |