Gesture Based Gun Maneuvering with Target Tracking
This project deals with maneuvering a gun using hand gestures, together with autonomous locking and tracking of a target. The main idea is to give the gun a degree of smartness and autonomy so that it can operate accurately and efficiently without any physical contact between the operator and the gun, or even in the operator's absence, by automatically detecting, locating, locking and tracking the target and intelligently deciding whether or not to engage it. This objective is achieved by implementing a gesture-controlled mechanism that allows the user to control the movements of the gun while remaining far away and hidden from the weapon's surroundings. Computer vision and image processing techniques are used to extract information from the user's gestures. Similarly, the decision to engage or unlock the target is based on the detection of a specific pattern. In addition, live video from the scene where the gun is mounted is transmitted wirelessly to the user, so that the decision to move the gun can be made on the basis of what is happening in its surroundings.
The main objective of this project is, first, to change traditional concepts of gun control by introducing gesture-based operation of the gun, and second, to introduce smartness into guns through automatic tracking, locking and engagement of targets. Gesture-based operation will ultimately help save precious human lives, which are always in danger when an operator is in direct contact with the gun. Similarly, automatic tracking, locking and engagement of the target will improve the accuracy and efficiency of gun operations.
This project is divided into three major parts:
• Gesture-based control of the horizontal and vertical motions of the gun
• Auto-tracking of the target
• Deciding whether or not to engage the locked target
These three parts are designed and implemented separately and then integrated into the final product. In the first part, the gun operator provides an input gesture, which is recognized by the gesture recognition algorithm, and the corresponding movement signal for the motors is generated. This signal is encoded and transmitted wirelessly to the microcontroller controlling the motors. The microcontroller decodes the signal and moves the motors accordingly. This is the manual mode of the gun.
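As a rough illustration of the encode/transmit/decode step described above, the following sketch packs a gesture-derived movement command into a small byte packet and unpacks it on the receiving side. The 3-byte format (mode flag, signed pan step, signed tilt step in degrees) is an assumption for illustration, not the project's actual wire protocol.

```python
import struct

MANUAL_MODE = 0x01  # hypothetical mode flag for gesture-driven control

def encode_command(pan_deg: int, tilt_deg: int) -> bytes:
    """Operator side: pack a movement command for wireless transmission.

    pan_deg and tilt_deg are signed step sizes in degrees (-128..127).
    """
    return struct.pack(">Bbb", MANUAL_MODE, pan_deg, tilt_deg)

def decode_command(packet: bytes):
    """Microcontroller side: unpack the packet back into motor steps."""
    mode, pan_deg, tilt_deg = struct.unpack(">Bbb", packet)
    return mode, pan_deg, tilt_deg
```

For example, `encode_command(10, -5)` produces a 3-byte packet that decodes back to `(MANUAL_MODE, 10, -5)`, so the receiving microcontroller can step the pan motor +10 degrees and the tilt motor -5 degrees.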
In the second part, auto-tracking of the target, the target is first detected and then followed by the tracking algorithm. The tracking algorithm generates motor signals according to the movement of the target, so the gun turns toward the target and follows it continuously. This is the auto mode of the gun. In the third part, the gun decides whether to engage the target in auto mode or to leave it. This decision is made after classifying the target as friend or enemy: if the target is a friend, the gun unlocks it, and if the target is an enemy, the gun fires at it.
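The auto mode described above can be sketched as a simple proportional control loop: the tracker reports the target centroid in the camera frame, the controller turns the pixel error into pan/tilt corrections, and the friend/enemy classification gates the fire decision. The frame size, gains and classifier labels below are assumptions for illustration, not the project's actual values.

```python
# Assumed camera frame size and proportional gains (illustrative only).
FRAME_W, FRAME_H = 640, 480
K_PAN, K_TILT = 0.05, 0.05

def track_step(cx: int, cy: int):
    """Return (pan, tilt) corrections, in degrees, that steer the gun
    so the target centroid (cx, cy) moves toward the frame centre."""
    err_x = cx - FRAME_W // 2   # positive: target is right of centre
    err_y = cy - FRAME_H // 2   # positive: target is below centre
    return K_PAN * err_x, K_TILT * err_y

def fire_decision(label: str) -> bool:
    """Engage only targets classified as enemy; friends are unlocked."""
    return label == "enemy"
```

A target detected 100 pixels right of centre, for instance, yields a +5 degree pan correction with these gains, and `fire_decision` keeps the laser (the fire indicator) off unless the classifier reports "enemy".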
The applications of this project are vast, and the techniques used can be adapted for many different purposes. Some applications are briefly described below.
1 Surveillance of Sensitive Organizations
This project allows automatic operation of the gun without the need for an operator, so it can be used for surveillance of sensitive organizations, where a small human error can lead to catastrophic results. Features such as automatic tracking, locking and engagement of targets can be used to monitor such organizations.
2 Domestic Security Purposes
Since this project allows both gesture-based and automatic operation of the gun, it can be used for domestic security, where users do not need to know how to operate a gun. All they have to do is give the input signal, i.e. in which direction the gun should move and by how many degrees.
3 Alternative for Injured and Disabled Soldiers
The gesture-based operation of the gun allows soldiers to keep using it even after an injury or disability, making it a good alternative for injured and disabled soldiers.
The final working prototype is a gun mounted on a frame that moves in both the horizontal and vertical directions. A camera mounted on the gun provides live feedback of the action area. A laser mounted on the gun acts as a fire indicator instead of actual firing: the laser switches on and off to indicate a shot.
| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Raspberry pi 3B+ | Equipment | 1 | 10000 | 10000 |
| Gun frame | Equipment | 1 | 20000 | 20000 |
| Pi camera | Equipment | 1 | 3000 | 3000 |
| Motors | Equipment | 2 | 1000 | 2000 |
| Motor drivers | Equipment | 3 | 1500 | 4500 |
| **Total (in Rs)** | | | | **39500** |