Adil Khan 9 months ago
AdiKhanOfficial #FYP Ideas


Project Title

Gesture Control Robotic Car with Object placing hand

Project Area of Specialization

Robotics

Project Summary

Gesture recognition and teleoperation are widely applicable in the fields of animation, avatar creation, and control. Technology now plays a role in creating virtual environments with virtual elements that work collaboratively with real-world objects: a tangible robot or machine can be controlled intangibly through gestures. The idea of teleoperating human gestures onto a humanoid robot, in order to train such robots and study the learning outcomes, comes from applying a Leap Motion Controller (LMC) and a NAO robot. The Leap Motion is the medium that transmits the gestures, and the NAO robot is the target machine that verifies the teleoperation (Staretu and Moldovan, 2016). Both devices are very useful for elementary learning purposes (Mubin & Ahmad, 2016). Controlling a humanoid robot through gestures is a practical approach, although machine learning concepts are also widely used for this (Rodriguez et al., 2014). Generally, teleoperation of gestures is a coordinate-to-coordinate mapping (Shamsuddin et al., 2012): coordinate sets of human body joints are mapped to the robot's body joints, and the robot then reacts according to the human's action. To simplify this coordinate-to-coordinate mapping, the LMC can quickly transfer human hand gestures to a humanoid robot and control it. The principal objective of the work was to teleoperate human hand gestures onto a humanoid robot.

While gesture recognition may reduce the need for handheld devices, another avenue could lead to a vast infusion of specialized input devices. In the future, there may be custom devices for virtually every type of activity in a virtual environment. It may even be possible for some devices to change their shape to better mimic their real-world counterparts; for instance, a golf-club prop might shift its perceived center of mass to simulate a wood, iron, or putter. This may be particularly useful in training environments, where the participant can interact with the same devices they will use in the real world.

Project Objectives

The main objective is to develop a hand-gesture-based robotic vehicle with an end effector that picks and places objects, controlled over nRF communication, to increase the autonomy of physically impaired people by using MEMS technology. The result is a compact embedded device whose sensors sense the activity of the human hand and capture motion information from accelerations in the form of analog voltages. The approach focuses primarily on the task of grasping objects of different shapes, rather than manipulating or assembling them. This type of grasping device has a variety of applications: object-retrieval systems for the handicapped, planetary and underwater exploration, and robotic surgery.

A further objective is to control the displacement of the robotic arm so that it can pick and place objects from any source to any destination.

The robotic platform and the robotic arm are controlled by two separate accelerometers: one mounted on the user's hand and another on the user's leg, capturing gestures and postures so that the robotic arm and platform move accordingly. The platform moves forward, backward, right, and left; the arm performs pick & place/drop as well as raising and lowering of objects. The RF module transmits the different hand and leg gestures made by the user. The system is also equipped with an IP-based camera that can stream real-time video wirelessly to any Internet-enabled device such as a mobile phone, laptop, or tablet. The biggest advantage of this kind of robotic arm is that it can work in hazardous areas and in areas that humans cannot access, and it can also be used for highly precise medical treatments.

Project Implementation Method

The transmitter section consists of an Arduino Uno Rev3 SMD, a 3-axis accelerometer (ADXL335), a flex sensor, push-button switches, and an nRF24L01 2.4 GHz transmitter module. A separate 5 V power supply, either a battery source or a laptop/PC connection, is applied to the Arduino controller. The Arduino Uno, flex sensor, push-button switches, and transmitter module receive the 5 V supply.

The ADXL335 is a small, thin, low-power, complete 3-axis accelerometer with signal-conditioned voltage outputs. It has six pins: power supply (VCC), ground (GND), self-test (ST), and one output each for the X, Y, and Z axes. By tilting the accelerometer along a measured axis, one can read the gravitational force relative to the amount of tilt; the sensor measures the force applied to it along all three axes. The three X, Y, and Z values it provides are calibrated to the four movement types plus a stopped position at center, using error margins in each axis direction.
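The tilt-to-command step described above can be modelled with a short host-side sketch (this is an illustration of the logic, not the project's Arduino firmware; the center value and dead-zone width are assumed, not measured from this build):

```python
# Hypothetical model of classifying ADXL335 tilt into movement commands.
# A 10-bit ADC reads ~CENTER counts per axis at rest; tilting shifts X/Y.
CENTER = 512       # assumed at-rest ADC reading
DEAD_ZONE = 60     # assumed counts of tilt ignored around the center

def classify_tilt(x, y):
    """Map raw X/Y ADC counts to one of five movement commands."""
    dx, dy = x - CENTER, y - CENTER
    if abs(dx) <= DEAD_ZONE and abs(dy) <= DEAD_ZONE:
        return "STOP"                     # hand held level: stopped position
    if abs(dy) > abs(dx):                 # the more-tilted axis wins
        return "FORWARD" if dy > 0 else "BACKWARD"
    return "RIGHT" if dx > 0 else "LEFT"
```

For example, `classify_tilt(512, 700)` yields `"FORWARD"`, while readings within the dead zone yield `"STOP"`.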

The flex sensor is a variable resistor whose resistance changes with bending, i.e. flexing. It detects bending movements, can be made unidirectional or bidirectional, and works on the principle of change of resistance. Physically, it is a strip of carbon-based resistive material on a thin substrate, with metal pads, that measures the amount of deflection caused by bending the sensor.
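Since the Arduino only sees the flex sensor through a voltage divider, recovering the bend amounts to inverting the divider equation. A minimal sketch of that arithmetic follows; the fixed-resistor value and the flat/bent resistances are typical datasheet-style assumptions, not values measured in this project:

```python
# Assumed voltage divider: flex sensor on the VCC side, fixed resistor to GND,
# analog pin reads the voltage across the fixed resistor.
VCC = 5.0
R_FIXED = 47_000    # ohms, assumed pull-down
R_FLAT = 25_000     # ohms, assumed sensor resistance when flat
R_BENT = 100_000    # ohms, assumed sensor resistance at full bend

def flex_resistance(v_out):
    """Recover the sensor resistance from the divider output voltage."""
    return R_FIXED * (VCC - v_out) / v_out

def bend_fraction(v_out):
    """Linearly interpolate 0.0 (flat) .. 1.0 (fully bent) from resistance."""
    t = (flex_resistance(v_out) - R_FLAT) / (R_BENT - R_FLAT)
    return min(1.0, max(0.0, t))
```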

The Arduino Uno SMD is a version of the Arduino Uno that uses a surface-mount Atmega328P rather than the through-hole version. The board has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button.

The robotic arm is a mechanical arm fixed to the receiver part of the circuit; its control signals come from the Arduino output, driven by the gesture input captured by the accelerometer on the transmitter side. The arm can rotate through 360 degrees about its axis, being mounted on one of the DC geared motors, and its rotation and movement follow the gesture input from the transmitter section. One segment of the arm is about 20 cm long and the other (elbow) segment is 30 cm long. A load of the desired weight can be applied at the end of the arm for lifting objects. A servo motor is used for gripping and placing the objects, since servo motors suit this low-torque application.
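To illustrate the servo-driven gripping step, here is a small sketch mapping a normalised bend value (as produced by the flex sensor) to a gripper servo angle. The 0-60 degree travel is an assumed range for illustration, not the project's measured gripper travel:

```python
# Assumed servo travel for the gripper (degrees).
SERVO_OPEN_DEG = 0.0
SERVO_CLOSED_DEG = 60.0

def gripper_angle(bend):
    """Map bend fraction (0.0 = open hand, 1.0 = closed fist) to servo angle."""
    bend = min(1.0, max(0.0, bend))   # clamp noisy readings into range
    return SERVO_OPEN_DEG + bend * (SERVO_CLOSED_DEG - SERVO_OPEN_DEG)
```

On the real hardware this angle would be handed to the servo library (e.g. a `servo.write()` call on the Arduino side); here it simply demonstrates the linear mapping.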

Benefits of the Project

Benefits of gesture recognition include improved safety, since drivers do not have to take their attention off the road as much as they would with touch controls, and the simple convenience of being able to control vehicle functions with deliberate gestures rather than a potentially complex menu scheme.

The desktop computing paradigm limits the users' flexibility by forcing them to interact using a 2-Degree-Of-Freedom device (the mouse), while they are used to interacting with the physical world in much more differentiated ways (Bellucci, Malizia & Aedo, 2014). Gestures allow the user to handle multiple points of input and even define several parameters at once. They are, therefore, a more natural form of communication.

Unlike traditional buttons and menus, gestures do not interrupt the user's activity by forcing them to move their hand to the location of a command; instead, they can be performed directly from the current cursor position (Bau & Mackay, 2008).

Also, they do not require any additional devices: the command and even its parameters can be specified by a simple hand movement (Baudel & Beaudouin-Lafon, 1993). Input devices narrow down the user's possibilities of interaction, for example a pen or a mouse limiting the potential forms of input to single-touch interaction. Gestures that are performed with the user's hands however, can be versatile and do not have these constraints. As Wobbrock et al. put it: "almost anything one can do with one's hands could be a potential gesture" (Wobbrock, Morris & Wilson, 2009). This includes not only the movement or the followed path of the hand, but the movement and position of every finger as well as the general hand posture. (Brandl, Forlines, Wigdor, Haller & Shen, 2008)

Gestures feel very natural to perform since they mirror our experiences in the real world.

That may be why a study by Watson et al. showed that participants using touch input for a task enjoyed themselves more and also felt more competent than participants using a mouse. They systematically favoured direct touch input over mouse input and also performed better in terms of speed and accuracy (Watson, Hancock, Mandryk & Birk, 2013).

In addition, Cao, Ofek and Vronay found that gesture-controlled presentations were not only perceived as more enjoyable by the presenter but also as more attractive by the audience. The presenters were able to make eye contact more often and to use their body language to convey information.

Technical Details of Final Deliverable

The vehicle has four wheels connected to DC geared motors and controlled by the switches on the transmitter. The four switches correspond to the four operations (Forward, Backward, Left, and Right), wired in series; when the user presses one, it sends a voltage signal to the transmitter-side Arduino. The nRF communication system handles transmission and reception: the Arduino checks the inputs and sends a 4-bit code to the encoder IC, the encoder passes the data to the nRF transmitter, and the transmitted data is received by the nRF receiver. The receiver sends the 4-bit code to the decoder IC, which passes it to the motor-driver IC; the motor driver then turns the two motors in the direction required by the pressed key. The L293D motor-driver IC interfaces the wheels of the vehicle, driving two motors per chip; in total, six DC geared motors and a servo motor are used on the vehicle.
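The 4-bit command path above can be sketched as a simple encode/decode pair. The bit assignments below are hypothetical, chosen for illustration; the actual encoder/decoder ICs fix their own pin-to-bit mapping:

```python
# Hypothetical 4-bit command table for the nRF link (assumed assignments).
COMMANDS = {
    "STOP":     0b0000,
    "FORWARD":  0b0001,
    "BACKWARD": 0b0010,
    "LEFT":     0b0011,
    "RIGHT":    0b0100,
    "GRIP":     0b0101,
    "RELEASE":  0b0110,
}
DECODE = {code: name for name, code in COMMANDS.items()}

def encode(command):
    """Pack a named command into the 4-bit code sent over the link."""
    return COMMANDS[command] & 0x0F

def decode(nibble):
    """Unpack a received 4-bit code; unknown codes fail safe to STOP."""
    return DECODE.get(nibble & 0x0F, "STOP")
```

Failing safe to `STOP` on an unrecognised code is a design choice worth keeping in any real firmware: a corrupted frame should halt the vehicle rather than pick an arbitrary direction.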

The experimental setups of the transmitter and receiver prototypes are shown in figures 7 and 8 respectively. Experiments were done in two parts. In the first part, we tested the transmitter module separately by connecting the Arduino controller to a PC/laptop and observed, in the Arduino IDE, the output for different movements of the accelerometer and different bendings of the flex sensor. In the second phase, we tested the receiver module by giving inputs from the transmitter and successfully observed the displacement of the robotic arm, the opening and closing of the grippers, and the movement of the robotic platform in all four directions. The observations show that the movement of the robot is precise, accurate, very easy to control, and user-friendly. The performance of the robotic arm and vehicle was checked using different hand movements. The arm design is very simple, can grasp lightweight objects, and mimics the hand gestures almost flawlessly; it can perform complex and hazardous operations with ease. The assistive robot system was thus rated positively in the subjective evaluation: participants perceived the hand-gesture control as easy, covering the robotic vehicle movements, the robotic arm motions, and the opening and closing of the grippers. The system is well suited to pick-and-place operations, and using this framework a non-expert robot programmer can control the robot quickly in a natural way.

Final Deliverable of the Project

Hardware System

Core Industry

Manufacturing

Other Industries

Core Technology

Robotics

Other Technologies

Sustainable Development Goals

Affordable and Clean Energy, Industry, Innovation and Infrastructure, Responsible Consumption and Production

Required Resources

Item Name               Type        No. of Units   Per Unit Cost (in Rs)   Total (in Rs)
Gesture control robot   Equipment   1              50000                   50000
Total (in Rs)                                                              50000
If you need this project, please contact me on contact@adikhanofficial.com