Autonomous Navigation Differential Drive Robot

Design and construction of an autonomous navigation robot with a differential drive configuration that can map a guided area, generate the map of the area in real time on a command station and, given a desired destination, navigate to its final coordinates.

2025-06-28 16:30:32 - Adil Khan

Project Title

Autonomous Navigation Differential Drive Robot

Project Area of Specialization

Robotics

Project Summary

Design and construction of an autonomous navigation robot with a differential drive configuration that can map a guided area, generate the map of the area in real time on a command station and, given a desired destination, navigate to its final coordinates using advanced control theory while avoiding obstacles.

Project Objectives

The goal is to design an autonomous navigation robot that can reach a desired location in a dynamic environment using low-cost sensors. Our objective is to understand modern autonomous navigation systems and implement one on a small scale to test and tune for the best land navigation control. Another reason to undertake this project is to gain the ability to formulate a mathematical model of a robot. Finally, upon completing this project we will be able to design an autonomous navigation robot of any configuration, tailored to the task.

Project Implementation Method

Simultaneous Localization and Mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously tracking the robot's position within it. It allows a robot to navigate indoors and outdoors autonomously. There are many approaches to SLAM, such as FastSLAM, GraphSLAM, point-cloud methods, ORB-SLAM and more; the choice of technique depends on hardware capability. For our project, we intend to use low-cost sensors for the autonomous robot. The Robot Operating System (ROS) allows a user to integrate existing sensors such as motor encoders, an MPU and a vision sensor; after data acquisition from the sensors, the SLAM algorithm creates the map and provides a navigation plan. Obstacle avoidance and odometry are handled by an Arduino Mega. There are many vision sensors for mapping. One such sensor is LIDAR (Light Detection and Ranging), which acquires laser range data through continuous 360° rotation. Similarly, the Kinect has its own range sensor and can mimic a LIDAR, but it has to stop momentarily and rotate to collect data. Due to limited computational power, Extended Kalman Filter (EKF) SLAM is used, which improves navigation accuracy by fusing odometry from the encoders with accelerometer readings in the x and y directions.
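The EKF fusion step described above can be sketched in a simplified one-dimensional form. This is a minimal illustration, not the actual ROS EKF package: the accelerometer reading drives the motion-model prediction, and the encoder-derived position measurement corrects it. The noise values `q` and `r` and the measurement setup are placeholder assumptions.

```python
import numpy as np

def ekf_fuse(z_enc, a_imu, x, P, dt, q=0.05, r=0.02):
    """One EKF predict/update cycle along a single axis.
    State x = [position, velocity]; a_imu is the accelerometer input,
    z_enc is the encoder-derived position measurement."""
    # Predict: constant-acceleration motion model
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    x = F @ x + B * a_imu
    P = F @ P @ F.T + q * np.eye(2)          # q: assumed process noise
    # Update with the encoder-derived position
    H = np.array([[1.0, 0.0]])
    y = z_enc - (H @ x)[0]                   # innovation
    S = (H @ P @ H.T)[0, 0] + r              # innovation covariance, r: assumed measurement noise
    K = (P @ H.T)[:, 0] / S                  # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P

# Example: robot moving at a constant 0.2 m/s, noiseless readings for illustration
x, P = np.zeros(2), np.eye(2)
for k in range(1, 51):
    x, P = ekf_fuse(z_enc=0.2 * k * 0.1, a_imu=0.0, x=x, P=P, dt=0.1)
```

In the full robot, one such filter per axis (or a joint 2-D state) fuses the encoder odometry with the MPU accelerometer, which is what reduces drift compared with dead reckoning alone.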

Benefits of the Project

The fourth industrial revolution is a rapid transformation in the design, manufacture, operation and maintenance of manufacturing systems, causing a sudden jump in productivity and changing human life across the whole world. It is the successor of the three earlier industrial revolutions. Autonomous robots play an important role in benefiting both businesses and organisations through their effectiveness. While the human workforce is tasked with higher cognition, autonomous robots perform repetitive and tedious tasks such as warehouse optimization, border patrol, site surveillance, medical service, courier delivery and other daily routine work. In addition, autonomous robots can operate in hazardous environments such as radioactive sites, sewage pipes and firefighting, and carry out tasks that are unsuitable for humans or beyond our reach. SLAM is at the heart of advanced driver-assistance systems (ADAS), a technology that enables a car to navigate traffic and reach its destination without human intervention.

Technical Details of Final Deliverable

The map is generated from the Kinect sensor and plotted in ROS Indigo installed on the laptop. Command inputs from point A to point B are given, upon which navigation coordinates are sent to the Raspberry Pi. The EKF-SLAM package installed on the Pi sends a position estimate to the Arduino Mega. The two DC motors then receive the desired heading and distance to cover from the Arduino. For greater accuracy, MPU-6050 accelerometer data is fused with the encoder readings.
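The encoder odometry the Arduino maintains can be illustrated with standard differential-drive kinematics. This is a sketch only: the 0.3 m wheel base and the tick-to-distance conversion are assumed values, not measured parameters of the robot.

```python
import math

def update_odometry(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckon the pose from per-wheel travelled distances
    (encoder ticks already converted to metres)."""
    d_center = (d_left + d_right) / 2.0          # distance of the midpoint
    d_theta = (d_right - d_left) / wheel_base    # heading change
    # Advance along the average heading over the interval
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return x, y, theta

# Example: drive straight 1 m, then pivot 90 degrees in place
pose = (0.0, 0.0, 0.0)
pose = update_odometry(*pose, d_left=1.0, d_right=1.0, wheel_base=0.3)
quarter = (math.pi / 2) * 0.3 / 2   # wheel travel needed for a 90-degree pivot
pose = update_odometry(*pose, d_left=-quarter, d_right=quarter, wheel_base=0.3)
```

Running these equations at each encoder interval gives the odometry stream that the EKF then corrects with the accelerometer data.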


Final Deliverable of the Project

Hardware System

Type of Industry

Transportation Technologies, Robotics

Sustainable Development Goals

Decent Work and Economic Growth; Industry, Innovation and Infrastructure

Required Resources

Elapsed time since project start (months) | Milestone | Deliverable
Month 1 | Literature review | Fundamentals of SLAM
Month 2 | Components survey | Purchase of components
Month 3 | CAD model and prototype design | SolidWorks model and complete hardware structure
Month 4 | Calibration of sensors and Arduino interface; tune PID of motors | Localization without vision sensor; ability to navigate indoors
Month 5 | Fusion of vision and odometry; install ROS and implement SLAM | Mapping and autonomous navigation
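The Month 4 milestone includes tuning the PID controllers of the motors. A minimal discrete PID loop of the kind that would run on the Arduino can be sketched as follows; the gains and the first-order motor model are illustrative assumptions, not tuned values for this robot.

```python
class PID:
    """Minimal discrete PID controller for wheel-speed control."""
    def __init__(self, kp, ki, kd, out_min=-255.0, out_max=255.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = None

    def step(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(self.out_min, min(self.out_max, out))  # clamp to the PWM range

# Toy closed loop: drive an assumed first-order motor model toward 1.0 m/s
pid = PID(kp=2.0, ki=2.0, kd=0.1)
speed = 0.0
for _ in range(500):
    pwm = pid.step(setpoint=1.0, measured=speed, dt=0.02)
    speed += (0.5 * pwm - speed) * 0.02  # motor gain 0.5 is an assumed constant
```

On the real hardware the same loop would read the encoder-derived wheel speed each control interval and write the clamped output to the motor driver's PWM pin; the gains would be tuned empirically during the Month 4 milestone.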
