Autonomous Navigation of Customized ROS based Quadcopter using ORB SLAM with Nvidia Jetson Nano and Stereo Image Processing or LiDAR Waypoint Tracking
Over the last decade, work on Unmanned Aerial Vehicles (UAVs), and particularly Quadcopters, has captured the attention of both academia and industry because of their mechanical and control simplicity, high manoeuvrability and low cost of entry. These features have made UAVs popular aerial robots in a large number of robotics applications such as defence, transportation (package delivery), entertainment (cinematics) and surveillance.
However, safe navigation in unknown environments is a challenging task for most autonomous UAV systems. Most previous works avoid obstacles and perform navigation under the assumption that the environment is static and that the position of the Quadcopter is known within a previously defined map. In recent years, we have seen the development of various vision-based UAVs that can operate autonomously in unknown environments - which is what we aim to achieve in our project.
The main challenge of vision-based autonomous navigation is that the UAV must track a set of waypoints and avoid collisions using only resources of limited capability (the onboard sensors and companion computer) during flight. To achieve this, we generally take the UAV's odometry, estimated from the onboard visual and inertial information, as the feedback for waypoint tracking. We also require visual or ranging data (such as depth images or Light Detection and Ranging (LiDAR) measurements) to perceive the local environment. To avoid collisions, the estimated states of both the UAV and the obstacles are required to generate collision-free trajectories for safe, autonomous flight.
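To make the odometry-as-feedback idea concrete, the sketch below shows a simple proportional waypoint-tracking loop in Python. It is only a minimal illustration of the feedback structure, not our final controller: the helpers `get_odometry_position()` and `send_velocity_command()` are hypothetical placeholders for the onboard state estimator and the flight controller interface, and the gain, speed limit and tolerance values are assumptions chosen for readability.

```python
import math

# Hypothetical helpers: in practice these would be backed by the onboard
# visual-inertial odometry and the flight controller interface.
def get_odometry_position():
    """Return the current (x, y, z) estimate from onboard odometry."""
    raise NotImplementedError

def send_velocity_command(vx, vy, vz):
    """Send a velocity setpoint to the flight controller."""
    raise NotImplementedError

def track_waypoint(waypoint, kp=0.8, tolerance=0.2, max_speed=1.0):
    """Proportional controller: steer toward `waypoint` using the odometry
    estimate as feedback; returns True once within `tolerance` metres."""
    x, y, z = get_odometry_position()
    ex, ey, ez = waypoint[0] - x, waypoint[1] - y, waypoint[2] - z
    dist = math.sqrt(ex**2 + ey**2 + ez**2)
    if dist < tolerance:
        send_velocity_command(0.0, 0.0, 0.0)
        return True  # waypoint reached
    scale = min(kp * dist, max_speed) / dist  # cap commanded speed
    send_velocity_command(ex * scale, ey * scale, ez * scale)
    return False
```

In the real system this loop runs at a fixed rate, and the collision-avoidance layer modifies the commanded velocity whenever the perceived obstacles make the straight-line path unsafe.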
The main purpose of our Project is to develop a customized, ROS-based Quadcopter model that meets our requirements and durability standards, so that it can be used for safe, autonomous navigation in unknown static environments. We will present an on-board, vision-based approach for obstacle avoidance in such environments using visual odometry/SLAM algorithms such as ORB SLAM. LiDAR Waypoint Tracking or Stereo Image Processing techniques shall be used to perceive the environment and provide estimates of the position and size of obstacles. Several Artificial Intelligence (Deep Learning) techniques can be utilized to enhance the accuracy and performance of the image processing and autonomous navigation pipeline. This requires fast processing and computing capabilities that can be provided by an embedded board with an on-board GPU (Graphics Processing Unit), such as Nvidia's Jetson Nano. The Jetson Nano will also enable us to use the ROS framework for controlling, communicating with and interacting with our Autonomous Quadcopter.
The Main Objectives of our Project are as follows:
1. Assemble and develop a suitable customized Quadcopter model capable of Autonomous Navigation and Hardware testing.
2. Be able to control the movement of our Quadcopter both manually and autonomously.
3. Use the ROS framework to interface all on-board sensors, the Flight Controller and the on-board GPU, and communicate flight data and information remotely to a Base Station (PC) via WiFi.
4. Develop an efficient Position Tracking System (using either LiDAR or Stereo Camera) for implementation of visual SLAM (ORB SLAM).
5. Study and implement Simultaneous Localization and Mapping (SLAM) and make the Quadcopter capable of navigating autonomously in unknown static environments.
6. Perform SITL (Software In The Loop) Simulation in order to test Obstacle Avoidance and vision-based Navigation Algorithms.
7. Enhance performance by implementing advanced AI/Deep Learning-based Image Processing Techniques on Nvidia Jetson Nano.
Additional Objectives which we may work towards once the main objectives have been fulfilled:
1. Perform HITL (Hardware In The Loop) Simulation on real Flight Controller Board.
2. Explore the possibility of expanding our Project Scope to incorporate defence or military-based applications.
3. Explore other functionalities of Autonomous Drones (such as positioning using ArUco Markers, QR codes or RFID tags, etc.).
4. Test our Quadcopter with different payloads.
The first stage of our Project involves the actual assembly and setup of the basic Quadcopter hardware required for an autonomous UAV. Once the hardware assembly is completed, we will install the PX4 firmware on our Pixhawk Autopilot Flight Controller. The firmware setup and sensor calibration can be performed using the Ground Control Station (GCS) software QGroundControl. Once these steps are completed, we can prepare our Quadcopter for manual flight via a Radio Transmitter and Receiver.
For the next stage, we will introduce our Companion Computer/GPU - Nvidia's Jetson Nano - into our Quadcopter model. We will install the Ubuntu 18.04 operating system along with the ROS Melodic framework. We will then interface our Flight Controller (Pixhawk Autopilot) with the Jetson Nano via MAVROS (a ROS package that enables communication between the Jetson Nano and the Pixhawk Autopilot over the MAVLink protocol). Once interfacing is successful, we will be able to receive all the information from our Pixhawk Flight Controller, including all the sensor data. We will also install Ubuntu 18.04 and ROS Melodic on our PC, which will act as our Base Station for accessing the flight data received from our Flight Controller. Our Base Station will communicate with our on-board computer, the Jetson Nano, over ROS via a WiFi network.
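As a concrete illustration of this interfacing step, the minimal rospy node below subscribes to a few standard MAVROS topics and logs the flight data they carry. It is a sketch under the usual MAVROS defaults: the topic names (/mavros/state, /mavros/imu/data, /mavros/local_position/pose) assume a standard MAVROS launch, and running it on the Base Station additionally assumes that ROS_MASTER_URI points at the ROS master on the Jetson Nano.

```python
#!/usr/bin/env python
# Minimal rospy node that listens to standard MAVROS topics so the flight
# data streamed from the Pixhawk can be inspected, either on the Jetson Nano
# or on the Base Station PC (with ROS_MASTER_URI set to the Nano's master).
import rospy
from mavros_msgs.msg import State
from sensor_msgs.msg import Imu
from geometry_msgs.msg import PoseStamped

def state_cb(msg):
    rospy.loginfo("FCU connected: %s, mode: %s, armed: %s",
                  msg.connected, msg.mode, msg.armed)

def imu_cb(msg):
    rospy.loginfo_throttle(1.0, "IMU accel z: %.2f m/s^2",
                           msg.linear_acceleration.z)

def pose_cb(msg):
    p = msg.pose.position
    rospy.loginfo_throttle(1.0, "Local position: (%.2f, %.2f, %.2f)",
                           p.x, p.y, p.z)

if __name__ == "__main__":
    rospy.init_node("flight_data_monitor")
    rospy.Subscriber("/mavros/state", State, state_cb)
    rospy.Subscriber("/mavros/imu/data", Imu, imu_cb)
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, pose_cb)
    rospy.spin()
```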
After this, we will perform SITL Simulation using the Gazebo Simulator to test the basic code for our first autonomous test flight. We will find a suitable simulation model similar to our drone which we can use for our simulations; this is more efficient than designing our own model from scratch, which is a tedious and time-consuming task. We will also have to create or find a suitable simulated environment in which we can test our Quadcopter model. Once our code for the basic autonomous test flight works without any obvious errors, we shall test it on our actual hardware and analyse its performance.
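A typical first SITL test of this kind is an offboard takeoff-and-hover script. The sketch below shows one possible version, assuming a PX4 SITL vehicle connected through MAVROS with the standard topic and service names; the 2 m hover altitude and the 20 Hz setpoint rate are arbitrary choices for illustration, not project requirements.

```python
#!/usr/bin/env python
# Sketch of a basic autonomous test flight for SITL: stream position
# setpoints, switch the PX4 SITL vehicle to OFFBOARD mode, arm, and hover
# at 2 m above the start point.
import rospy
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandBool, SetMode

rospy.init_node("sitl_takeoff_test")
setpoint_pub = rospy.Publisher("/mavros/setpoint_position/local",
                               PoseStamped, queue_size=10)
rospy.wait_for_service("/mavros/cmd/arming")
rospy.wait_for_service("/mavros/set_mode")
arm = rospy.ServiceProxy("/mavros/cmd/arming", CommandBool)
set_mode = rospy.ServiceProxy("/mavros/set_mode", SetMode)

target = PoseStamped()
target.pose.position.z = 2.0  # hover altitude in metres

rate = rospy.Rate(20)  # PX4 needs setpoints above 2 Hz before OFFBOARD
for _ in range(100):   # pre-stream setpoints so OFFBOARD is accepted
    setpoint_pub.publish(target)
    rate.sleep()

set_mode(custom_mode="OFFBOARD")
arm(value=True)

while not rospy.is_shutdown():
    setpoint_pub.publish(target)
    rate.sleep()
```

The same script can later be reused on the real hardware, which is one of the main reasons for testing it in SITL first.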
Once this is completed, we shall work on implementing ORB SLAM. Before going forward, we will have to decide on a suitable Position Tracking System using either LiDAR or Stereo Cameras; we will choose the option that proves most efficient in terms of performance and cost. We will set up our computer vision sensors on our Quadcopter model and interface them with the Jetson Nano. Following this, we will need to find suitable image processing algorithms to pre-process the LiDAR/camera outputs before they can be used for navigation. We will then move to the implementation of ORB SLAM by installing the required packages and studying how this technique is used for tracking, mapping, relocalization and loop closing. We will first attempt this task in simulation and, once successful, test it on our hardware system.
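One concrete piece of this integration is feeding the pose estimated by ORB SLAM back to the flight controller as external vision odometry. The sketch below relays a geometry_msgs/PoseStamped from an assumed ORB SLAM output topic (/orb_slam2/pose - the actual topic name depends on the ROS wrapper used) to MAVROS's /mavros/vision_pose/pose; it also assumes the camera-to-body and ENU frame alignment has already been handled elsewhere.

```python
#!/usr/bin/env python
# Sketch: forward the pose estimated by an ORB SLAM ROS node to MAVROS so
# PX4 can fuse it as external vision odometry. The input topic name is an
# assumption, and frame alignment is assumed to be done upstream.
import rospy
from geometry_msgs.msg import PoseStamped

class SlamPoseRelay(object):
    def __init__(self):
        self.pub = rospy.Publisher("/mavros/vision_pose/pose",
                                   PoseStamped, queue_size=10)
        rospy.Subscriber("/orb_slam2/pose", PoseStamped, self.relay)

    def relay(self, msg):
        msg.header.stamp = rospy.Time.now()  # restamp for the FCU estimator
        self.pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("orb_slam_pose_relay")
    SlamPoseRelay()
    rospy.spin()
```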
Our Project can provide a number of benefits to us as students, to our country and to society in general, which are explained under the following headings:
Benefits to us as Students:
Benefits to our Country:
General Benefits of Autonomous Quadcopters and other UAVs:
Our Final Project Deliverable will be a hardware-cum-software system along with a complete Manual/Guide explaining how to operate the system. We will try to make a working Quadcopter that can be configured to fly both manually (via Radio Transmitter and Receiver) and autonomously (by running a Python script). We will monitor the Quadcopter and obtain the sensor data on our Base Station (PC). We will also try to find a suitable arena where we can test our Quadcopter's navigation capability by implementing SLAM.
As for the Software, we will present a working simulation of our Quadcopter performing autonomous navigation in various environments. The simulation will be carried out in Gazebo.
We will also provide a Manual/Guide book (in softcopy or hardcopy format) with clear instructions on how to setup, monitor and operate the Quadcopter.
| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| ZED 2 Stereo Camera | Equipment | 1 | 70000 | 70000 |
| Shipping Charges | Miscellaneous | 1 | 10000 | 10000 |
| Total (in Rs) | | | | 80000 |