Adil Khan 9 months ago
AdiKhanOfficial #FYP Ideas

Smart Robot Mapping and Path Planning using Augmented Reality


Project Title

Smart Robot Mapping and Path Planning using Augmented Reality

Project Area of Specialization

Augmented and Virtual Reality

Project Summary

In this age of robotics and automation, revolutionary progress has come from incorporating robotics into industry. In advanced countries, applications of this technology are being paired with other state-of-the-art innovations such as Augmented Reality (AR) and Virtual Reality (VR). Augmented Reality, or AR, is the enhancement of real-world environments through computer-generated perceptual information. AR turns a physical object into a smart object that communicates and interacts with a user. Human–robot interaction in industrial robotics has largely been confined to finding better ways to re-configure or program robots. The first major industrial application of AR was Boeing's AR-assisted wire bundle assembly.

Pakistan, being a developing country, is striving hard to keep up with developed countries. Augmented Reality is still a new concept in Pakistan, but it is picking up as major telecom and media companies have started to introduce it to their consumers in entertaining and user-friendly ways.

Our project aims to give the user full ease in controlling robots through augmented reality, to provide production- and process-related information, and to enhance the operator's immersion in the safety mechanisms dictated by the collaborative workspace. Moreover, the technique of SLAM (Simultaneous Localization and Mapping), when incorporated with AR, can be used for robot navigation; this combination is known as V-SLAM, i.e. Visual SLAM. We aim to commercialize our application to support the modern needs of Industry 4.0.

Project Objectives

The primary objective of our project is to create an immersive interaction between robot and operator through Augmented Reality. Different pre-programmed robotic tasks will be executed by interacting with the robot through a hand-held tablet or AR-supported glasses.

The second objective of the project is flexible and easy robot control and programming for anyone, without the need to know programming languages or have extensive skills in robotics. The operator will be able to develop a specific robot path plan by interacting with the end effector or user-interface elements without any physical link to the robot.

The third objective of our project is to implement V-SLAM using AR and to implement robot navigation inside the mapped environment. The AR device will map the environment and localize itself inside it. It will then scan for the robot's location inside the environment. The operator will carry the AR device anywhere inside the environment and instruct the robot to navigate there. The AR device will compute an obstacle-free, optimized path for the robot to move from its current location to the final goal.
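The obstacle-free path computation described above can be sketched as a grid-based A* search. This is a minimal illustration, not the project's actual planner: the occupancy grid, start, and goal below are made-up values, and a real system would build the grid from the V-SLAM map.

```python
import heapq

def a_star(grid, start, goal):
    """Shortest 4-connected path on a 2D grid; grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]        # priority queue of (f = g + h, cell)
    came_from = {start: None}
    g = {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:            # reconstruct path by walking parents back
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if (nr, nc) not in g or ng < g[(nr, nc)]:
                    g[(nr, nc)] = ng
                    h = abs(nr - goal[0]) + abs(nc - goal[1])  # Manhattan heuristic
                    heapq.heappush(frontier, (ng + h, (nr, nc)))
                    came_from[(nr, nc)] = cur
    return None                    # goal unreachable

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = a_star(grid, (0, 0), (4, 3))
```

With a consistent heuristic such as Manhattan distance on a unit-cost grid, the first path A* returns is optimal.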

Project Implementation Method

Implementation of our project involves four major parts:

  • Physical robot
  • AR-compatible device
  • Interface between robot and AR using Unity
  • V-SLAM

A detailed description is provided below.

  1. We will use a four-wheeled, car-structured robot with sensors all around to detect the right path and avoid obstacles.
  2. An AR-compatible device (preferably AR glasses) will be used to interact with the robot. The same device will map the environment and localize itself inside it.
  3. For the interface between the robot and the AR device we will use the Unity® game engine. This will be accomplished in these steps:
     a. A 3D model of the robot will be created in SolidWorks® to match the exact physical dimensions.
     b. The 3D model created in SolidWorks® will be imported into Unity.
     c. The AR glasses will be interfaced with Unity, incorporating all sensors.
     d. The robot model will be superimposed on the physical robot by scanning its features.
     e. The robot will be controlled through simple touch on the AR device, or through hand-movement sensing in the case of AR glasses.
     f. The inverse kinematics of the robot will be calculated in Unity and transferred to the physical robot, which will follow the same movement.
     g. The path planned for a particular task will be stored and can be replayed multiple times in the future.
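The inverse-kinematics step above can be illustrated for a simple two-link planar arm, for which a closed-form solution exists. This is only a sketch: the link lengths and target point are hypothetical, and the project's actual robot would need its own kinematic model.

```python
import math

def ik_two_link(x, y, l1, l2):
    """Closed-form inverse kinematics for a two-link planar arm.

    Returns (theta1, theta2) in radians for one of the two
    elbow configurations, or raises if the target is unreachable."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def fk_two_link(theta1, theta2, l1, l2):
    """Forward kinematics: end-effector position from joint angles."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Round trip: solve for a target, then confirm FK reproduces it.
t1, t2 = ik_two_link(1.2, 0.5, 1.0, 1.0)
x, y = fk_two_link(t1, t2, 1.0, 1.0)
```

In Unity the same joint angles would drive both the virtual model and, over the robot link, the physical manipulator.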

In the case of V-SLAM, the AR device will scan the environment through depth sensing and plane detection in Unity. Once the complete map is scanned and the device is localized, the AR device can tag any location in the environment using Unity's ray-casting techniques. The trajectory from the current location to the tagged location will be calculated in Unity and transferred to the robot to follow and reach the goal.
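The ray-casting step reduces, geometrically, to intersecting a ray from the device camera with the detected floor plane. The sketch below shows only that math with made-up vectors; in Unity this would instead go through the engine's ray-cast facilities (e.g. Physics.Raycast) against the scanned mesh.

```python
def ray_plane_intersect(origin, direction, plane_point, plane_normal):
    """Return the point where a ray hits a plane, or None if the ray is
    parallel to the plane or the hit lies behind the ray origin."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:
        return None                 # ray parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / dot
    if t < 0:
        return None                 # intersection behind the camera
    return tuple(o + t * d for o, d in zip(origin, direction))

# Camera 1.5 m above the floor, looking down at 45 degrees;
# the floor is the y = 0 plane with an upward normal.
goal = ray_plane_intersect(origin=(0.0, 1.5, 0.0),
                           direction=(0.0, -0.7071, 0.7071),
                           plane_point=(0.0, 0.0, 0.0),
                           plane_normal=(0.0, 1.0, 0.0))
```

The returned point is the tagged floor location that would then be handed to the path planner as the navigation goal.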

Benefits of the Project

  1. Ease and flexibility in programming a robot is the major benefit of our project. Robotic manipulators have been used in industry for years. Once programmed for a specific task, they continue the same operation, but re-programming a robot for a new or updated task is not an easy job; specialized experts in robotics and robot programming are hired for these updates. Using our proposed project, any operator will be able to interact with the robot in a much easier way, and re-configuring a task will be easier and more time-saving.
  2. Safe human–robot collaboration is vitally important in industrial robotic applications. Our project brings this feature to our industry, as there is no physical interaction between robot and human. This feature can be further enhanced by incorporating more sensors and intelligence into the robotic systems.
  3. Our project provides an easy way to test complex dynamic scenarios before implementing them on the real system. This will help reduce testing time. Moreover, the system can be used to train staff to operate the actual system.

Our project also has benefits for classroom and lab teaching of robotics. Teaching robotics is a challenge in many universities due to the mathematical concepts and visualization involved. With the aid of 3D augmented content, visualization and understanding of complex robotics physics can be improved to a large extent. For example, the fundamental concepts of forward and inverse kinematics become much easier to understand when visualized as a 3D model in the real world.
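As a classroom illustration, forward kinematics for a planar arm reduces to accumulating joint angles and summing link vectors; the joint positions computed below are exactly the points an AR overlay would render at each link. The link lengths and angles are made up for the example.

```python
import math

def link_positions(lengths, angles):
    """Forward kinematics for an n-link planar arm: return the (x, y)
    position of the base, each joint, and the end effector."""
    points = [(0.0, 0.0)]          # base at the origin
    x = y = total_angle = 0.0
    for l, a in zip(lengths, angles):
        total_angle += a           # each angle is relative to the previous link
        x += l * math.cos(total_angle)
        y += l * math.sin(total_angle)
        points.append((x, y))
    return points

# Two 1 m links, both joints bent 90 degrees:
# base (0, 0) -> first joint (0, 1) -> end effector (-1, 1)
pts = link_positions([1.0, 1.0], [math.pi / 2, math.pi / 2])
```

Rendering these points in 3D alongside the physical arm is what makes the kinematics tangible to students.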

Technical Details of Final Deliverable

The final deliverable of our project is a fully featured smart application for interacting with a robot via Augmented Reality. Its features are detailed here:

  • Path planning of Robotic Manipulator

The application will be able to identify the target robot in the workspace. Interactable user-interface (UI) elements will be augmented on the physical robot. The operator will interact with the UI elements to move the links, and the physical robot will respond. The application will be able to store the planned path and repeat it afterwards. This deliverable has two modes:

  1. Real-Time/Online Programming:

In this mode, the user will interact with the robot through the AR device and the robot will respond in real time. There will be multiple UI options for interaction. One simple option is a slider bar on each link that can be used to rotate that link as required. There will also be simple buttons on the end effector to open or close it, and a button to move the manipulator back to its default home position.
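The slider interaction described above amounts to mapping a normalized slider value onto a joint's angle range and packaging the result as a command for the robot. The joint limits and the command format below are assumptions made purely for illustration.

```python
def slider_to_angle(slider_value, min_deg, max_deg):
    """Map a slider position in [0, 1] to a joint angle in degrees,
    clamping out-of-range input."""
    s = max(0.0, min(1.0, slider_value))
    return min_deg + s * (max_deg - min_deg)

def make_joint_command(joint_id, slider_value, limits):
    """Build a command message for the physical robot (hypothetical format)."""
    angle = slider_to_angle(slider_value, *limits[joint_id])
    return {"joint": joint_id, "angle_deg": round(angle, 2)}

limits = {0: (-90.0, 90.0), 1: (0.0, 135.0)}   # assumed per-joint limits
cmd = make_joint_command(0, 0.75, limits)       # {'joint': 0, 'angle_deg': 45.0}
```

Clamping at the UI layer ensures a dragged slider can never command an angle outside the joint's mechanical range.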

  2. Off-line Programming:

In this mode the user will interact with the robot in the same way as in the previous mode, but here the real-world robot will not follow along in real time. Instead, the planned actions will be saved, and the user can test them by running them as many times as required. Once finalized, the user will transfer the program to the actual robot, which will perform the same actions. Another option will be to generate a program for the action in the robot's own format; that code file can then be loaded onto the robot, which will perform the task.
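Off-line programming as described can be sketched as recording joint-space waypoints, replaying them on demand, and exporting a robot-specific program. This is a minimal sketch: the class, the `MOVEJ` text format, and the program name are all invented for illustration, not an actual robot controller language.

```python
class OfflineProgram:
    """Record joint-space waypoints offline, then replay or export them."""

    def __init__(self):
        self.waypoints = []

    def record(self, joint_angles):
        """Store one waypoint (a list of joint angles in degrees)."""
        self.waypoints.append(list(joint_angles))

    def replay(self, send):
        """Stream recorded waypoints to the robot via a caller-supplied callback."""
        for wp in self.waypoints:
            send(wp)

    def export(self):
        """Serialize to a hypothetical robot-specific text program."""
        lines = ["PROGRAM pick_and_place"]
        for wp in self.waypoints:
            lines.append("MOVEJ " + " ".join(f"{a:.1f}" for a in wp))
        lines.append("END")
        return "\n".join(lines)

prog = OfflineProgram()
prog.record([0.0, 45.0, -30.0])
prog.record([10.0, 60.0, -15.0])

sent = []
prog.replay(sent.append)   # "dry run": capture what would be sent to the robot
code = prog.export()       # text program to load onto the robot
```

Passing the transport as a callback lets the same replay logic drive either the Unity simulation or the physical robot.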

  • V-SLAM

A mesh of the environment will be created so that the machine can tell not only where the floor is but can also identify walls and objects in the environment, allowing everything around the user to become an interactable element. The application will identify the robot inside the environment, and the robot can then be navigated through the environment via the AR device.

Final Deliverable of the Project

HW/SW integrated system

Type of Industry

Education, IT, Others

Technologies

Augmented & Virtual Reality

Sustainable Development Goals

Decent Work and Economic Growth, Industry, Innovation and Infrastructure

Required Resources

Item Name              Type           No. of Units  Per Unit Cost (Rs)  Total (Rs)
AR glasses             Equipment      1             70,000              70,000
Tool set               Miscellaneous  1             5,000               5,000
Stationery & printing  Miscellaneous  1             1,000               1,000
Shipping charges       Miscellaneous  1             4,000               4,000
Total (Rs)                                                              80,000
If you need this project, please contact me on contact@adikhanofficial.com