Autonomous vehicle for vineyards
2025-06-28 16:30:34 - Adil Khan
Project Area of Specialization: Mechatronics Engineering

Project Summary
The goal of the project is to navigate autonomously and spray insecticides/pesticides economically in a constrained environment; it is intended to give local farmers a cheap alternative to other, expensive spraying methods. In this case, the constrained environment is a vineyard, and the plan is to use a simple and robust visual teach-and-replay navigation system in which a vehicle autonomously repeats paths previously taught by a human operator. Mapping and navigation are expected to be done with a single camera, as the algorithm is centered around monocular vision. Automating a vehicle is also part of the deliverables: an All-Terrain Vehicle (ATV) will be automated by creating a mechanism for steering control and remotely controlling it.
Project Objectives
Autonomous robots have the potential to improve on-farm productivity and field management. Different robots can facilitate cultivation, weeding, and other farming operations. To accomplish such tasks, a robot must be able to navigate through the crop autonomously and repeatedly.
“The goal of the project is to autonomously navigate and spray insecticides/pesticides in a constrained environment economically.”
The possible solutions are discussed in descending order of cost:
- A popular solution for autonomous navigation is a high-precision dual-frequency RTK GNSS receiver acting as a guide for pre-programmed robot paths. The high cost of these systems, together with signal outages, led us to solutions based on observations from cheaper on-board sensors. More feasible solutions typically use readings from a laser scanner or a camera to navigate and localize the robot with the help of maps.
- Use of monocular vision and an IMU for bearing-only navigation, which keeps both sensor cost and computational complexity minimal.
Choice of Algorithm
BearNav is an open-source teach-and-repeat visual navigation system based on ROS and C++. Its basic method is computationally efficient, does not need camera calibration, and can learn and autonomously traverse arbitrarily shaped paths. During the teaching phase, while the robot is driven by an operator, it records its velocities and the image features visible through its onboard camera. While navigating autonomously, the method avoids explicit robot localization in 2D or 3D space; instead, it replays the velocities stored during the teaching phase while correcting its heading relative to the path based on its camera data.
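The replay logic described above can be sketched in a few lines. This is an illustrative Python sketch, not BearNav's actual C++ implementation; the correction gain and the sign convention are assumptions that would need tuning on the real platform.

```python
from dataclasses import dataclass

@dataclass
class TeachSample:
    t: float          # time since path start (s)
    v: float          # forward velocity recorded while teaching (m/s)
    omega: float      # angular velocity recorded while teaching (rad/s)
    features: list    # horizontal pixel positions of tracked features

def heading_correction(taught, current, gain=0.002):
    """Steer proportionally to the median horizontal shift between
    features seen now and features stored during teaching."""
    shifts = sorted(c - t for t, c in zip(taught, current))
    median = shifts[len(shifts) // 2]
    # Positive shift = scene moved right in the image = robot drifted
    # left of the taught path, so steer right (negative omega here).
    return -gain * median

def replay_step(sample: TeachSample, current_features):
    """Reproduce the taught velocities, correcting heading from camera."""
    omega = sample.omega + heading_correction(sample.features, current_features)
    return sample.v, omega
```

Note that no metric pose is ever computed: the robot only needs the image-space offset between what it sees and what it saw during teaching.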
Choice of Platform
The choice of platforms is:
- A remote-controlled ATV
- ATV Quad bike
An ATV is a vehicle with low-pressure tires, a seat for the operator, and handlebars for steering control. It is designed to handle a wider variety of terrain than most other vehicles in its class. Its four wheels give extra stability at slower speeds, which makes it ideal for uneven-terrain operations such as, in our case, a vineyard.
For the testing phase of the project, a remote-controlled ATV is used for data collection. Once the results are satisfactory and a working platform is established, the setup will be moved to an ATV quad bike, whose controls, such as steering, will then be automated using linear actuators.
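As a rough illustration of how handlebar steering could be driven by a linear actuator, the sketch below maps a desired steering angle to an actuator extension and computes a simple proportional command. All constants (stroke length, mm-per-degree calibration, gain) are hypothetical placeholders that would have to be measured on the actual linkage.

```python
def angle_to_extension(angle_deg, center_mm=50.0, mm_per_deg=1.2,
                       min_mm=0.0, max_mm=100.0):
    """Convert a steering angle to a linear-actuator extension.
    center_mm and mm_per_deg are hypothetical calibration constants
    for the handlebar linkage; the result is clamped to the stroke."""
    target = center_mm + mm_per_deg * angle_deg
    return max(min_mm, min(max_mm, target))

def actuator_command(target_mm, measured_mm, kp=0.08, max_cmd=1.0):
    """Proportional velocity command for the actuator, clamped to
    the normalized range [-1, 1] expected by a typical motor driver."""
    cmd = kp * (target_mm - measured_mm)
    return max(-max_cmd, min(max_cmd, cmd))
```

In practice the measured extension would come from the actuator's feedback potentiometer, and the command would be sent as a PWM duty cycle.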
Benefits of the Project
- Economical solution.
- No poverty.
- Higher productivity output of grapes per acre of land.
- Accessibility to working conditions unsuitable for humans.
- Fully autonomous system.
Technical Details of Final Deliverable
Our hypothesis is an economical solution. The related methods are accurate, but their overall cost makes them hard to market because of the platforms they require. The table shows the uniqueness of our proposed system and how our approach is more economical.
Table of comparison with related work

| Project | GRAPE | Row-crop Navigator | UAV Quadcopter | Proposed System |
|---|---|---|---|---|
| Sensor Setup | 2 cameras, 2D and 3D laser sensors, GPS and IMU | 2 cameras, IMU | 2 cameras, quadcopter gyroscope | 1 camera, IMU |
| Experimental Platform | Quad robot | Quad robot | Quadcopter | Quad robot |
| Algorithm | ROAMFREE, based on SLAM | Image-based visual servoing controller | FPGA-SURF, which is teach and repeat | Based on BearNav |
The motivation is to create a system able to navigate a mobile robot in an environment of any size with great reliability. For this, the navigation system should have the following properties:
- simplicity: complex systems are more likely to contain errors, hence the method should be as simple as possible,
- swiftness: real-time constraints should be satisfied,
- standardless: it should use commonly available equipment, making it easy to use and apply,
- stability: the position uncertainty should converge over time,
- scalability: the environment size should not affect its computational complexity.
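The scalability property can be seen in the heading-correction step itself: BearNav-style histogram voting over feature displacements takes time linear in the number of features per image, independent of path length or environment size. The sketch below is an illustrative Python version; the bin width is an assumed parameter.

```python
def mode_shift(taught_x, current_x, bin_px=5):
    """Estimate the dominant horizontal image shift by voting the
    per-feature displacements into coarse bins and taking the mode.
    O(n) in the number of matched features, so the cost per image
    does not grow with path length or environment size."""
    votes = {}
    for t, c in zip(taught_x, current_x):
        b = round((c - t) / bin_px)       # quantize the displacement
        votes[b] = votes.get(b, 0) + 1
    best_bin = max(votes, key=votes.get)  # outlier matches lose the vote
    return best_bin * bin_px
```

Voting for the mode rather than averaging makes the estimate robust to a minority of mismatched features.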
The researcher Tomas Krajnik verified the teach-and-repeat method on a mobile robot: a P3AT robot with a Unibrain Fire-i601c camera, a TCM2 compass, and an HP 8710p laptop (Core 2 Duo CPU at 2.00 GHz, 1 GB of memory). An additional UPC70 battery was used to support the heavy image-processing computation. The software platform is a standalone Linux application written in C/C++.
Final Deliverable of the Project: HW/SW integrated system
Core Industry: Agriculture
Other Industries: Food, Health
Core Technology: Robotics
Other Technologies: Artificial Intelligence (AI)
Sustainable Development Goals: No Poverty, Zero Hunger, Good Health and Well-Being for People

Required Resources

| Elapsed time (days, weeks, months, or quarters) since start of the project | Milestone | Deliverable |
|---|---|---|
| Month 1 | System setup | Setting up Linux, ROS, and the Jetson Nano. |
| Month 2 | Setting up BearNav | Feature extraction algorithm. Functioning Mapper node on ROS. Functioning Navigator node on ROS. |
| Month 3 | Literature review | Literature review |
| Month 4 | Making the vehicle autonomous | Hardware setup. Designing steering control. |
| Month 5 | Implementation of Algorithm | Functioning drive control node on ROS |
| Month 6 | Data collection | Trials in a real vineyard. |
| Month 7 | Design and implementation of sprayer | Spraying Mechanism |
| Month 8 | Integration of algorithm and complete hardware setup | Complete setup |
| Month 9 | Final report | Final Report |