Vision Based Cotton Picking Robot

2025-06-28 16:36:38 - Adil Khan

Project Title

Vision Based Cotton Picking Robot

Project Area of Specialization

Electrical/Electronic Engineering

Project Summary

Cotton crop harvesting is one of the most laborious jobs in the agricultural sector of Pakistan. Tough weather conditions and rising labourers' wages are some of the major issues faced by cotton cultivators. To address these issues, we propose a vision-based cotton picking robot (VBCPR) that is capable of manoeuvring in a cotton field, detecting cotton bolls, calculating their coordinates, picking them up using a robotic arm, and storing them in an on-board storage compartment.

Image processing comprises the detection and localization of cotton bolls. We start by collecting images of cotton bolls at various angles and in various lighting conditions, shapes, and sizes. This is done both by physically capturing photos and by collecting them from the internet. Once our model has been trained to the desired accuracy, we move to 'active image processing', which works on real-time data, rather than 'passive image processing', which works on non-real-time images.

After successful detection, localization is considered. Here a closed feedback system is employed. The camera (attached to the end-effector of the robotic arm itself) starts by providing an initial position of the cotton boll to the Raspberry Pi. The Raspberry Pi then calculates the difference in x, y, and z coordinates between the current position of the camera and the position of the cotton boll. Based on this difference, it generates instructions for the robotic arm motors. When the difference reaches zero, the tip of the robotic arm has reached the cotton boll.
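The feedback loop described above can be sketched in Python as a simple proportional step toward the target. This is only a simulation sketch: the positions are plain numbers here, whereas on the robot the error vector would be converted into motor commands through the inverse kinematics model.

```python
# Minimal sketch of the closed-loop localization scheme: step the
# end-effector along the (x, y, z) error vector until the difference
# between camera position and boll position is approximately zero.

def servo_to_boll(camera_pos, boll_pos, gain=0.5, tol=0.005, max_steps=100):
    """Move a simulated end-effector toward the boll coordinates."""
    pos = list(camera_pos)
    for _ in range(max_steps):
        error = [b - p for b, p in zip(boll_pos, pos)]
        if max(abs(e) for e in error) < tol:   # difference ~ zero: arrived
            return pos
        # On the real robot this step would become motor instructions;
        # here we just move the simulated position proportionally.
        pos = [p + gain * e for p, e in zip(pos, error)]
    return pos
```

With a proportional gain of 0.5 the error roughly halves each iteration, so the loop converges in a handful of steps for nearby targets.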

Once the robotic arm has reached the cotton boll, the next step is to turn on the vacuum mechanism that is built into the robot. The vacuum pipe of this pump runs along the length of the robotic arm, and its nozzle is attached at the same place as the camera, i.e. the end-effector. The pressure difference created by the vacuum mechanism sucks the cotton boll into the nozzle, from where it is transferred to the storage compartment inside the robot.

The movement of the VBCPR is controlled by instructions from a mobile application. We use a Bluetooth module along with a PIC microcontroller to receive serial data from the app, decode it, and then generate instructions to control the motors attached to the tyres of the robot, making it capable of moving forward, backward, left, and right.
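The actual decoding runs as C firmware on the PIC; purely as an illustration of the idea, the command-to-motor mapping might look like the following Python sketch. The single-character protocol and the differential-drive convention below are assumptions, not the project's actual protocol.

```python
# Illustrative command decoding for the app-controlled drive. Each side's
# direction is +1 (forward), -1 (reverse), or 0 (stop). The one-byte
# command set is a hypothetical example.

COMMANDS = {
    b"F": (+1, +1),   # both sides forward
    b"B": (-1, -1),   # both sides backward
    b"L": (-1, +1),   # left side back, right side forward: turn left
    b"R": (+1, -1),   # turn right
    b"S": (0, 0),     # stop
}

def decode(byte):
    """Map one serial byte from the Bluetooth module to (left, right)
    motor directions; unknown bytes stop the robot as a safe default."""
    return COMMANDS.get(byte, (0, 0))
```

Stopping on unrecognised bytes is a deliberate safe default for a serial link that can drop or corrupt characters.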

We are designing the complete hardware, as well as the software, for this project; the individual tasks are listed under the implementation method below.

Project Objectives

AIM:

Aims of this project are

OBJECTIVES

Project Implementation Method
Task: Complete mechanical design of robot body
Requirement: Hold all components in place and give shape to the robot

Task: Tyre motor interfacing with PIC microcontroller
Requirement: Manoeuvre the robot freely in the cotton field

Task: Bluetooth module (HC-05) interface with PIC and mobile app
Requirement: Receive wireless data from the mobile app and send it serially to the PIC for controlling the tyre motors

Task: Interfacing camera with Raspberry Pi
Requirement: The camera provides real-time image data to the Raspberry Pi, which is used for detection and localization of cotton bolls

Task: Interfacing robotic arm motors with Raspberry Pi
Requirement: Adjust the angles of the robotic arm motors according to the coordinates of the cotton bolls and reach them

Task: Interfacing Raspberry Pi with vacuum mechanism
Requirement: A simple relay connection to turn the vacuum mechanism on or off

Task: Python coding
Requirement: Code for image processing, the robotic arm, and the vacuum mechanism, and the inclusion of all required libraries on the Raspberry Pi

Task: C coding
Requirement: Code for the tyre motor interface, and to receive and decode serial data from the Bluetooth module on the PIC microcontroller
Benefits of the Project

This project will result in a full product (i.e. a vision-based cotton picking robot) to be used in the market.

The above-mentioned points are the direct benefits to be gained from this robot, but the project can also benefit us in other ways, such as:

Technical Details of Final Deliverable

This project comprises three parts: cotton detection and localization, the robotic arm, and the wheel mechanism.

Cotton detection:

We are using computer vision for the detection of cotton bolls, coded in Python using various OpenCV functions. The robot is able to detect approximately all the cotton bolls in an image, but we ignore small ones and take the bolls at the front of the field. We are collecting a dataset for shape-based detection.

Coordinate detection:

After detection, we find the coordinates of the cotton boll. This is done using a feedback mechanism. The distance of the cotton boll from the camera is estimated from its size in pixels: as the camera moves nearer to the boll, the boll occupies more pixels in the image.
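This pixel-size cue follows from the pinhole camera model: an object of real width W appears w pixels wide at distance Z = f·W / w, where f is the focal length in pixels. The sketch below uses illustrative, uncalibrated values for f and the boll width, not the project's measured ones.

```python
# Distance estimate from apparent pixel size (pinhole model).
# focal_px and boll_width_m are illustrative values; on the real
# robot they would come from camera calibration and measurement.

def distance_from_pixels(boll_width_px, focal_px=1300.0, boll_width_m=0.04):
    """Z = f * W / w: the boll looks wider (more pixels) as the
    camera approaches, so distance falls as pixel width grows."""
    return focal_px * boll_width_m / boll_width_px
```

For example, with these placeholder values a boll 52 pixels wide would be estimated at 1 m, and twice the pixel width means half the distance.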

Robotic arm:

This robot has a 3-DOF (degree of freedom) arm made of 4 motors and 3 joints. The coordinate detector gives the arm the boll's coordinates, and the arm sets its motor angles accordingly. We have already built an inverse kinematics model that converts the coordinates to motor angles.
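A common closed-form solution for such a 3-DOF arm is a base yaw plus a two-link planar arm solved with the law of cosines; the sketch below assumes that geometry and illustrative link lengths, and is not necessarily the exact model used in the project.

```python
# Sketch of inverse kinematics for a 3-DOF arm: base rotation plus a
# two-link planar arm in the vertical plane. Link lengths l1, l2 are
# illustrative values.
import math

def inverse_kinematics(x, y, z, l1=0.20, l2=0.20):
    """Return (base, shoulder, elbow) angles in radians placing the
    arm tip at (x, y, z)."""
    base = math.atan2(y, x)          # rotate base toward the target
    r = math.hypot(x, y)             # horizontal reach toward target
    d2 = r * r + z * z               # squared distance to target
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(z, r) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return base, shoulder, elbow
```

The returned angles can be checked by forward kinematics: l1·cos(shoulder) + l2·cos(shoulder + elbow) should recover the horizontal reach r, and the sine terms the height z.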

Vacuum mechanism: after reaching the desired location, the controller turns on the vacuum mechanism, which sucks the cotton boll from the plant and stores it in a bucket.
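Since the vacuum is switched through a relay on a Raspberry Pi pin, the pick step reduces to an on/wait/off sequence. The sketch below stubs out the GPIO write so the sequencing can be shown without hardware; the pin number and timing are hypothetical.

```python
# Sketch of the relay-switched vacuum step. set_relay stands in for a
# Raspberry Pi GPIO write driving the relay; log records the switching
# sequence so it can be inspected without hardware.

def set_relay(pin, on, log):
    log.append((pin, on))            # stand-in for a real GPIO write

def pick_boll(relay_pin, suction_steps, log):
    """Turn the vacuum on, hold long enough for the boll to travel
    down the pipe to the storage bucket, then turn it off."""
    set_relay(relay_pin, True, log)
    for _ in range(suction_steps):
        pass                         # placeholder for the suction delay
    set_relay(relay_pin, False, log)
```

On the actual robot the stub would be replaced by a GPIO library call and the delay by a tuned suction time.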

Moving mechanism: 

This robot has its own moving mechanism, so it can move inside a field. But it is not fully automatic; its movement is controlled from a phone using an app. We did this to keep the robot moving in a straight line. There are other methods, such as using sensors, but because the cotton plants are not densely planted, sensor-based guidance created problems and made the project much more complicated. So, to simplify it, we control the movement through the app.

All of the above mechanisms are interconnected and depend on each other. For example, the base of the robot depends on the size of all the components and the width of the rows in the field, and the motors and battery of the moving mechanism depend on the total load (the weight of the robot). We are using a Raspberry Pi and a PIC microcontroller to control all the mechanisms of the robot.

Final Deliverable of the Project
HW/SW integrated system

Core Industry
Agriculture

Other Industries
Education, Manufacturing, Others, Telecommunication

Core Technology
Robotics

Other Technologies
Artificial Intelligence (AI), Others, Big Data

Sustainable Development Goals
Decent Work and Economic Growth; Industry, Innovation and Infrastructure

Required Resources
Elapsed time since start of project | Milestone | Deliverable

Month 1 | Basic search on motors, microcontroller, camera, Bluetooth module | Motor specifications for robotic arm and tyres selected; Bluetooth module HC-05 chosen; 8 MP Raspberry Pi v2 Sony IMX219 camera selected
Month 2 | Estimating the amount of processing power required and selecting microcontroller(s) | Raspberry Pi 4 selected for image processing and all other tasks except tyre movement; PIC used for tyres
Month 3 | Basic search on robotic arm models and algorithms | Understanding of both forward and inverse kinematics models for the robotic arm
Month 4 | Basic search on robotic arm and image processing models and algorithms | Implementation of inverse kinematics in Python; basic search for image processing started
Month 5 | Dataset collection for machine learning model for cotton detection | Images collected at various angles, lighting conditions, shapes, and sizes, as well as from online resources
Month 6 | Model training with collected data, and testing | Model trained with 80 percent of the collected data; tested with the remaining 20 percent of images; satisfactory results
Month 7 | Detection on the basis of colour and shape | Colour-based detection was done first; results were very good but not in real time. Shape detection was then employed; harder to implement, but it gave better real-time results
Month 8 | Search for a method to control tyre movement | A fully autonomous system was initially proposed, but due to the uneven conditions of a physical field this was not feasible; the mobile app alternative was chosen
Month 9 | Completion of robotic base, and complete implementation of movement mechanism | The mechanical structure for the base was finalized; motors, tyres, PIC microcontroller, and Bluetooth module all work together to move the robot forward, backward, left, and right
Month 10 | Completing mechanical structure of robotic arm, and interface with Raspberry Pi | The mechanical structure (and motors for each joint) finalized; the mathematical model of the robotic arm (Python code) connected to the hardware through the Raspberry Pi
Month 11 | Cotton boll localization with image processing, robotic arm adjustments | Coordinates of the cotton boll calculated and fed to the robotic arm model to reach them; final testing of overall components also started
Month 12 | Final testing and documentation | Minor adjustments to battery capacity selection and buck-boost converters for individual components; final report write-up
