
Adaptive Swarm Robotics Could Revolutionize Smart Agriculture

Steve Kuhlmann | Texas A&M University Engineering

The use of adaptive swarm robotics has the potential to provide significant environmental and economic benefits to smart agriculture efforts globally through the implementation of autonomous ground and aerial technologies. 

"Agricultural robots, when used properly, can improve product quantity and quality while lowering the cost," said Dr. Kiju Lee, associate professor and Charlotte & Walter Buchanan Faculty Fellow in the Department of Engineering Technology and Industrial Distribution and the J. Mike Walker '66 Department of Mechanical Engineering at Texas A&M University. 

The project is led jointly by Lee, Dr. Muthukumar Bagavathiannan in the Texas A&M Department of Soil and Crop Sciences and Dr. Juan Landivar in the AgriLife Research and Extension Center at Texas A&M University-Corpus Christi. The research was recently funded by the United States Department of Agriculture's National Institute of Food and Agriculture through the National Robotics Initiative 3.0 program. 

The multidisciplinary group — composed of members from several Texas A&M University System departments, institutions and agencies — is working to establish a configurable, adaptive and scalable swarm (CASS) system consisting of unmanned ground and aerial robots designed to assist in collaborative smart agriculture tasks. 

"We will develop the technical and theoretical groundwork for the deployable, scalable swarm system consisting of a physical robotic swarm of both ground and aerial robots, a digital twin simulator for low- and high-fidelity simulations, and an easy-to-use user interface for farmers to put this CASS system into use," Lee said. 
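Lee's description pairs the physical swarm with a digital twin simulator for low- and high-fidelity simulations. As a rough illustration of what a low-fidelity swarm simulation involves, the sketch below models a few ground robots cooperatively covering a grid field. The grid size, robot count, and greedy nearest-cell behavior are all hypothetical choices for illustration, not details of the actual CASS design.

```python
import random

# Hypothetical low-fidelity "digital twin" sketch: a rectangular field is
# divided into grid cells, and each simulated ground robot repeatedly moves
# one step toward the nearest still-uncovered cell until the whole field
# has been scouted. All parameters here are illustrative.

FIELD_W, FIELD_H = 10, 10   # field dimensions in grid cells (assumed)
N_ROBOTS = 4                # number of simulated ground robots (assumed)

def nearest_cell(pos, cells):
    """Return the uncovered cell closest to pos by Manhattan distance."""
    return min(cells, key=lambda c: abs(c[0] - pos[0]) + abs(c[1] - pos[1]))

def step_toward(pos, target):
    """Move one grid step toward the target cell (x first, then y)."""
    x, y = pos
    if x != target[0]:
        x += 1 if target[0] > x else -1
    elif y != target[1]:
        y += 1 if target[1] > y else -1
    return (x, y)

def simulate(seed=0):
    """Run the coverage simulation; return the number of time steps taken."""
    random.seed(seed)
    uncovered = {(x, y) for x in range(FIELD_W) for y in range(FIELD_H)}
    robots = [(random.randrange(FIELD_W), random.randrange(FIELD_H))
              for _ in range(N_ROBOTS)]
    steps = 0
    while uncovered:
        for i, pos in enumerate(robots):
            uncovered.discard(pos)  # the cell under a robot counts as scouted
            if not uncovered:
                break
            robots[i] = step_toward(pos, nearest_cell(pos, uncovered))
        steps += 1
    return steps

if __name__ == "__main__":
    print(f"Field fully covered after {simulate()} simulated steps")
```

Even a toy model like this makes the scaling question concrete: rerunning with different robot counts shows how coverage time changes as the swarm grows, which is the kind of trade-off a higher-fidelity digital twin would explore before field deployment.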

This approach to smart agriculture, enabled by the CASS technology, could result in long-term benefits thanks to reduced waste through better logistics, optimal use of water and fertilizer, and an overall reduction in the use of pesticides. 

The research team believes that by utilizing smaller machines to reduce soil compaction and working to avoid herbicide-resistant weeds through nonchemical methods of control, significant ecological and environmental benefits can be achieved. 

Recent trends in smart agriculture have focused on large machinery, with the objective of maximizing product quantity and minimizing costs, an approach that has raised economic and environmental concerns. Lee said issues including soil compaction, a limited ability to address small-scale field variability and reduced crop productivity are some of the long-term problems that have emerged from this approach. 

By leveraging the flexibility of swarm robotics, the CASS system is intended to become a platform technology that can be configured to meet application-specific needs. 

"Current trends in precision agriculture and smart farming mostly focus on larger machinery or a single or a small number of robots equipped and programmed to perform highly specialized tasks," Lee said. "This project will serve as a critical pathway toward our long-term goal of establishing a deployable easy-to-use swarm robotic system that can serve as a universal platform for broad agriculture applications."

Although other systems employing swarm robotics exist, they are typically designed to perform just one specific task rather than being adaptable to a variety of situations. 

Moving forward, the team will address several challenges arising from the complex and varying scale of agricultural applications as it designs and implements the system. 

"Despite the great potential, swarm robotics research itself has been largely confined to low-fidelity simulations and laboratory experiments," Lee said. "These rarely represent the intricacies of an agricultural field environment. Also, human-swarm collaboration has not been extensively explored, and user-in-the-loop development and evaluation approaches are needed in particular for the target end-users — in our case, farmers."

Other investigators on the team include Dr. John Cason in Texas A&M AgriLife Research, Dr. Robert Hardin in the Department of Biological and Agricultural Engineering, Dr. Luis Tedeschi in the Department of Animal Science and Texas A&M AgriLife Research, Dr. Dugan Um in the Texas A&M-Corpus Christi Department of Mechanical Engineering and Dr. Mahendra Bhandari in Texas A&M AgriLife Research. 

 

The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow

