With respect to safe positioning, Tanner Whitmire (Hexagon | NovAtel) shared how positioning accuracy and integrity impact autonomous solutions, while Joaquin Reyes González (European GNSS Agency) explained why global navigation satellite systems (GNSS) are the preferred positioning option for autonomous robots.
On the topic of image analysis, Hajar Mousannif (Cadi Ayyad University) discussed how deep learning models can improve how robots “see” and analyze various objects, such as crops and weeds. Markus Höferlin (Farming Revolution) continued the discussion by explaining how deep neural networks can be used to train robots to more accurately identify objects and conditions, and thus, improve their overall performance. Barney Debnam (Microsoft) brought the focus back to the overarching question by offering a framework for understanding whether today’s technologies are reliable enough for full autonomy in agriculture.
The Importance of Accurate Safe Positioning
In agriculture, positioning systems have been around for quite some time. Today’s systems are mostly used for small-scale tasks. The advent of agricultural robots and driverless tractors, however, will require technology providers to think a little bigger.
“Positioning systems are built for the primary purpose of creating applied maps, such as seeding, planting, and yield maps, and for guiding a machine left or right,” says Tanner Whitmire, sales and business development manager in agriculture at Hexagon | NovAtel. “Over the last several years, the focus has really been geared towards the accuracy of the positioning system. We initially started out with accuracy within several meters and have been able to take it down to decimeter-level accuracy or centimeter-level accuracy.”
As the industry transitions to fully autonomous solutions, Whitmire continues, the simple algorithms and hardware used today will need to be upgraded. Position integrity and safe positioning are becoming increasingly important, which requires technology suppliers to tighten the errors they are willing to accept. Three types of positioning error are relevant to agricultural applications: positioning error, cross-track error, and along-track error.
Positioning error, Whitmire explains, is error in longitude and latitude. Next is cross-track error, where the machine drifts to the left or right while attempting to drive down a path. A small cross-track error, Whitmire says, means the machine is close to the midline; the greater the error, the harder the farmer’s job becomes.
The third is the along-track error where the vehicle drives down the path but is too far ahead or behind where it should be. All three of these errors are in relation to horizontal accuracy, but Whitmire believes the future will require technology companies to consider vertical accuracy and other variables, too.
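Cross-track and along-track errors can be made concrete with a little vector math. The sketch below is illustrative only (local planar coordinates, a straight path segment; the function name is ours, not Hexagon | NovAtel’s): it projects the vehicle’s position onto the path to split its offset into the two components Whitmire describes.

```python
import math

def track_errors(path_start, path_end, vehicle):
    """Split a vehicle's planar position relative to a straight path
    segment into along-track and cross-track components (illustrative)."""
    px, py = path_end[0] - path_start[0], path_end[1] - path_start[1]
    vx, vy = vehicle[0] - path_start[0], vehicle[1] - path_start[1]
    length = math.hypot(px, py)
    ux, uy = px / length, py / length   # unit vector along the path
    along = vx * ux + vy * uy           # how far down the path the vehicle is
    cross = -vx * uy + vy * ux          # signed offset; positive is left of path
    return along, cross
```

For a path from (0, 0) to (10, 0), a vehicle at (3, 2) is 3 meters along the path and 2 meters off to the left; the same decomposition applies on any heading.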
“Positioning in agriculture today has really been heavily focused on the accuracy of the position,” he says, “but as we begin transitioning to fully autonomous solutions, we will need to expand our focus to include position integrity.”
The difference between the two can be a little confusing, but Whitmire describes it in terms of confidence level. For the task position (so-called accuracy), “the accuracy is represented at a 95% confidence level.” For the assured or safe position (integrity), confidence must reach 99.999 percent.
“The task position represents a more precise, accurate position, as it is accounting for fewer errors,” Whitmire says. “It is a simpler software solution that allows us to maintain our level of accuracy. As we expand the integrity of that position, we have to account for more errors, which creates the challenge of having to advance our software.”
In order to move toward a safer, smarter, and more advanced system, companies that focus on positioning need to incorporate sensors, GNSS, and other technologies to help improve accuracy.
“By adding these types of technology and these features, we can increase our confidence to 99.999 percent,” Whitmire says. “This is essential as we transition to the fully autonomous solution and take the driver out of the cab. We need to make sure that we are 99.999 percent confident that the vehicle will not run into anyone or anything.”
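The jump from 95 percent to 99.999 percent confidence is larger than it sounds. Under a simple one-dimensional Gaussian error model (an illustrative assumption of ours, not a description of any vendor’s actual error model), the protection bound grows from roughly 2 standard deviations to about 4.4:

```python
from statistics import NormalDist

def confidence_multiplier(confidence):
    """Two-sided multiplier on sigma for a 1-D Gaussian error model."""
    return NormalDist().inv_cdf((1 + confidence) / 2)

sigma = 0.02  # e.g. a 2 cm standard deviation (illustrative number)
task_bound = confidence_multiplier(0.95) * sigma      # ~1.96 sigma
assured_bound = confidence_multiplier(0.99999) * sigma  # ~4.42 sigma
```

At the same underlying noise level, the assured bound is more than twice the task bound, which is one way to see why integrity demands more of both hardware and software than accuracy alone.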
The Importance of Signal Reliability
As robotics manufacturers and technology companies focus on improving positioning systems from a hardware and software perspective, many others are focused on the satellite navigation services that make accurate positioning possible. One such organization is the European GNSS Agency (GSA).
“We are the ones operating and looking for applications for these satellites,” says Joaquin Reyes González, marketing development technology officer for GSA. “What we are putting online is the Galileo signal.”
Galileo is Europe’s GNSS, but is only one of the solutions GSA has developed to ensure safe positioning. The agency also has the European Geostationary Navigation Overlay Service (EGNOS), which helps to improve the accuracy of GNSS, and Copernicus, the European Union’s Earth observation program, which is focused on “information services that draw from satellite Earth Observation and in-situ (non-space) data.”
“What is clear is that this satellite technology and all this information that we are receiving is giving us a new generation of farming,” González says. “The farmer is not alone.”
With Galileo—a system developed 20 years after traditional GPS—users benefit from a better experience. The first thing that sets this system apart is that it’s multi-frequency. Users have access to open-service multi-frequency, where messages are delivered using several channels to improve accuracy.
“Galileo satellites are transmitting the message, not only using one channel, but using several channels,” González says. “This results in better performance and a better position for the user because the receiver can receive exactly the same information coming from different channels.”
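One concrete benefit of receiving the same message on multiple frequencies is that the first-order ionospheric delay, which scales with 1/f², can be canceled outright. The sketch below applies the standard textbook ionosphere-free combination to pseudoranges on Galileo’s E1 and E5a frequencies; it is the classical formula, not a description of any specific receiver’s implementation.

```python
# Galileo E1 and E5a carrier frequencies (public signal-plan values).
F_E1 = 1575.42e6   # Hz
F_E5A = 1176.45e6  # Hz

def iono_free(pr_e1, pr_e5a):
    """Combine pseudoranges (meters) from two frequencies to cancel the
    first-order ionospheric delay, which scales with 1/f^2."""
    g = (F_E1 / F_E5A) ** 2
    return (g * pr_e1 - pr_e5a) / (g - 1)
```

Because the delay on E5a is a known multiple (g) of the delay on E1, the weighted difference removes it exactly to first order, leaving the geometric range.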
Galileo also offers open-service navigation message authentication (verifying that the navigation message comes from a genuine Galileo satellite), a signal authentication service (encryption ensuring the signal comes from the right place), and a high accuracy service (improving accuracy and precision). Combined, these features produce a highly accurate and precise system that users can trust to support safe positioning.
“If I have one message today, it is for users to consider using GNSS and not only GPS,” González says. “This is the most important thing.”
The Importance of Deep Learning
In addition to safe positioning systems that help agricultural robots navigate complex environments, image analysis is equally important for enabling these powerful machines to perform their jobs effectively. Eliminating or streamlining tasks is certainly an important part of democratizing farm work. Hajar Mousannif, associate professor at Cadi Ayyad University, founder of its master’s program in data science, and Golden Winner of the 2020 WomenTech Global AI Inclusion Award, is thinking even bigger.
“I gave a TEDx talk where I said I was haunted by a dream, a big dream, to change this world in some way or the other,” she says. “I’ve always been confident that technology and artificial intelligence have the power to make people happier. In the five years since my talk, the world has indeed changed, but people haven’t gotten any happier.”
COVID-19 has caused devastating impacts around the world. In Morocco, Mousannif’s home country, unemployment, social disparities, and poverty are on the rise. This impacted her outlook.
“It has taught me that making people happier can be achieved through creating cool tech, but it’s more about guaranteeing a decent living and ensuring people's basic needs.”
Agricultural robots can make a huge difference. According to Mousannif, who is leading a project with U.S.-based company FotaHub, Inc. to expand the capabilities of Shama, the first Moroccan-made humanoid robot, they already have.
“With the massive adoption of technology in the agricultural sector, we're not only allowing agriculture businesses to be more profitable, more sustainable, safer, and more environmentally friendly, but it will also ensure that farmers in rural areas make a decent living and get an education, instead of spending several hours per day doing highly repetitive and physically taxing labor that robots can efficiently do,” she says.
“In Morocco, we have succeeded in automating many tasks related to agriculture labor. Thanks to computer vision, many machines are now capable of identifying and sorting different fruits and vegetables according to their size and degree of maturity.”
This capability is the result of deep learning, an artificial intelligence function that helps computers and robots mimic the human brain. Specifically, deep learning imitates human intelligence, enabling machines to process information, reason, adapt to the environment, and solve more complex problems. In agriculture, it can help robots process images of plants and detect diseased crops. To be successful, these deep learning models need to be trained with data.
“We need to feed the algorithm with images that have the type and location of defects or diseases and let the model learn how to locate them after it has been trained,” Mousannif says. “It's worth mentioning that defects, whether it is a disease or a pest, can be challenging to detect even by human inspectors. It is also very time-consuming, so anything that is improving automation, increasing efficiency, and maintaining high quality in production is more than welcome.”
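The feed-labeled-data-and-let-the-model-learn loop Mousannif describes can be illustrated with a deliberately tiny stand-in: a logistic classifier trained by gradient descent on one synthetic feature. Real systems use deep networks on full images; the feature, numbers, and labels here are all invented for illustration.

```python
import math
import random

# Toy stand-in for the training loop described above (not the actual model):
# a logistic classifier learning to separate "diseased" from "healthy"
# samples using one invented feature (say, mean leaf discoloration, 0..1).
random.seed(0)
data = ([(random.gauss(0.2, 0.05), 0) for _ in range(200)] +   # healthy
        [(random.gauss(0.6, 0.05), 1) for _ in range(200)])    # diseased
random.shuffle(data)

w, b, lr = 0.0, 0.0, 0.5
for _ in range(300):                    # stochastic gradient descent on log-loss
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        w -= lr * (p - y) * x
        b -= lr * (p - y)

def predict_diseased(x):
    return 1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5
```

The model is never told where the decision boundary is; it locates it from the labeled examples, which is the essence of the approach Mousannif describes, scaled down by several orders of magnitude.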
When training deep learning models, environmental factors create challenges. The algorithms must be able to account for conditions like sunny and cloudy weather or plant shadows, and images with insufficient light or improper exposure reduce accuracy. Depending on the application, other image processing techniques can be used; sharpening, for example, can remove blur from some images, which, depending on the context, would otherwise be detrimental to a machine’s accuracy.
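Two of the corrective steps just mentioned, brightness normalization and sharpening, can be sketched on a grayscale image stored as a list of rows. This is a toy implementation for illustration; production pipelines would use a library such as OpenCV.

```python
def normalize_brightness(img, target_mean=128.0):
    """Shift pixel values (0..255) so the image mean matches a target."""
    mean = sum(sum(row) for row in img) / (len(img) * len(img[0]))
    shift = target_mean - mean
    return [[min(255, max(0, p + shift)) for p in row] for row in img]

def sharpen(img):
    """Apply a 3x3 sharpening kernel to interior pixels; edges are copied."""
    kernel = [[0, -1, 0], [-1, 5, -1], [0, -1, 0]]
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            acc = sum(kernel[a][b] * img[i - 1 + a][j - 1 + b]
                      for a in range(3) for b in range(3))
            out[i][j] = min(255, max(0, acc))
    return out
```

The sharpening kernel sums to 1, so flat regions pass through unchanged while local contrast, such as a leaf edge against soil, is amplified.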
There are some technologies that can help. Convolutional neural networks (CNNs) are trained to classify images in much the same way that humans do. Transfer learning is a machine learning method that addresses the problem of having few or no labeled images in the data set. Image segmentation can be used to classify, detect, and segment anomalies; in the agricultural world, this means distinguishing weeds and diseased plants from healthy ones. There are many people contributing to this important work, but the journey to highly accurate autonomous technologies is a long one. Mousannif remains focused on the end goal.
“By actively contributing to the AI field, we will not only ensure a better future for us, but also for other generations to come,” she says. “We need to put our efforts toward building tech that really matters, that solves real problems, that has a direct impact on people’s lives and, most importantly, that ensures inclusion and preserves our humanity.”
The Importance of Improving Computer Vision
Many of the most important farming tasks are relatively undesirable for humans, which is why the mass adoption of agricultural robots, as Mousannif says, will impact people’s lives in a big way. With its AI-powered robots, the “weeding as a service” supplier Farming Revolution (formerly Deepfield Robotics) is focused on a particularly tedious and often-dangerous job: weeding.
The robots drive autonomously through the field, using multi-spectral cameras to take images of what they encounter. Then, using artificial intelligence and deep neural networks, the robot identifies the crops and weeds. The weeds closest to the crops are then effectively removed without chemicals. It sounds simple, but Markus Höferlin, head of artificial intelligence at Farming Revolution, says the end result is the product of an incredible amount of behind-the-scenes work.
“In the beginning, one of the main challenges was the computer vision part,” he says. “So, why is this so challenging? If we look at industrial plants, where we also have a lot of computer vision tasks, we always see that we must constrain our environment to be able to reduce its complexity.”
A farm field, however, is completely unconstrained. There are different soil types and crops with leaves, and some of the crops overlap with weeds and with one another. The plants also tend to look different in the morning than they do at night and from one day to the next. An ever-changing outdoor environment further complicates the problem. Farming Revolution went to work.
“We are actually putting immense effort into capturing and labeling data in all the varieties that we can think of,” Höferlin says. “We captured data from more than 50 different fields. We started our capture campaigns at two o'clock at night, when it's completely dark and the plants are sometimes still closed, and in the morning, when the plants slowly open. We capture images into the afternoon and until the evening begins. So, we have all the different light conditions that we can think of.”
Farming Revolution also captured data in various weather conditions (dew, dust, mud) and seasons. The data was collected for five years. More than 65 different species were labeled with 99 percent per-pixel accuracy. The result was more than 12 million annotated images—a huge amount of training data. The artificial neural network would use this data to help the robot analyze and process the field images.
“This is what really makes a difference,” Höferlin says. “With this variety of data, we managed to go into the field, turn the robot on, and go for it. We were able to do this at an accuracy level of 99 percent, without needing to retrain the neural network, overcoming the problem of generalizing to unseen data.”
Even with this amount of data, however, Farming Revolution is continually working to capture more. The more data, the better the insights gained from qualitative analysis. It also enables the company to troubleshoot problems and improve the network over time.
“If a farmer has our robot, for example, and sees that on one particular field he is not happy with the results, he can tell us, ‘Please check what happened or make it work better here,’” Höferlin says. “The farmer can just upload the data to us. We can conduct evaluations on this data and see where we might have some problems, and if we see that, for example, the classifier underperforms, we can label the data according to what we learn and use it in our training set as well.”
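The loop Höferlin describes (upload, evaluate, relabel, retrain) can be sketched as a confidence-based triage step. Everything below, from the file names to the threshold, is invented for illustration and is not Farming Revolution’s actual tooling.

```python
# Hypothetical sketch of the feedback loop: pick out the uploaded images
# the classifier is least confident about, so a human can relabel them
# and they can be folded back into the training set.
def select_for_relabeling(predictions, confidence_threshold=0.8):
    """predictions: list of (image_id, predicted_label, confidence)."""
    return [img for img, _, conf in predictions if conf < confidence_threshold]

uploads = [("field7_001.png", "crop", 0.97),
           ("field7_002.png", "weed", 0.55),   # uncertain: send to labeling
           ("field7_003.png", "crop", 0.62)]   # uncertain: send to labeling
```

In practice the confidence scores would come from the network’s own outputs, and the relabeled images would flow back into the training set as Höferlin describes.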
The Importance of Continual Improvement
Safe positioning and image analysis are improving every day, but the biggest question remains: Are the technologies accurate enough to reliably work autonomously? Barney Debnam, director of agribusiness solutions for Microsoft, says it depends.
“It depends on the problem that you’re attacking with autonomy,” he says. “It depends on the risk profile of that particular agricultural operation.”
Debnam describes three main risk profiles that need to be accounted for: risk to humans, risk to crops or livestock, and risk to the environment.
Thinking about these risk profiles is particularly important for the companies developing solutions, Debnam says. A robotic system that follows a picker and provides a basket to carry the commodities, for example, is a different scenario than a sprayer that may apply a pesticide in a vineyard above a water area that delivers resources to the public.
“For all of us who are designing the autonomous systems and thinking about risk, it’s important to qualify the risk and understand the impact,” Debnam says.
There are several frameworks for assessing whether the technology is robust enough to deliver confident, safe outcomes. Debnam specifically names the American Society of Agricultural and Biological Engineers, The Journal of Agricultural Safety and Health, the National Institute of Standards and Technology, and the ENISA tool from the European Cyber Security Act. He encourages technology providers to use these resources. This is one of the ways technology suppliers can build trust with their customers.
Another way is to address what Debnam calls “basic surveillance.” This is a desire from food companies and consumers for more transparency. A lot of today’s solutions are focused on performing a task, rather than providing information that a certain production process happened at a specific time on a specific day. When robots become fully autonomous, customers will expect more clarity on how they work and what they do.
The goal is to create a situation where everyone thrives. Mousannif expresses this best:
“We can use artificial intelligence, and we can use technology, but what’s the use in technology if we cannot create value, and we cannot make this world a better place to live?” she says. “AI and technology in general have to go hand-in-hand with social impact and value creation.”