AI and the Future of Agriculture

Robots rolling through fields of corn or soybeans, taking out weeds without chemicals or physical labor? It may sound like a dream for family farmers, but the impact will be felt far and wide, across both geography and industry.

AI Agricultural Robotics with Girish Chowdhary & EarthSense

The future of agriculture includes wider availability of artificial intelligence and robotics to help provide greater sustainability and address growing labor shortages. Professor Girish Chowdhary and his team at EarthSense Agricultural Intelligence sat down with us to discuss these topics on our podcast, Illinois Innovators.

SPEAKERS

Girish Chowdhary
Professor, Agricultural & Biological Engineering, University of Illinois Urbana-Champaign

Chinmay Soman
Co-founder and CEO, EarthSense Agricultural Intelligence

Estefany Chavez-Ruiz
Industrial Design, EarthSense Agricultural Intelligence

To start us off, can you provide some background on how AI has enhanced agriculture and how it is currently impacting the industry?

GIRISH – There are at least two ways in which it has been impacting the industry. One is in converting agricultural data into actionable information. There has been a lot of work on satellite data that is converted into maps farmers can use as prescriptions. The second, more emerging way, is in helping the farmer get more out of the equipment they have and, in some cases, get more done than they can with the available labor pool. For example, there are the tractor auto-track systems that John Deere has implemented, or GPS-based guidance that allows us to have this perfect row-crop agriculture. And then there are robots that my group and others are developing, to create new tools for farmers who are facing a labor crisis.

Your team was recently granted $125,000 from the Discovery Partners Institute for the Center for Research on Autonomous Farming Technologies. Can you describe the center and explain more about the work being done there?

GIRISH – We created the Illinois Autonomous Farm as a collaboration between the Agricultural & Biological Engineering Department and the Center for Digital Agriculture, which is a cross-college center between The Grainger College of Engineering and the College of Agricultural, Consumer and Environmental Sciences (ACES). The idea is for the autonomous farm to be a testbed for all of these AI technologies that we are developing at the university, and, of course, with any collaborators who wish to join.

So, the farm overall has three main projects. One is improving sustainability in commodity crops: corn and soybeans. Examples include the weeding robots and other types of robots that we’re working with. The second is plant manipulation. These are projects that Professor Girish Krishnan from ISE is leading, in which they’re creating robots that can harvest berries and nuts using a combination of hard and soft arms. The third is soil sensing and resilience. Working with Supratik Guha from the University of Chicago, we’re creating the next generation of soil sensor networks that can measure the status of the soil and provide more information.

The idea is to bring that technology and test it in real-world conditions, not just in the lab. DPI was very generous to support this activity, and we’re really grateful for their support because it will enable us to work better with Illinois farmers and industry and retain some of our talent here in Illinois in order to build the next generation of AI and robotics companies in the state.

The USDA recently announced a new center here at UIUC as well: the AI Institute for Future Agricultural Resilience, Management and Sustainability (AIFARMS). What is it about Grainger Engineering that is driving this work?

GIRISH – It’s collaboration. AIFARMS was a truly collaborative project, led by Vikram Adve, a computer science faculty member, along with Lisa Ainsworth of the USDA ARS. The co-PIs included Supratik Guha and Todd Mockler, plus 40 other really great faculty distributed across the Grainger College of Engineering and ACES, and, of course, the University of Michigan, UIC, Tuskegee University, and the Danforth Plant Science Center.

It was a true coming together of people who work more in computer science and engineering disciplines and people who work more in agricultural disciplines. We found an intersection where people working on AI were interested in making an impact on ag, and they were complemented by people working on ag problems who saw the need for AI.

As we combined these, we were able to write a really compelling story for NSF and USDA, and as a result we are one of the seven AI institutes, of which two are for agriculture. The competition was pretty tough. We’re really excited for the future of AI and ag overall at Illinois. And I think a lot of the credit goes to the last two years of effort by both the Grainger College of Engineering and ACES coming together, for example, around the joint crop sciences and computer science degree.

How long have you been working on ag robotics with EarthSense and where is it going?

GIRISH – Labor is the perennial challenge in agriculture. I think if you had asked farmers 600 years ago what their biggest problem was, they would have said it’s hard to find labor. Because of that, tools have been developed to make farmers more productive. More recently, we found chemicals, and with chemicals we’re seeing some issues. One of the first things I learned here came from Adam Davis in Crop Sciences, who told me about herbicide-resistant weeds and how they’re becoming a growing problem. We thought we’d take our robotics technology and try to make something that would work on this herbicide-resistance problem. Our philosophy was to keep the robot as simple and as small as possible so that the logistical problems are reduced. With the help of EarthSense, we’ve been able to take this technology from the lab and do the real thing.

ESTEFANY – One of the things I’ve had the luxury of working on is talking to farmers in the real world who actually have to face these issues of labor shortages and how to grow and be profitable. There’s a lot of research and technology that people throw at them, but it needs to be something they can actually act upon, because if not, there is no real value for them.

CHINMAY – That’s always been the driving philosophy at EarthSense and, by the same token, it goes back to the land-grant mission of the university: identify real problems that people around the state and the world are facing, and figure out what we can do at the cutting edge of science to create new technologies that solve those problems in the best possible way.

We got a huge boost from the Department of Energy’s ARPA-E program, which funded the research grant. That grant wasn’t specifically about doing things in the field with farmers; it was about creating robots and AI that are suitable for, and specifically designed to work well in, these outdoor, large farm environments. That was for creating high-throughput field phenotyping technology: essentially going out and measuring thousands and thousands of plants in order to create the next varieties of bioenergy or food crops.

That’s how we got an initial start making these small robots. ARPA-E had set it up so that we didn’t just do research at the university, but also actively looked for problems we could solve for farmers in the public or private sector. EarthSense partnered with the university as the technology-to-market partner, and we went out and talked to a lot of people across the agricultural value chain and asked: if we had this small, compact, low-cost robot collecting this data, would that be useful to you? What kind of sensors should we have? What kind of autonomy capabilities? What kind of AI should we develop? That led to a cycle of understanding what people need in the long term and, conversely, what we can solve immediately, then iterating on that and going from strength to strength. It’s allowed EarthSense to make better and better robots, as well as bring in more grant money.

GIRISH – It’s very rewarding to work with our students, who obviously have been trained by one of the best universities, but also feel very strongly connected to the land here. A lot of our students come work with us because they really like living in Champaign, Illinois, and want to build the next big robotics company here. And just to follow up on what Chinmay said, we talked to a lot of farmers, and the farmers kept telling us: drones are great, but can your drone fly under the canopy? Can it go inside the field and get me all the data that I can’t see from the sky? And that was kind of the genesis of the robot.

Robot navigates corn field as Chinmay watches

What are farmers saying about this new age, in relation to ag engineering, AI, and environmental impact?

ESTEFANY – It obviously needs to be economically viable: farmers need to make a profit. We need to make sure that the technology is at a price that they can afford. The other issue is that the work, especially in Illinois, is very seasonal. Are you going to get very qualified workers who are going to come when needed and be able to manage all the equipment? Are you going to hire them for winter? That’s also not the best choice for farmers. So robotics is a way that farmers can get the labor that they need when they need it.

How do you anticipate rolling ag robotics out for farmers?

GIRISH – We created a framework with Katie Driggs-Campbell in Electrical & Computer Engineering that can help people think about the levels of autonomy for multi-robot operation. There are already defined levels of autonomy for the car and driver relationship, but those don’t extend to this idea of one person with many robots, right? So we defined different levels. Level one is basically hands off; level two is eyes off, which means you don’t have to look at the robot anymore, it can just be doing its thing; level three is when one person can handle many robots. Level four is when you no longer have to be in the field: you’re back at home and the robots are working. Level five is when you pretty much don’t program the robots anymore; you just tell them, “weed my field,” and they go and do it.

Current autonomy on our tractors, seeders, and combines allows farmers to plan a path for the field thanks to GPS and other sensors, but they still have to be in the tractor to make sure it’s operating properly. As we increase the levels of autonomy, the human’s involvement becomes less and less.
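For readers who want the taxonomy at a glance, here is a minimal sketch of those levels expressed as a Python enum. The level names and one-line descriptions are our paraphrase of Girish’s explanation above, not definitions taken from the published framework, and the identifier FieldAutonomyLevel is our own illustrative naming.

from enum import IntEnum

class FieldAutonomyLevel(IntEnum):
    """Illustrative paraphrase of the field-robot autonomy levels described above."""
    HANDS_OFF = 1      # Level 1: the operator no longer steers, but still supervises closely
    EYES_OFF = 2       # Level 2: the operator no longer has to watch the robot continuously
    ONE_TO_MANY = 3    # Level 3: one person supervises many robots at once
    OFF_FIELD = 4      # Level 4: supervision happens remotely; nobody needs to be in the field
    GOAL_DIRECTED = 5  # Level 5: the operator states a goal ("weed my field") instead of programming tasks

# Today's GPS auto-steer on tractors sits near the bottom of this scale.
assert FieldAutonomyLevel.HANDS_OFF < FieldAutonomyLevel.GOAL_DIRECTED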

A big challenge currently is figuring out how that actually works in the real world. There isn’t a piece of software that I know of that doesn’t have a problem or doesn’t crash, right? No matter what you do, even airplanes, which have very expensive redundancy management systems, still require a pilot in the cockpit, and even drones are remotely operated. So, going higher and higher in the levels of autonomy requires not just managing the liability of failures, but also reducing the cost. At the end of the day, if I make a level three autonomous system that’s just too expensive for the farmer to afford, then it’s not really going to make an impact.

Then combine that with the vision of carbon-free or carbon-neutral agriculture: I think that’s a much bigger problem, and we are only addressing a tiny drop of it. But if we switch to robots that mainly run on solar energy, with their batteries charged by solar panels, like we do on our farm, that’s the vision.

CHINMAY – Illinois farmers are already some of the best farmers in the world; our yields are basically among the highest anywhere. So we have to remember the problem that we’re trying to solve. Farmers have already invested in fantastic amounts of technology; I wouldn’t be able to drive a modern tractor. So the philosophy at the university, as well as at EarthSense, has always been to figure out: what large equipment investments do farmers already have? What are they really good at doing?

The large agricultural technology companies have been working for the past hundred years, basically, to create the best tractors they can. We are not trying to replace the tractors or change the things that farmers are already good at doing: planting, harvesting, and applying fertilizers or pesticides at large scale. What we are trying to do is make sure farmers have the full set of tools available to them to improve their efficiency even further. Because on economic metrics, farmers are generally not doing so well. There’s a year here and there where farmers’ incomes are good, but mostly, farmers have to really struggle to keep the ship going. And what we think is needed to improve the economic profitability, as well as the long-term viability, of the agricultural enterprise is to create new forms of agricultural equipment that improve overall efficiency.

It all comes down to farmers: sitting down with them and understanding their workflow. Where it breaks down a little is that there aren’t enough people to go through the fields. When a farmer is managing 2,000, 3,000, 10,000 acres, they just don’t have enough people to walk those fields and ask: okay, can I improve the yield on this field a little bit? Do I need to apply more nitrogen? Where do I need to apply fungicide or pesticide?

We had a good, solid 30-year run of being able to use glyphosate, being able to use Roundup, but that’s becoming a problem as well. We used Roundup and other herbicides to the extent that nature has started fighting back: weeds are becoming more and more resistant, and they’re spreading more rapidly. That’s causing some farmers to hire a chopping crew at $65 per acre. Doing that doesn’t make sense for that year, but if they want to keep that field viable for the future, they have to bite the bullet.

Those are the kinds of problems large equipment and satellite data are not going to be able to solve.

Chinmay instructs student on how to control robot

What is the biggest challenge in the engineering of these modular robotics?

GIRISH – Robots are constrained to be on the ground, and they can get more done because of that, but they have to deal with a lot of obstacles; they aren’t flying over them like a drone. When we started designing the robot, we started from the bottom up. We asked: what’s the lightest robot we can make that’s the easiest to control and to make autonomous? And we built it up from there. We plan to make it as big as it needs to be, but we won’t make it big just because there are big tractors. We believe that small robots can do a lot of work.

So our engineers figured out ways to pack a lot of power into those small machines. Plus, they’re very rugged. One key engineering challenge was designing the form factor. The second, bigger challenge is development. It’s now the job of industry, not big industry, but nimble, young, entrepreneurial companies like EarthSense, to take it from there and really make it happen with customers. A focus is on doing what Estefany’s doing: listening to customers and building the product around that.

What we have to do now at the university is think about the AI. If one person is controlling many robots, that person is no longer in the field. The robots have to be smart enough to understand how they’re failing, where they’re failing, and what they can do to recover. Because we build small robots, the consequences of failures are reduced. If the robot makes a mistake, maybe it hits the corn, right? But it can recover. Whereas if big equipment makes a mistake, it could be really costly.

Now that we’ve created this machine that is easier to control, what AI do we need for these machines to go days, or even months, between interventions? How can these machines learn from each other so they can be deployed across multiple fields and multiple crops? How can they get their job done in the best way? How can they plan, not just to go up and down the rows, but to really figure out the hotspots in the field and treat them accordingly?

Where will this field be in the next five or 10 years?

GIRISH – One of the goals of AIFARMS is to identify the foundational research problems. We found about 17 of them, including AI for control, AI at the edge, AI with multiple different data sources, and so on. As we advance the foundations of that AI, it’s going to have very broad impact.

Then, as we make more and more intelligent and robust robots that operate in outdoor environments, that’s going to have an impact on other domains, such as disaster recovery, mining, construction, monitoring, and defense patrolling: all kinds of domains that require outdoor robots. Right now, robots are mostly confined to the factory floor. This is a technology that liberates them; it will take robots out into the world.

Then, of course, how we change agriculture will really shape how AI can be used to change other industries: older industries that are very successful and foundational to our existence, but could use a helping hand.

***