Purely on the Promise That We Could Do Great Things

There are beautiful days in central Illinois in mid-October. Days when the sun’s shining and the air’s just crisp enough to remind you it’s Fall. Football weather. Bonfire weather. Trip to the orchard weather. Or – if you’re a Grainger Engineer and you’ve already done your apple picking for the season – weather that’s perfect for visiting an autonomous farm.

Spring 2021

This wasn’t one of those days. Forty-five degrees, overcast, starting to rain. We put on our hats, gloves, and jackets, anyway. With a quick check of the Safer Illinois app to verify that we had up-to-date COVID tests and after a reminder to keep our masks on, we headed out with affiliates of the University of Illinois’ Center for Digital Agriculture on a tour.

The Center for Digital Agriculture is helping researchers, educators, farmers, and industry keep pace with enormous changes to the world’s food systems and improve how they feed and support a growing global population. It was launched in 2018 with a $2 million investment by the U of I.

“This investment was purely on the promise that we could do something great that truly spanned disciplines. The university has always placed a great premium on collaboration, and you see that with a whole universe of giant, highly collaborative institutes that we’ve built over the last 50 years like the Coordinated Science Lab and the Beckman Institute,” said Computer Science Professor Vikram Adve, who heads the Center for Digital Agriculture along with Crop Sciences Professor Matthew Hudson.

The autonomous farm – recently established by the Center for Digital Ag and the Department of Agricultural & Biological Engineering – is currently about four acres. That’s more of a garden by central Illinois standards. But it’s part of the University of Illinois Urbana-Champaign’s vast “South Farms,” so there’s ample room to grow. It’s planted mostly in corn, but it also includes more delicate, tougher-to-pick fruits and vegetables like tomatoes and strawberries. There were even some potatoes in the ground this fall.

Our tour was focused on a set of robots that are bringing autonomy to agriculture. The low, four-wheeled vehicles, not much bigger than your office trash can, are outfitted with cameras, LIDAR, and GPS that allow them to navigate the rows of corn. Because they can travel the crops with limited human guidance, they can take on a variety of jobs like high-throughput field phenotyping, weeding, and picking. They’re still prototypes in many cases, but faculty members are already using the robots in genomics research and spinning them off into commercial products. (See “AI and the Future of Agriculture” in this issue of Limitless for more about Professor Girish Chowdhary’s company EarthSense.)

“A field is so much harder than a factory,” Professor Nancy Amato said as one of the robots bumped through the mud. Amato is head of the Computer Science Department and the Abel Bliss Professor of Engineering. She’s a leader in what she calls “simple, abstract methods and algorithms” for robots and automation.

“We have the ability to get robots out of very structured areas like a factory and into areas with dynamic changes and uncertainty. That’s where we can make real breakthroughs. Places where we’re applying things outside of our own research areas.”

The team expects to work across the gamut of CS-related areas.

“We know there are enormous changes coming to agriculture and that those are going to be driven by a greater role for digital technologies – AI, automation, machine learning, sensors, cloud computing,” said Adve, who is also the Donald B. Gillies Professor in Computer Science.

Farmers and other agriculture experts have often been early adopters of digital technologies, according to Adve. That comfort with technology shows up in everything from “self-driving” tractors to variable-rate application techniques that let farms apply water, fertilizer, pesticides, and herbicides precisely where and when they’re needed. It’s still relatively uncommon for computer scientists to embed in agriculture, however.

Hudson, co-director of the Center for Digital Agriculture, believes we are living in a moment particularly well suited to closing that gap. He pointed to the genomics revolution as an illustration.

Twenty years ago, genomics, bioinformatics, and molecular biology were what he called “data-limited” fields. Mapping an organism’s genome – simply generating the data necessary to identify a genetic modification that would improve crop yield or to identify a target for a drug that might cure disease – was laborious, slow, and expensive. As a result, “most research in agriculture, at that time, didn’t require powerful computers or advanced computational methods. Now, however, we’ve gone from data-limited to data-analysis limited.”

“The genomics revolution is almost complete. It’s no longer a research effort. We can produce the data and so we can build tools to analyze it and create applications that take advantage of it,” Hudson said. “The next boom like that is in AI, robotics, and data science applied more generally to agriculture.”

The Center for Digital Agriculture anticipates that digital transformation of farming, and it positions the University of Illinois Urbana-Champaign to lead it. AIFARMS – the Artificial Intelligence for Future Agricultural Resilience, Management, and Sustainability Institute – is an early indication of what UIUC is capable of in the field.

AIFARMS brings together world-class scientists, PhD students, postdoctoral researchers, extension specialists, and diverse industry partners to address major agricultural challenges such as labor constraints, animal health and welfare, environmental crop resilience, and soil health. That’s where the autonomous farm comes in – reflecting a world in which low-cost AI-driven systems enable breeders and farmers to achieve large improvements in yields and profitability with minimal or even positive environmental impacts. The institute combines deep research expertise with strong education and outreach programs in digital agriculture to grow a diverse workforce with AI skills, reach rural and other underserved populations, and create a global clearinghouse to foster community-wide collaboration in AI-driven agricultural research. It includes partners from Michigan State University, Tuskegee University, the University of Chicago, the University of Illinois, the Danforth Plant Sciences Center, EarthSense Inc., IBM, and Microsoft.

AIFARMS is funded by $20 million from the National Artificial Intelligence Research Institutes program. The program, a joint effort between the National Science Foundation and the U.S. Department of Agriculture’s National Institute of Food and Agriculture, supports AI research that impacts and improves society. The program supported seven new AI institutes in 2020, and Grainger Engineering faculty lead two of them. In addition to AIFARMS, Professor Huimin Zhao, Steven L. Miller Chair of Chemical & Biomolecular Engineering, leads the $20 million NSF Molecule Maker Lab Institute.

“We know that AI has incredible power to alter and improve human capacity. It can improve crop yields, help address labor shortages, reduce large-scale farming’s negative environmental impacts, and increase profitability for small-scale farming,” Adve said. “The core question is how?”

That’s the question the Center for Digital Agriculture, AIFARMS, and UIUC’s autonomous farm will be answering for years to come.

AI Agricultural Robotics with
Girish Chowdhary & EarthSense

The future of agriculture includes wider availability of artificial intelligence and robotics to help provide greater sustainability and address growing labor shortages. Professor Girish Chowdhary and his team at EarthSense Agricultural Intelligence joined us on our podcast, Illinois Innovators, to discuss these topics further.

SPEAKERS

Girish Chowdhary
Professor, Agricultural & Biological Engineering, University of Illinois Urbana-Champaign

Chinmay Soman
Co-founder and CEO, EarthSense Agricultural Intelligence

Estefany Chavez-Ruiz
Industrial Design, EarthSense Agricultural Intelligence

Like a Thumbprint

AIFARMS research predicts corn’s resilience to climate change.

Less than a mile southwest of the autonomous farm, as the drone flies, there’s another test plot. If you don’t have a drone, take Race Street to Windsor Road to First Street. You’ll be there in five minutes.

Individual circles of corn about 70 feet in diameter cover the plot. Each is ringed with a set of tubes, stacked from ground-level to a few feet in height. (Imagine the wooden fence around a horse paddock, but instead of wooden slats, think of what looks a lot like garden hoses running between the fence posts.) Sensors and electrical boxes pop up at intervals. Flags fly and anemometers spin to track the wind speed.

It’s called the Free Air Concentration Enrichment Experiment or FACE. On this plot, researchers from the University of Illinois and the U.S. Department of Agriculture grow crops under typical field conditions while altering the levels of carbon dioxide and ozone, the air temperature, and the water available in the soil. For more than 20 years, they have used it to track how changes to these conditions impact the physiology, molecular make-up, and genetics of corn and soybeans.

“It’s a climate change experiment,” said Dr. Lisa Ainsworth, a plant physiologist with the USDA’s Agricultural Research Service and a member of the National Academy of Sciences. Ainsworth and her team do much of their work using FACE in partnership with many UIUC researchers. “We study how crops respond to climate change, and then we study different ways that we can either breed crops to increase their productivity in a more polluted world or identify alternate systems to put in place.”

In previous NSF-funded research, Ainsworth grew about 200 types of corn with different genetic profiles under elevated ozone conditions. The team then used hyperspectral cameras to capture the range of sunlight wavelengths that the leaves of those corn plants reflect. Called reflectance spectra, that data can tell researchers how much nitrogen or chlorophyll is in the leaves, which serves as an indicator of the overall health of the plant. The better a corn plant’s health under elevated ozone, the hardier that strain is likely to be as climate change increases the natural ozone in the air on any given farm around the world.

As part of AIFARMS, Ainsworth is working with USDA’s Dr. Carl Bernacchi, Professor Kaiyu Guan of the Department of Natural Resources & Environmental Sciences, Professor Jingrui He of the School of Information Sciences and Department of Computer Science, and Professor Andrew Leakey of the Department of Plant Biology to apply new machine learning techniques to the process of predicting plant health traits from reflectance spectra. Machine learning allows them to take the spectral and genetic data from, say, 200 corn samples and infer the resilience of other strains of corn. It also allows them to infer other important traits about the crops, such as photosynthetic capacity or leaf mass.
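
The article doesn’t detail which models the team uses, but the core idea – mapping a leaf’s reflectance spectrum to a trait such as nitrogen content – can be illustrated with a standard regression. The sketch below is a minimal, hypothetical Python example using partial least squares regression, a common baseline for spectral data, on simulated spectra; the data, trait values, and model choice are assumptions for illustration, not the AIFARMS pipeline.

```python
# Minimal sketch: predicting a leaf trait (e.g., nitrogen content) from
# hyperspectral reflectance spectra. Hypothetical data and model choice --
# partial least squares (PLS) regression is a common baseline for spectra,
# not necessarily what the AIFARMS team uses.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Simulated dataset: 200 plants x 1,000 wavelength bands (the "thumbprint"),
# plus a trait value for each plant (e.g., leaf nitrogen, arbitrary units).
n_plants, n_bands = 200, 1000
spectra = rng.normal(size=(n_plants, n_bands))
true_weights = rng.normal(size=n_bands) * (rng.random(n_bands) < 0.05)
trait = spectra @ true_weights + rng.normal(scale=0.5, size=n_plants)

X_train, X_test, y_train, y_test = train_test_split(
    spectra, trait, test_size=0.25, random_state=0
)

# PLS compresses the thousands of correlated bands into a few latent
# components before regressing on the trait.
model = PLSRegression(n_components=10)
model.fit(X_train, y_train)

predictions = model.predict(X_test).ravel()
print(f"Held-out R^2: {r2_score(y_test, predictions):.2f}")
```

The appeal of this kind of approach is that once a model is trained against ground-truth measurements, the same spectra-to-trait mapping can be applied to imagery collected at larger scales – which is what the multi-resolution work described below builds on.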

These insights can then be used by seed companies to make decisions about how they’re going to create the next generation of crops that are resilient to climate change.

“The models are so complex,” Ainsworth said. “You might have thousands of signals that make up the reflectance spectra. It’s almost like a thumbprint. A thumbprint is made up of a bunch of swirls and ridges put together to make it look like it does. There are a thousand pieces of information inside that single fingerprint.”

The information from those thumbprints can be captured at a variety of scales. Drones, airplanes, and satellites all routinely fly over farms and can be rigged to capture that sort of hyperspectral data. Ultimately, the team wants to be able to use their AI techniques with high-resolution data that covers a very small area – like the 70-foot circle in the FACE experiment – or with lower-resolution data that covers a much larger area, such as the broad swath of a region that can be captured by a satellite 1,200 miles above the earth’s surface. And fusing data at multiple resolutions is yet more powerful.

Artificial intelligence will also help the team apply their technique across a range of species and field environments. “We’d like what we measure on our field to be transferable to my parents’ field in Mason County to be transferable to Iowa to Nebraska,” Ainsworth said. “That’s the sort of research that we can tackle with AIFARMS.”

Starting from Scratch

AIFARMS research targets the ‘health-welfare-performance trifecta’ on commercial livestock farms.

Another AIFARMS team – this one focused on livestock – conducts some of its research in three swine labs spread across the South Farms. But to really get a sense of things, one of the thrust’s leaders, Professor Angela Green-Miller, recently took a group of computer science faculty and students to an Illinois-based commercial pig producer about an hour from Champaign-Urbana. Part of the Department of Agricultural & Biological Engineering, Green-Miller designs housing systems and management strategies that simultaneously improve animals’ welfare and use farm resources more efficiently. During her career, she’s worked with horses, laying hens, pigs, cattle, and lab animals.

She’s tagged them with RFID sensors, tracked environmental conditions in their transport trailers, and even fed them temperature-sensing “pills” to track their body temperatures. She uses the animal as a biosensor to understand how its environment influences its health-welfare-performance state.

“We can take our collaborators to our research farms, but that doesn’t really give you an idea of the scope or scale of the space we’re looking to solve problems in,” Green-Miller said.

The team includes Narendra Ahuja, who is a Donald Biggar Willett Professor Emeritus in Electrical & Computer Engineering; Professor Brian Aldridge from the College of Veterinary Medicine; Professors Isabella Condotta and Ryan Dilger from the Animal Sciences Department; and Professors Vikram Adve, Heng Ji, and Alexander Schwing (a WJ Jerry Sanders III Faculty Fellow) from the Computer Science Department. Olga Bolden-Tiller, head of Tuskegee University’s Department of Agricultural and Environmental Sciences, is also a core member.

Most had never visited a farm like this one, and they were in for a surprise. The farm is home to 6,000 sows. Each gives birth to about 14 piglets twice a year. The farm has six employees, according to Green-Miller. I was surprised by that too, so I had her repeat it. “Six employees. That’s not an unusual number. It’s not that they don’t want to hire more people. It’s very, very hard. The workforce is just not there.”

“Imagine on a farm of that size,” she continued, “how many seconds of the day you lay eyes on an animal. It’s not very much.”

That lack of staff – that lack of eyes on animals – is central to the team’s AIFARMS work. It’s not about taking people off the farm. There’s plenty of work to be done. It’s about building computer vision tools and techniques to better inform those employees of where they need to be and what tasks need to be done in that space.

Based on decades of experience on both the artificial intelligence side and the animal health, welfare, and performance side, they are developing techniques for placing cameras in livestock facilities such that they are robust and provide the appropriate amount and type of footage to make smart decisions. They are also creating edge computing methods for processing that footage and moving as little of it as possible to the cloud. Finally, they are developing algorithms to manage the flow of that data and to make successful predictions about the animals, and, more specifically, the animals’ care needs.
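
The article describes this architecture only at a high level – process footage close to the cameras and push as little as possible to the cloud – so the following is a hypothetical sketch of that pattern rather than the team’s actual system. The toy “detector” here just flags unusually still scenes from frame-to-frame differences; a real deployment would run trained vision models on dedicated edge hardware.

```python
# Hypothetical sketch of the edge-computing pattern described above: analyze
# barn-camera footage locally and send only compact, high-confidence event
# records upstream, since rural connections can't carry raw video to the cloud.
# The detector below is a toy frame-difference check, not the team's models.
import time
from dataclasses import dataclass

import numpy as np


@dataclass
class Event:
    camera_id: str
    timestamp: float
    label: str          # e.g., "low_motion" might hint at an unusually still pen
    confidence: float


def toy_detector(prev_frame: np.ndarray, frame: np.ndarray) -> list[Event]:
    """Stand-in for an on-device vision model: flags unusually still scenes."""
    motion = np.abs(frame.astype(float) - prev_frame.astype(float)).mean()
    confidence = max(0.0, 1.0 - motion / 10.0)   # less motion -> higher score
    return [Event("pen_cam_01", time.time(), "low_motion", confidence)]


def send_upstream(event: Event) -> None:
    """Stand-in for uploading a small event record (not the video) to the cloud."""
    print(f"upload: {event}")


def edge_loop(frames, confidence_threshold: float = 0.8) -> None:
    """Run the detector on each frame; only flagged events leave the farm."""
    frames = iter(frames)
    prev = next(frames)
    for frame in frames:
        for event in toy_detector(prev, frame):
            if event.confidence >= confidence_threshold:
                send_upstream(event)
        prev = frame


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    static = rng.integers(0, 256, size=(120, 160))
    frames = []
    for i in range(50):
        if 20 <= i < 26:   # a stretch of nearly identical frames (a "still" pen)
            frames.append(static + rng.integers(0, 2, size=(120, 160)))
        else:              # otherwise, normal frame-to-frame change
            frames.append(rng.integers(0, 256, size=(120, 160)))
    edge_loop(frames)
```

Only the tiny event records – a camera ID, a timestamp, a label, and a confidence score – ever leave the barn, which is what makes this kind of approach workable on rural internet connections.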

These AI methods will help predict the health, welfare, and growth of the animals. They will also help count the pigs as they are moved from space to space on the farm. Computer vision will be used to help train the workforce, as well.

“We’re looking for the health-welfare-performance trifecta,” Green-Miller said. “For example, if you have an animal that is suddenly resting or sleeping a lot, that can be an indication that that animal is experiencing some sort of a challenge. If it is breaking with an illness, we can get employees to that space early. Or, just before and after the weaning process, that’s a very delicate period for development. We’re going to initially focus there in order to identify opportunities to build resilience. If we can identify animals that aren’t coping well, we can apply interventions and improve outcomes for animal welfare and production in very positive ways.”

There are a lot of hurdles to meeting those goals. Cameras get in the way of workers or their line of sight is obscured by the movement in the barn. Pigs are tough and tend to destroy things. Internet connectivity in rural areas isn’t fast enough to simply pump every bit of video to the cloud for processing. In other words, a farm just isn’t a research lab.

“The really cool thing in AIFARMS is that we have tremendous techniques that we’ve used in research, but we’re not trying to bring the tools we’ve already developed and try to make those work,” Green-Miller said. “Because they’re not going to work. They’re not built for that. We’re taking the experience of our team to develop tools specifically for commercial use. There have been a number of technologies that have failed because they were more burden on the producers than the problem that they solved.”

“Instead, we’re going to start from scratch and incorporate input from producers as we go. We don’t want to build something they don’t want. We want to solve a problem that’s a real problem and build a tool in a way that meets the needs of industry.”

More CS People Getting into Ag

Starting in the fall of 2021, incoming first-year students will be able to enroll in a program that combines computer science and animal science.

“The way the industry is moving, our students need experience handling large datasets, bioinformatics, genomic information, and data from remote sensors. Having a background in coding, programming, and advanced statistics will make them highly sought-after in today’s market,” said Professor David J. Miller from the Department of Animal Sciences at the University of Illinois.

The CS+Animal Sciences program is just getting under way. But The Grainger College of Engineering has been addressing the changing reality in industry that Miller describes for several years now. Collaborating with the College of Agricultural, Consumer, & Environmental Sciences, it offers a CS+Crop Sciences degree that launched in 2018.

In all, the Department of Computer Science works with 11 departments to offer CS+X degrees in a variety of fields. They offer roughly equal numbers of credit hours in computer science and the “plus X” discipline that a student is working in. Nearly 1,000 students are currently enrolled in CS+X degrees across the university.

The Center for Digital Agriculture is also developing a master’s degree in digital agriculture. It will be mostly online and will advance the technical skills of both new students and working professionals.

Meanwhile, the Department of Computer Science recently launched a one-year certificate program focused on broadening participation in computing.

Called the Illinois Computing Accelerator for Non-specialists, iCAN is for students who have a bachelor’s degree in any field other than computer science. It embraces a diversity of experience and breadth of thought that will enable participants to impact high-tech industries like agriculture through their unique perspectives.

The University of Illinois and Tuskegee University are working together closely as part of the AIFARMS Institute to enroll Tuskegee students in the iCAN program. Tuskegee is among the largest historically Black colleges and universities in the country, and it is routinely in the top five for graduating African American agriculture students and engineers.

Professor Tiffani Williams is leading the development of the program. “iCAN is an innovative program offered by a top-five computer science program that will empower non-CS college graduates to obtain a Computing Fundamentals Certificate. Our focus is to provide a supportive environment, which integrates individualized instruction, hands-on training, and academic and career mentoring,” she said.

iCAN welcomed its first students in August 2020, and the team expects classes to grow substantially in the coming years.

“Lots of ag people know CS very well and engage very deeply,” Adve said. “But we need more CS people getting into ag.”

New, innovative degree programs are helping do exactly that.

***