
When researchers need to collect data about a work site’s soil quality, they can run into several problems. Using excavators to access sites can be challenging and expensive, and excavators may have only a limited ability to collect samples. Once an excavator does return a scoop of soil, scientists must still collect soil samples by hand for shipping and analysis at an off-site commercial lab. In short, their work is often dirty, dangerous, and expensive.

But what if they could run a googly-eyed robot the size of a small dog along the same route? It navigates to the site on its own, its wheels helping it pass over stones and other obstacles. Once it takes a sample, its sensor can analyze the sample in real time, allowing its artificial intelligence-enhanced algorithms to determine the best location for the next sample, so it takes only as many as it needs. When it is done, it navigates back to the researchers. This time, it sought out and measured the presence of chloride in the soil to test for salt concentrations, though in the future, it will be able to sense other substances.

This robot is part of a multi-year project by an interdisciplinary team of Carnegie Mellon researchers, led by Greg Lowry and Aaron Johnson and in collaboration with Chevron, seeking to understand how robotics and artificial intelligence (AI) can help engineers address environmental challenges.

The team has effectively created a new field at the junction of environmental engineering and robotics. Engineers have been increasingly applying robotics to agricultural research and to monitoring different environments, such as through underwater and aerial imaging. However, Johnson, associate professor of mechanical engineering, said that this team’s focus, which specifically applies robotics to monitoring affected soils, is unique.

“This is often where the exciting research happens, in this intersection of two fields,” he said.

This robot, and the research published in the Journal of Environmental Management, stand as a “proof of concept” for the project as a whole, showing that these fields really are complementary. Specifically, the team has sought to answer questions such as: how can we go from a certain study objective to a specific assisting robot? Is it even possible to take a sensor out to a site with a robot?


The authors detail a method of designing robots so that other teams can follow in their footsteps (or wheel-tracks) to meet their own objectives. Johnson and Lowry suggest using four design components to guide decisions: sensing, which quantifies a substance in a sample; sampling, which extracts and processes samples; mobility, which moves the robot around the site; and autonomy, which determines relevant locations and how best to navigate to them.
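The four design components can be pictured as independent modules that a team mixes and matches for its own study objective. The sketch below is purely illustrative; the class and field names are assumptions for exposition, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class RobotDesign:
    """One hypothetical way to organize a soil-robot design around the
    paper's four components. Field names are assumptions, not the
    authors' code."""
    sensing: str    # how a substance is quantified in a sample
    sampling: str   # how soil is extracted and processed
    mobility: str   # how the robot moves around the site
    autonomy: str   # how it picks locations and navigates to them

# A design like the chloride-sensing robot described above might read:
design = RobotDesign(
    sensing="chloride sensor with real-time analysis",
    sampling="on-board soil scoop and processing",
    mobility="wheeled chassis for stones and obstacles",
    autonomy="adaptive sampling planner",
)
print(design.autonomy)
```

Framing the design as four separable decisions is what lets other teams swap in a different sensor or mobility platform while reusing the rest.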

“Other parts of the project include looking at the exploration algorithm to improve the autonomy, trying out different sensors, different locations, and applying the same strategies to different problems,” Johnson said.

The project has taken several years of careful research, as well as navigating the limitations of a worldwide pandemic, to get to this stage. It has involved a number of personnel with a wide range of experience, from undergraduates to postdoctoral researchers. The team has also partnered with HEBI Robotics, which has developed a “ruggedized” version of the lab-made robot that was featured in the paper.

“We are now looking to move beyond this proof-of-concept stage into more concrete objectives and demonstrations,” said Lowry, professor of civil and environmental engineering. Currently, the team is focusing on the autonomous systems and algorithms component: the robot determines the best location for its next sample by analyzing the results it has gathered so far. This process is called adaptive sampling.
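To make the idea of adaptive sampling concrete, here is a minimal sketch of one common greedy strategy: estimate the field from the samples collected so far, then pick the candidate location that balances a high estimated concentration against distance from everything already measured. This is an illustrative toy, not the team's actual algorithm; the inverse-distance interpolation and the `beta` exploration weight are assumptions.

```python
import math

def estimate(point, samples):
    """Inverse-distance-weighted estimate of the concentration at
    `point` from the (location, value) pairs collected so far."""
    num = den = 0.0
    for loc, value in samples:
        d = math.dist(point, loc)
        if d < 1e-9:          # asking at a measured point: return it
            return value
        w = 1.0 / d**2
        num += w * value
        den += w
    return num / den

def next_sample_location(samples, candidates, beta=1.0):
    """Greedy adaptive sampling: score each candidate by its estimated
    concentration plus an exploration bonus that grows with distance
    from all previous samples, then take the highest-scoring one."""
    def score(pt):
        nearest = min(math.dist(pt, loc) for loc, _ in samples)
        return estimate(pt, samples) + beta * nearest
    return max(candidates, key=score)

# Toy field: three chloride readings taken so far on a 5x5 site grid.
samples = [((0, 0), 0.2), ((4, 0), 0.9), ((0, 4), 0.1)]
grid = [(x, y) for x in range(5) for y in range(5)]
print(next_sample_location(samples, grid))
```

Each new reading feeds back into `samples`, so the planner's picks adapt as the picture of the site sharpens, which is why the robot needs only as many samples as the site demands.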

Vivek Thangavelu, a postdoctoral researcher in Johnson’s laboratory, is working on adjusting the adaptive sampling networks to handle different situations. Currently, the robot works best when it is mapping out a single, distributed area that is affected, but it is not as effective when substances are concentrated in multiple smaller “hotspots,” a scenario sometimes found at affected soil sites. In practice, before engineers begin their study, they do not know how the substance of interest is distributed at the site, so the robot must be able to deal with all manner of environments and parameters.

“How do you categorize those kinds of environments?” Thangavelu said. “How do you say, ‘Okay, we have an algorithm, but when does it work and when does it not work?’”

As the project develops, the team plans to apply its principles and advances to other research goals, such as helping farmers determine how much fertilizer to use in a field and where to apply it, and identifying invasive plant species.

Thangavelu enjoys the applicability of the work, as well as its relevance to current environmental issues. “I enjoy working in a field related to the environment. We have to think about problems with respect to how to deal with the effects of climate change.”