NASA has announced that a team of researchers from its Jet Propulsion Laboratory (JPL) and other institutions recently visited Monterey Bay, California, as part of ongoing research into developing artificial intelligence for submersible drones. In addition to benefiting our understanding of Earth’s marine environments, the team hopes this artificial intelligence will someday be used to explore the oceans believed to exist on moons like Europa. If confirmed, these oceans are thought to be some of the most likely places to host life in the outer solar system.
A fleet of six coordinated drones was used to study Monterey Bay. The fleet travelled long distances, seeking out changes in temperature and salinity. To plot their routes, forecasts of these ocean features were sent to the drones from shore. The drones also sensed how the ocean actively changed around them. A major goal for the research team is to develop artificial intelligence that seamlessly integrates both kinds of data.
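Combining a shore-based forecast with what a drone senses in place could, in the simplest case, be a weighted blend of the two values. This is a hypothetical sketch only: the function name, the weighting scheme, and the numbers are illustrative assumptions, not the research team's actual method.

```python
# Hypothetical sketch: blending a shore-based forecast with an onboard
# sensor reading. The weighting scheme is an illustrative assumption,
# not the team's actual data-fusion approach.

def blend_estimate(forecast, measured, trust=0.7):
    """Combine a forecast value with an in-situ measurement.

    trust: weight given to the fresh onboard measurement (0 to 1).
    """
    return trust * measured + (1.0 - trust) * forecast

# Example: the shore forecast predicts 14.0 C, but the drone measures 15.2 C.
temperature = blend_estimate(14.0, 15.2)
```

A real system would weight the measurement by its recency and the forecast by its skill, but the core idea, letting fresh local data override an aging prediction, is the same.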
“Autonomous drones are important for ocean research, but today’s drones don’t make decisions on the fly,” said Steve Chien, one of the research team’s members. Chien leads the Artificial Intelligence Group at NASA’s Jet Propulsion Laboratory, Pasadena, California. “In order to study unpredictable ocean phenomena, we need to develop submersibles that can navigate and make decisions on their own, and in real-time. Doing so would help us understand our own oceans — and maybe those on other planets.”
Other research members were from Caltech in Pasadena; the Monterey Bay Aquarium Research Institute, Moss Landing, California; Woods Hole Oceanographic Institution, Woods Hole, Massachusetts; and Remote Sensing Solutions, Barnstable, Massachusetts.
If successful, this project could lead to submersibles that can plot their own course as they go, based on what they detect in the water around them. That could change how data is collected, while also developing the kind of autonomy needed for planetary exploration, said Andrew Thompson, assistant professor of environmental science and engineering at Caltech.
“Our goal is to remove the human effort from the day-to-day piloting of these robots and focus that time on analyzing the data collected,” Thompson said. “We want to give these submersibles the freedom and ability to collect useful information without putting a hand in to correct them.”
At the smallest levels, marine life exists as “biocommunities.” Nutrients in the water are needed to support plankton; small fish follow the plankton; big fish follow them. Find the nutrients, and you can follow the breadcrumb trail to other marine life.
This is easier said than done. Those nutrients are swept around by ocean currents, and can change direction suddenly. Life under the sea is constantly shifting in every direction, and at varying scales of size.
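One simple way to picture "following the breadcrumb trail" is greedy gradient-following: the drone samples the nutrient concentration at nearby points and steps toward the richest reading. The sketch below is a toy illustration under that assumption; the field function, step size, and stopping rule are all hypothetical, not the project's actual algorithm.

```python
# Hypothetical sketch of gradient-following: a drone samples a scalar field
# (e.g., nutrient concentration) around its position and repeatedly steps
# toward the highest nearby reading. All names and values are illustrative.

def follow_gradient(sample, start, step=1.0, iters=20):
    """Greedy hill-climb on a 2-D field given a sampling function."""
    x, y = start
    for _ in range(iters):
        # Sample the four neighboring points one step away.
        candidates = [(x + step, y), (x - step, y), (x, y + step), (x, y - step)]
        best = max(candidates, key=lambda p: sample(*p))
        if sample(*best) <= sample(x, y):
            break  # local maximum: no neighbor is richer
        x, y = best
    return x, y

# Toy field: a single nutrient patch centered at (5, -3).
patch = lambda x, y: -((x - 5) ** 2 + (y + 3) ** 2)
position = follow_gradient(patch, start=(0.0, 0.0))  # climbs toward (5, -3)
```

This toy works only because the field is static; as the article notes, real nutrient patches are swept around by currents, which is exactly why the shifting, time-varying case demands smarter autonomy.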
“It’s all three dimensions plus time,” Chien said about the challenges of tracking ocean features. “Phenomena like algal blooms are hundreds of kilometers across. But small things like dinoflagellate clouds are just dozens of meters across.”
It might be easy for a fish to track these features, but it’s nearly impossible for an unintelligent robot.
“Truly autonomous fleets of robots have been a holy grail in oceanography for decades,” Thompson said. “Bringing JPL’s exploration and AI experience to this problem should allow us to lay the groundwork for carrying out similar activities in more challenging regions, like Earth’s polar regions and even oceans on other planets.”
The recent field work at Monterey Bay was funded by JPL and Caltech’s Keck Institute for Space Studies (KISS). Additional research is planned in the spring of 2017.