Do you like a planet that hasn’t yet melted? Do you like sushi? How about breathing? Then you’re secretly in love with plankton, tiny marine organisms that float around at the mercy of currents. They sequester carbon dioxide, provide two-thirds of the oxygen in our atmosphere, and sacrifice themselves as baby food for the young fish that eventually end up on your plate.
Yet science knows little about the complex dynamics of plankton on ocean-wide scales. So researchers are asking the machines for help, developing clever robots that use AI to examine and classify plankton, the pivotal organisms at the base of our oceanic food chain. That kind of work will be critical as Earth’s oceans continue to transform, potentially throwing ecosystems into chaos.
Take IBM’s ocean-going microscopes—which, conveniently, leverage the same technology sitting in your pocket right now. Two LEDs sit a few inches above the same kind of image sensor you’d find in a smartphone. When plankton pass over the sensor, they cast two shadows. “So by taking two pictures, one with each LED, you can get the 3-D position of all the plankton in a drop of water on the image sensor,” says Tom Zimmerman, a researcher at IBM.
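The two-shadow trick Zimmerman describes boils down to similar-triangle parallax. Here’s a minimal sketch of that geometry; the LED height and baseline values used in it are invented for illustration, not IBM’s actual hardware specs:

```python
def shadow_separation(z, led_height, led_baseline):
    """Forward model: two LEDs a baseline B apart, at height H above the
    image sensor, project a particle floating at height z into two shadows
    separated by d = B * z / (H - z) (similar triangles)."""
    return led_baseline * z / (led_height - z)


def depth_from_shadows(d, led_height, led_baseline):
    """Invert the forward model: recover the particle's height above the
    sensor from the measured shadow separation, z = H * d / (B + d)."""
    return led_height * d / (led_baseline + d)
```

The particle’s lateral position comes straight from the pixel coordinates of its shadows, so one picture per LED is enough to place every plankter in the drop in three dimensions.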
So you’ve got an image of some plankton, which could be one of two types: zooplankton are animals like fish larvae, and phytoplankton are marine algae. The old way of identifying them—there are over 4,000 species of phytoplankton alone—was to sort through images with the eyeballs of a human expert. But now researchers have artificial intelligence: IBM is working to integrate AI into the system to automatically quantify and identify the specks. The idea is to create a floating instrument that dangles hoses of different lengths so it can sample plankton concentrations at different depths. A network of these microscopes could then alert scientists to anomalies as they unfold in real time.
Take, for example, the misadventures of a zooplankton called a copepod. It eats algae, which can contain a toxin that gets it drunk. “Now, you think that would be fun for the copepods, but it isn’t, because usually copepods dart around in random directions which helps them avoid being eaten by their predators,” says Zimmerman. “But when they get drunk they go straight and fast, which makes it really easy for them to get picked off by their predators.”
So the local copepod population starts to crash, and the algae population in turn explodes, until the phytoplankton poison themselves with their own waste products. As they die and decay, they release toxins that poison other organisms and suck all the oxygen out of the water. Now you’ve got a whole lot of dead critters on your hands. “That’s a case where watching the behavior [of plankton] would indicate that there’s some imbalance,” says Zimmerman. “That’s the kind of stuff we have to monitor.”
At the moment, the system can track plankton concentrations. But it’s not just about quantifying the amount of plankton in a given area—it’s about decoding the balance between zooplankton and the phytoplankton they eat, and how the organisms behave individually and as part of a group. IBM eventually wants to track things like drunken copepod movements in real time; it’s still building a library of plankton, but hopes to have a system of devices in the wild within five years.
Scientists have to consider shape, too. A giant single-celled organism called a stentor, for example, is normally trumpet-shaped, but will ball up when exposed to too much sugar. “So behavior, shape, these are all things that with AI we can definitely track to understand if something is going wrong,” says Simone Bianco, a researcher at IBM.
IBM isn’t the first to enlist AI in the quest to better understand plankton. The excellently named FlowCytobot sticks to piers and sucks in water, which passes through a laser. Particles like plankton scatter this light, which triggers an imager.
The system judges the images based on some 250 features, like symmetry. “Then through manual classification, where the user creates an image training set of hundreds of images at a time, the neural net learns to identify those plankton without user input,” says Ivory Engstrom, director of special projects at McLane Research Laboratories, a scientific instrument company that makes the FlowCytobot.
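In miniature, that pipeline is: extract shape features from each silhouette, then let a classifier trained on expert-labeled examples assign a species. The toy sketch below uses just three hand-rolled features and a nearest-centroid rule, where the real FlowCytobot pipeline uses some 250 features and a neural net; the shapes and class names in the test are invented:

```python
import numpy as np


def extract_features(img):
    """A few toy shape features from a binary silhouette (nonempty 2-D
    array of 0s and 1s): area, left-right symmetry, and bounding-box
    fill ratio. Stand-ins for the ~250 features real systems compute."""
    area = img.sum()
    symmetry = 1.0 - np.abs(img - img[:, ::-1]).sum() / (2 * area)
    ys, xs = np.nonzero(img)
    fill = area / ((ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1))
    return np.array([area, symmetry, fill])


def train_centroids(labeled):
    """labeled: dict of class name -> list of example silhouettes.
    The mean feature vector per class is the 'trained' model."""
    return {name: np.mean([extract_features(im) for im in imgs], axis=0)
            for name, imgs in labeled.items()}


def classify(img, centroids):
    """Assign the class whose feature centroid is nearest."""
    f = extract_features(img)
    return min(centroids, key=lambda name: np.linalg.norm(f - centroids[name]))
```

Once the centroids (or, in the real system, network weights) are trained from the manually classified image sets Engstrom describes, new particles can be labeled without user input.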
The FlowCytobot alerts scientists, like those studying algal blooms in Texas, to events like toxic outbreaks, but it’s tethered in one place. Over at the Monterey Bay Aquarium Research Institute, scientists are working on a more mobile platform for monitoring plankton: the Wave Glider. Think of it like a very expensive surfboard, loaded with solar-powered instruments.
MBARI researcher Thom Maughan is developing his own microscope that’ll allow the Wave Glider to sniff out plankton. This data will be made publicly available through MBARI’s Oceanographic Decision Support System. “When we show the Wave Glider in its position out there, you’ll be able to hover your mouse over it and get some idea of the size distribution of the microorganisms that the microscope is seeing,” says Maughan. “Then you should be able to drill down and see what types of organisms are being identified.”
This kind of automation isn’t just about convenience—it’s about necessity. “It’s getting to be a rare person that can identify the plankton,” says Maughan. “Those are the old-school traditional microbiologists. Apparently there are getting to be fewer and fewer of those folks who are really intimate with that plankton world.”
With the oceans undergoing rapid transformation, science can’t afford to lose this knowledge. Plankton are all too important, and still all too mysterious. Leave it to the machines, though, to help make sense of a confounding ocean kingdom.