So says Colby neurobiologist Josh Martin, who recently received a $325,000 grant from the National Science Foundation to deepen the study of intelligence as it applies to robotics.
His subject? The praying mantis.
Martin, assistant professor of biology, and his collaborators will use the four-year grant to make robots behave like mantises, which researchers have found are masters of movement and articulation.
“We want to take how the mantis controls its body and turn that into a computer program that functions in a similar way, and use that to control the robot,” Martin said from his lab in the Arey Life Sciences Building where he breeds the sensitive creatures, often called prima donnas for the care they demand.
The ultimate goal is to develop a robot—MantisBot2—building upon Martin’s previous work at Case Western Reserve University. The original MantisBot mimics the mantis’s central nervous system with a processor (the brain) and wires (spinal cord) going out to the limbs.
Martin’s NSF grant will improve the MantisBot’s brain, giving it more nuanced control and enough intelligence to make independent decisions, like a mantis. And MantisBot2 will have more detailed legs than the first iteration, modeled on scans of actual mantis legs and produced with a 3-D printer.
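The brain-and-spinal-cord layout described above can be pictured as a central processor dispatching commands to each limb over its own wire. The sketch below is purely illustrative; the class and method names (`Brain`, `Leg`, `actuate`, `step`) are hypothetical and do not come from the MantisBot project.

```python
# Illustrative sketch, assuming a central "brain" that sends
# commands over per-limb channels. Not the actual MantisBot code.
from dataclasses import dataclass

@dataclass
class Leg:
    name: str
    angle: float = 0.0

    def actuate(self, command: float) -> None:
        # A real controller would drive servos; here we just record the angle.
        self.angle = command

class Brain:
    """Central processor issuing a command to each limb over its 'wire'."""
    def __init__(self, legs):
        self.legs = legs

    def step(self, target_angles):
        for leg, angle in zip(self.legs, target_angles):
            leg.actuate(angle)

# Six legs, as on a mantis: three per side.
legs = [Leg(n) for n in ("L1", "L2", "L3", "R1", "R2", "R3")]
brain = Brain(legs)
brain.step([10, 20, 30, 10, 20, 30])
```

The point of the design is that the legs stay simple while all decision-making lives in one place, which is what makes the brain a natural target for the upgrades the grant funds.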
Mantises may seem unlikely role models, but their adaptability and intelligence make them attractive candidates. It’s called bio-inspired design, Martin said, and it comes from an old biology principle: if you want to see how something’s done, find the animal that does it best and learn how that animal does it.
Just as a mantis identifies, stalks, and grabs its prey, a robot in a warehouse, for example, could see a box across the room, navigate to it, pick it up, and move it. Or a home robot could identify and retrieve an object dropped by an elderly person.
We know that mantises are intelligent, Martin says, because of the choices they make in two different physiological states: hungry and sated. Hungry mantises chase their food around, whereas sated mantises wait for prey to come closer. “It doesn’t chase things if it doesn’t need to eat,” Martin said. “It’s a smart thing to do.”
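That hungry-versus-sated strategy amounts to a simple state-dependent policy. Here is a toy Python version of the idea; the function name and every threshold are invented for illustration, not measured from mantises.

```python
def choose_action(hunger: float, prey_distance: float,
                  strike_range: float = 1.0) -> str:
    """State-dependent hunting policy, loosely modeled on the
    hungry-vs-sated behavior described above. All numbers are
    illustrative, not measured values."""
    if prey_distance <= strike_range:
        return "strike"   # prey is within reach either way
    if hunger > 0.5:
        return "pursue"   # hungry: actively chase the prey
    return "wait"         # sated: ambush, let the prey come closer

# A hungry mantis chases; a sated one waits.
print(choose_action(hunger=0.9, prey_distance=3.0))  # pursue
print(choose_action(hunger=0.1, prey_distance=3.0))  # wait
```

The same state variable that gates the mantis’s behavior could gate a robot’s: a warehouse robot with low battery might wait rather than roam.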
Martin’s team implants electrodes into a mantis’s brain to monitor neural activity while it stalks simulated prey, to determine which cells respond to prey. At the same time, students watch videos of the mantis as it hunts and carefully measure how each of the insect’s six legs moves. These movements are matched up with the recorded neural activity to map leg motions to brain activity.
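One simple way to sketch that matching step is to ask at what time lag a cell’s firing rate best correlates with a leg’s movement trace. This is a toy cross-correlation illustration on synthetic data, not the lab’s actual analysis pipeline.

```python
import numpy as np

def best_lag(firing_rate, joint_angle, max_lag=50):
    """Find the lag (in samples) at which a neuron's firing rate
    best correlates with a leg-movement trace. Toy analysis; real
    spike/kinematics alignment is considerably more involved."""
    fr = (firing_rate - firing_rate.mean()) / firing_rate.std()
    ja = (joint_angle - joint_angle.mean()) / joint_angle.std()
    lags = range(-max_lag, max_lag + 1)
    corrs = [np.corrcoef(np.roll(fr, lag), ja)[0, 1] for lag in lags]
    return lags[int(np.argmax(corrs))]

# Synthetic example: the leg trace follows neural activity by 10 samples.
t = np.arange(1000)
fr = np.sin(t / 30.0)
ja = np.roll(fr, 10)
print(best_lag(fr, ja))  # 10
```

A positive lag here would mean the neural activity leads the movement, which is the direction of causation the mapping is after.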
One goal of the NSF grant is to record more of these hunting sessions to create a detailed model of the brain. The grant will also allow Martin to build a virtual cave to film the mantises in three dimensions.
“We want to understand how seeing where a target is, and then going to that target, is done in the brain,” Martin said.
The resulting model and data will be shared with Colby’s Computer Science Department, which will reassemble them into a computer program, test it, and give Martin feedback about what works and what doesn’t.
“A lot of people think it’s kind of weird we’re studying mantises,” said Andrea Velazquez ’19 of Salem, Ore. “But when I tell them the actual experiments we’re doing on them, they think it’s amazing.” Velazquez, one of Martin’s research assistants, calls the opportunity to conduct this level of research as an undergraduate “an important part of Colby.”
The research matters both for building intelligent machines and for understanding what intelligence means. Martin’s team—insects and humans—aims to do both.