DESIGNING ROBUSTLY INTELLIGENT SYNTHETIC NERVOUS SYSTEMS
Through a four-year, $325K National Science Foundation grant, professor of biology Josh Martin and his collaborator at Case Western Reserve University, Roger Quinn, are expanding the scale and sophistication of the synthetic nervous systems (SNS) that control their six-legged MantisBot, endowing it with online learning and intelligent autonomy. Martin and Quinn believe their structure-math-function approach to computational neural modeling is provably more stable and robust than typical machine learning methods, and it has already produced several walking models and robots.
To endow a machine with robust intelligence, Martin and Quinn look to the most successful phylum of animals on the planet: Arthropoda (e.g., insects, crustaceans, and spiders). Arthropods' intelligence arises from their highly distributed nervous systems, which directly enable adaptive locomotion and sophisticated decision making.
Research shows that praying mantises are masters of movement and articulation, making them attractive subjects for bio-inspired design, a long-standing principle in biology: find the animal that does something best, and learn how it does it. The praying mantis is also an ideal model organism for studying decision making. Mantises are known to adopt more aggressive hunting methods when they are hungry, suggesting that a sufficiently motivated mantis will risk being eaten in order to pursue prey. Understanding how this balance between risk and need is struck is critical for an autonomous robot operating in a real-world environment.
“We want to take how the mantis controls its body and turn that into a computer program that functions in a similar way, and use that to control the robot.”
— Josh Martin, Professor of Biology
Martin’s team implants electrodes into a mantis’s brain to monitor neural activity while the insect stalks simulated prey, identifying which cells respond to the prey. At the same time, students watch videos of the mantis as it hunts and carefully measure how each of its six legs moves. These movements are then matched with the recorded neural activity to map leg motions to brain activity. To endow legged robots with robust intelligence, Martin and Quinn propose that intelligence be added in a bottom-up fashion: highly organized low-level networks should produce complete motor patterns on their own, which can then be modulated by inputs from descending interneurons.
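To make the bottom-up idea concrete, here is a minimal illustrative sketch, not the authors' actual SNS model: a low-level pattern generator produces a complete stepping rhythm by itself, while a single descending signal merely modulates that rhythm from above. The function names, parameter ranges, and the sine-wave rhythm are all assumptions for illustration.

```python
import math

def leg_phase(t, frequency):
    """Phase (0..1) of one leg's step cycle at time t (seconds)."""
    return (t * frequency) % 1.0

def stepping_pattern(t, descending_drive):
    """Joint-angle command (degrees) for one leg at time t.

    descending_drive (0..1) stands in for input from descending
    interneurons: it scales step frequency and amplitude, but the
    rhythm itself is generated entirely by this low-level network.
    Parameter ranges below are illustrative assumptions.
    """
    frequency = 0.5 + 1.5 * descending_drive    # Hz (assumed range)
    amplitude = 10.0 + 20.0 * descending_drive  # degrees (assumed range)
    phase = leg_phase(t, frequency)
    return amplitude * math.sin(2 * math.pi * phase)

# Low drive -> slow, small steps; high drive -> fast, large steps.
slow = [stepping_pattern(t / 100, 0.1) for t in range(100)]
fast = [stepping_pattern(t / 100, 0.9) for t in range(100)]
```

The key design point mirrored here is that the descending signal never specifies individual joint angles; it only biases a self-sufficient low-level generator, so locomotion continues sensibly even if the high-level input is crude or delayed.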
In the end, Martin and Quinn’s goal is to enable their legged robot to walk robustly over rough terrain, autonomously seek rewards while avoiding harm, and react appropriately to never-before-seen scenarios.