Similarities in biology and technology inspire novel approach in mobile robotics and prosthetics for spinal cord injuries

Author: Nina Welding


One of the opening scenes in “Chariots of Fire” cuts to a group of Olympians running along a beach with a sweeping instrumental playing in the background. Watching those athletes move together to the music, it’s easy to see the rhythm in what researchers call synchronized locomotion. This synchronization of limb movement is precisely what engineers and scientists have been trying to recreate as they develop autonomous robots and exoskeletons for treating patients with spinal cord injuries: devices that offer more responsive, more natural limb movement while remaining energy-efficient.

Since 1961, when the first modern evidence of the central pattern generator (CPG) was verified, neurobiologists have been studying its function. Most recently, they identified a close resemblance between biological locomotion gaits and the phase patterns in coupled oscillatory networks. This resemblance is what inspired the work currently being conducted at the University of Notre Dame in collaboration with the Georgia Institute of Technology.

What is the CPG? A group of neural oscillators located in the spinal cord of vertebrates and in the ganglia of invertebrates, the CPG produces the rhythmic patterns that drive locomotion, breathing and chewing — physical functions that harmonize almost involuntarily. One of the unique qualities of the CPG is that it needs only simple input signals from the higher regions of the brain; the rhythmic patterns themselves are generated by small autonomous neural networks acting locally, rather than by the whole nervous system. This is why a person automatically adjusts her gait when she feels a pebble in her shoe. The “feedback” that there is a pebble reaches the entire body, but the foot adjusts without waiting for specific additional instructions from the brain. There is essentially no time delay in the body’s motor control, and that is one of the key elements still missing in today’s autonomous robots and exoskeletons.

A team led by Suman Datta, the Stinson Professor of Nanotechnology and director of the Applications and Systems-driven Center for Energy-Efficient Integrated Nano Technologies (ASCENT) and the Center for Extremely Energy Efficient Collective Electronics (EXCEL), is using novel bio-inspired hardware to mimic the way the CPG works. His team has successfully demonstrated hardware for compact, low-power nano-oscillators that are bidirectionally coupled through capacitors. “We have created a compact and energy-efficient CPG hardware that will function as the neural oscillators do in the body,” said Datta. “Because our hardware acts as a decentralized distributed locomotion control of a robot or exoskeleton, each joint or limb can be controlled locally by the CPG network. Then using feedback, the entire robot or exoskeleton can adapt to its surrounding environment.”
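The decentralized phase-locking behavior described here can be illustrated with a minimal Kuramoto-style simulation — a standard textbook model of coupled oscillators, not the team’s actual nano-oscillator hardware. All parameters (coupling strength, frequency, initial phases) are arbitrary choices for illustration; each oscillator talks only to its nearest neighbors on a ring, yet the whole network settles into a common rhythm.

```python
# Illustrative sketch (Kuramoto model, NOT the Notre Dame hardware):
# four oscillators stand in for four limbs, each coupled only to its
# two ring neighbours -- a local, decentralized interaction.
import math

N = 4                  # number of oscillators ("limbs")
K = 1.0                # coupling strength (arbitrary)
omega = 2 * math.pi    # common natural frequency, 1 Hz
dt = 0.001             # integration step, seconds
phases = [0.0, 0.3, 0.1, 0.2]  # arbitrary initial phases (radians)

for _ in range(20000):  # simulate 20 seconds
    new = []
    for i in range(N):
        # purely local coupling: only the two ring neighbours are seen
        left = phases[(i - 1) % N]
        right = phases[(i + 1) % N]
        dtheta = omega + K * (math.sin(left - phases[i])
                              + math.sin(right - phases[i]))
        new.append(phases[i] + dtheta * dt)
    phases = new

# relative phases settle into a fixed pattern -- here, in-phase lock
rel = [(p - phases[0]) % (2 * math.pi) for p in phases]
print([round(r, 3) for r in rel])
```

With positive coupling the oscillators lock in phase; other coupling signs and topologies produce fixed phase offsets instead, which is how such networks can encode different gaits.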

According to Sourav Dutta, a postdoctoral research associate in the Department of Electrical Engineering, one advantage of the CPG feedback mechanism is that a robot, for example, could compensate if one of its limbs were damaged. It would still be able to walk with only a slight modification in walking style, and the modification would happen in real time, just as it would for a biological organism.
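The same toy coupled-oscillator model (again the textbook Kuramoto model, not the team’s hardware, with arbitrary parameters) illustrates this fault tolerance: if one oscillator is disconnected mid-run, the survivors re-lock on their own through their local couplings, with no central controller intervening.

```python
# Fault-tolerance sketch (Kuramoto model, NOT the Notre Dame hardware):
# one "limb" is disconnected mid-run; the remaining locally coupled
# oscillators re-synchronize by themselves.
import math

K, omega, dt = 1.0, 2 * math.pi, 0.001
phases = [0.0, 0.3, 0.1, 0.2]
active = [True, True, True, True]

def step(phases, active):
    new = list(phases)
    n = len(phases)
    for i in range(n):
        if not active[i]:
            continue  # a damaged unit stops evolving
        coupling = 0.0
        for j in ((i - 1) % n, (i + 1) % n):
            if active[j]:  # couple only to live neighbours
                coupling += math.sin(phases[j] - phases[i])
        new[i] = phases[i] + (omega + K * coupling) * dt
    return new

for t in range(40000):          # 40 seconds total
    if t == 20000:
        active[2] = False       # "damage" one limb at t = 20 s
        phases[1] += 0.5        # kick a neighbour so re-locking is visible
    phases = step(phases, active)

# the surviving oscillators relax back to a common rhythm
alive = [p for p, a in zip(phases, active) if a]
rel = [(p - alive[0]) % (2 * math.pi) for p in alive]
print([round(r, 3) for r in rel])
```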

The big advantages the Notre Dame researchers have built into their CPG hardware are compactness and energy efficiency. Acting like neurons and synapses, the team’s nano-scale oscillators operate at low current and low voltage (approximately 1 volt) and are connected through extremely scaled transistors. The oscillators are coupled with simple capacitors, which also saves energy. Previous work in this field relied on bulky “neurons and synapses” and numerous transistors rather than nano-oscillators.

“Our successful demonstration of the Notre Dame CPG hardware was the first step,” said Dutta. “However, locomotion control (action) works hand-in-hand with perception (how information is extracted from the environment) and decision (how machines, in this case, learn what logical action to perform). Along these lines, we are now exploring how to interface the CPG hardware with visual, tactile and other sensors and to perform real-time learning and actuation using feedback signals.”

In addition to Datta and Dutta, team members include Wriddhi Chakraborty, Jorge Gomez, Benjamin Grisafe, Matthew Jerry and Abhishek Khanna from Notre Dame, and Abhinav Parihar and Arijit Raychowdhury from the Georgia Institute of Technology.

The project was supported by the National Science Foundation and the Nanoelectronics Research Corporation, a subsidiary of the Semiconductor Research Corporation, through EXCEL. For more information, visit

Originally published by Nina Welding on Oct. 1.