
At the Bio-Inspired Robotics Meetup held in London this week, we heard a series of talks exploring the latest technical advancements and applications of bio-inspired robotics. Speakers covered membrane wings, insect-inspired computer vision, morphological computation, emotional AI and much more. Sasha Trapani, a student from St Mary's School, attended the event and here summarises the presentations.
Speaker: Robert Bleischwitz, PhD Candidate, Southampton University

When we ride a bicycle over bumpy terrain, we stand on the pedals and use our knees as springs and dampers to maintain stability. If the bicycle gets too far out of balance, the rider simply jumps off rather than trying to control it to the end. We change our knee and hip stiffness when we walk on a beach as opposed to inside a building. When asked to estimate the weight of a ball, we toss it up and down several times with slightly varying joint stiffness. When asked to probe soft tissue with a finger, people use a variety of finger-stiffness control strategies. Even the dead body of a trout can swim against a stream of water. These examples raise questions about how the brain and the body might work together to solve complex manipulation and locomotion problems. Though most robots still find it difficult to demonstrate human-level motor skills, they provide a good paradigm for testing hypotheses about manipulation and locomotion. The talk presented experimental results that highlight the body's role as a computing machine.

Technology Advancements: A mechanical hand designed to hold soft matter and tissue, e.g. pulsating organs, and a finger designed for locating tumours or otherwise unseen health risks.
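All of these examples trade explicit control for well-chosen passive mechanics. As a rough illustration of that trade-off, here is a minimal sketch, not the speaker's actual model: a rider's knee as a spring-damper filtering a bumpy road, with the mass, stiffness, damping and terrain values all invented for illustration.

```python
import numpy as np

def simulate_knee(k, c, m=60.0, dt=0.001, t_end=2.0):
    """Rider modelled as a point mass standing on a spring-damper 'knee'
    riding over bumpy terrain. k: stiffness (N/m), c: damping (N*s/m).
    All values are illustrative, not measured."""
    t = np.arange(0.0, t_end, dt)
    ground = 0.05 * np.sin(2.0 * np.pi * 3.0 * t)   # 5 cm bumps at 3 Hz
    ground_vel = np.gradient(ground, dt)
    x, v = 0.0, 0.0                                  # body position, velocity
    body = np.empty_like(t)
    for i in range(len(t)):
        # Passive spring-damper force between ground and body;
        # there is no explicit controller anywhere in this loop.
        a = (k * (ground[i] - x) + c * (ground_vel[i] - v)) / m
        v += a * dt
        x += v * dt
        body[i] = x
    return t, body

# A compliant knee filters out bumps that a stiff knee transmits (or even
# amplifies): the 'computation' lives in the body parameters k and c.
for label, k, c in [("stiff", 50_000.0, 200.0), ("compliant", 3_000.0, 400.0)]:
    _, body = simulate_knee(k, c)
    print(f"{label:>9} knee: peak body motion = {100.0 * np.abs(body).max():.1f} cm")
```

Changing k and c here plays the role of the rider changing knee stiffness: the problem of rejecting bumps is solved by the body's parameters rather than by a feedback loop.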
The use of visual information for navigation is a universal strategy for sighted animals, amongst whom social insects are particular experts. The general interest in studies of insect navigation is in part due to their small brains: biomimetic engineers can take inspiration from elegant and parsimonious control solutions, while biologists look for a description of the minimal cognitive requirements for complex spatial behaviours. We take an interdisciplinary approach to studying visually guided navigation, combining behavioural experiments with modelling and robotics to understand how complex behaviour can emerge from the interaction of a simple sensory system and brain with innate behaviours, all tuned to the natural habitat. In so doing, we show that an agent can robustly navigate without ever knowing where it is, without specifying when or what it should learn, and without requiring it to recognise specific objects, places, routes or maps. This leads to an algorithm in which navigation is driven by familiarity detection rather than explicit recall, with sensory data specifying actions rather than locations. Route navigation is thus recast as a search for familiar views, allowing an agent to encode routes through visually complex worlds in a single-layer neural network after a single training run (a minimal sketch of this idea follows below). We suggest that this work is a specific example of a more general idea: by considering how animals directly acquire and use task-specific information through specialised sensors, behaviours and neural circuitry, complex problems can be solved by simple algorithms, an approach that has implications for both human and robotic navigation.

Technology Advancements: Navigating simply from visual memory and familiar surroundings.
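To make the familiarity idea concrete, here is a toy sketch in Python. It substitutes a "perfect memory" of stored training views for the single-layer network the abstract describes, and a smooth synthetic function for real panoramic imagery; the world, the view and familiarity functions and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy world: the 'view' from a pose (x, y, heading) is a feature vector.
# A smooth random projection through sin() stands in for real panoramic
# imagery, so that nearby poses yield similar views.
W = 0.3 * rng.normal(size=(16, 3))

def view(x, y, heading):
    return np.sin(W @ np.array([x, y, heading]))

# Training run: a single pass along the route, storing the view at each
# step while facing along the route (heading 0, i.e. due east).
route = [(0.5 * i, 0.0) for i in range(20)]
memory = [view(rx, ry, 0.0) for rx, ry in route]

def familiarity(v):
    """Perfect-memory familiarity: negative distance to the closest
    stored training view (higher means more familiar)."""
    return -min(np.linalg.norm(v - m) for m in memory)

# Navigation by familiarity: scan candidate headings at the current
# position and step towards whichever view looks most familiar. The
# stored views are only ever used to choose an action (a heading);
# the agent never estimates where it is.
x, y = 0.2, 0.3                        # start slightly off the route
for step in range(10):
    headings = np.linspace(-np.pi, np.pi, 72, endpoint=False)
    best = max(headings, key=lambda h: familiarity(view(x, y, h)))
    x += 0.5 * np.cos(best)
    y += 0.5 * np.sin(best)
    print(f"step {step}: heading {best:+.2f} rad -> ({x:.2f}, {y:.2f})")
```

The key property is that the scan asks "which heading looks familiar?", never "where am I?", which is exactly the recasting of route navigation that the abstract describes.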
That was a great experience! #reworkbio. Great questions, very interesting people, and a cool place. Thank you @teamrework! - @Morphcomp
Thanks to all @TeamReWork for the great #ReWorkBio @WeWork Aldgate! Fantastic & stimulating #AI & #BioRobotics discussion! - @AlexPotterLDN
Future AI, Deep Learning and Machine Learning events include the Deep Learning Summit in London in September and the Machine Intelligence Summit in New York in November.
You can view all upcoming events here.