A Summary of Technology Advancing the Bio-Inspired Robotics Revolution


At the Bio-Inspired Robotics Meetup held in London this week, we heard a series of talks exploring the latest technical advancements and applications of bio-inspired robotics. Speakers covered membrane wings, insect-inspired computer vision, morphological computation, emotional AI and much more. Sasha Trapani, a student from St Mary's School, attended the event and here summarises the presentations. 

Speaker: Robert Bleischwitz, PhD Candidate, University of Southampton 

Topic: Flexible Wings and Ground-effect for High-performance Drones 

Bio-inspired flexible membrane wings, modelled on those of bats, offer performance benefits for Micro-Air-Vehicles (MAVs) by allowing dynamic fluid-structure interaction. Operating membrane-wing MAVs in ground-effect is one further option, combining the flexibility-related increases in lift and stall angle with the large efficiency gains available in close proximity to the ground. High-speed, time-synchronised wind-tunnel measurements of loads, deformations and the flow reveal the coupled physics of membrane wings that must be understood to enhance wing performance, offering the unique ability to benefit from separated-flow conditions for high turn rates, increased payload and a large performance window.

Technology Advancements: Using a membrane wing instead of a rigid wing increases the speed and stability of the MAV 

Applications for Business: Defence and Security micro-vehicles 

Key Take-away: By using membrane wings instead of rigid wings, the efficiency of the micro-vehicle remains the same while its lift and aerodynamic performance improve. Membrane vibrations are also known to enhance leading-edge vortex shedding, forming large roll-up vortices of low pressure that can contribute significantly to total lift at high incidences. It is worth remembering that membrane wings are not needed for larger air vehicles; their benefits apply specifically to micro-air-vehicles. 

Speaker: Thrishantha Nanayakkara, Senior Lecturer, King's College London

Topic: Morphological Computation of Perception and Action 

When we ride a bicycle over bumpy terrain, we stand on the pedals and use the knee as a spring and damper to maintain stability. If the bicycle goes too far out of balance, the rider simply jumps off rather than trying to control it to the end. We change our knee and hip stiffness when we walk on a beach as opposed to walking inside a building. When we are asked to estimate the weight of a ball, we toss it up and down several times with slightly varying joint stiffness. When asked to probe soft tissue with a finger, people use various finger-stiffness control strategies. Even the dead body of a trout can swim against a stream of water. These examples raise questions about how the brain and the body might work together to solve complex problems of manipulation and locomotion. Though most robots still find it difficult to demonstrate human-level motor skills, they provide a good paradigm for testing many hypotheses about manipulation and locomotion. In this talk I will discuss some experimental results that highlight the body's role as a computing machine.

Technology Advancements: A mechanical hand designed to hold soft matter and tissue, e.g. pulsating organs, and a finger designed to locate tumours and other otherwise-unseen health risks 

Applications for Business: Surgeons can use these devices for assistance while performing surgery
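The talk's bicycle example, using the knee as a combined spring and damper, can be illustrated with a minimal mass-spring-damper simulation. This is an illustrative sketch, not the speaker's model; the mass, stiffness and damping values are invented for the example.

```python
# Minimal sketch of the "knee as spring and damper" idea from the talk.
# All parameter values are illustrative, not taken from the presentation.

def simulate_knee(mass=70.0, stiffness=8000.0, damping=300.0,
                  x0=0.05, v0=0.0, dt=0.001, steps=2000):
    """Integrate m*x'' = -k*x - c*x' with semi-implicit Euler.

    x0 is an initial 5 cm displacement (the "bump"); the damper term
    dissipates the disturbance, much as a compliant knee absorbs shocks
    without any explicit control computation.
    Returns the displacement trajectory in metres.
    """
    x, v = x0, v0
    traj = []
    for _ in range(steps):
        a = (-stiffness * x - damping * v) / mass  # spring + damper forces
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

traj = simulate_knee()
print(abs(traj[-1]) < abs(traj[0]))  # disturbance decays: True
```

The point of the sketch is that stability here comes from the body's mechanical parameters (stiffness and damping), not from a controller: "computation" is done by the morphology itself.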

Speaker: Andy Philippides, Reader, CCNR, University of Sussex

Topic: Insect-Inspired Robotic Navigation 

The use of visual information for navigation is a universal strategy for sighted animals, among whom social insects are particular experts. The general interest in studies of insect navigation is in part due to their small brains; biomimetic engineers can take inspiration from elegant and parsimonious control solutions, while biologists look for a description of the minimal cognitive requirements for complex spatial behaviours. We take an interdisciplinary approach to visually guided navigation, combining behavioural experiments with modelling and robotics to understand how complex behaviour can emerge from a simple sensory system and brain interacting with innate behaviours, all tuned to the natural habitat. In so doing, we show that an agent can robustly navigate without ever knowing where it is, without specifying when or what it should learn, and without being required to recognise specific objects, places, routes or maps. This leads to an algorithm in which navigation is driven by familiarity detection rather than explicit recall, with sensory data specifying actions rather than locations. Route navigation is thus recast as a search for familiar views, allowing an agent to encode routes through visually complex worlds in a single-layer neural network after a single training run. We suggest that this work is a specific example of a more general idea: by considering how animals directly acquire and use task-specific information through specialised sensors, behaviours and neural circuitry, complex problems can be solved by simple algorithms, an approach that has implications for both human and robotic navigation.

Technology Advancements: Navigating simply from visual memory and familiar surroundings 

Applications for Business: Small rovers and UAVs, particularly where GPS is unavailable

Key Take-away: Ants, bees and wasps all live in colonies and learn to navigate by recognising familiar surroundings. For this simple task the ant or bee does not need to reason about where it is, only to sense the familiarity of the area and recall views from its memory. Robots could work in the same way, learning or being programmed to do so. This would make navigation easier for robots: they would simply recall views of their surroundings and look for familiar features.
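The familiarity-driven idea above can be sketched in a few lines: store raw views during a single training run, then at each step scan the candidate headings and move toward whichever view looks most familiar. This is an illustrative toy (a grid world with random-vector "views"), not the researchers' implementation.

```python
import numpy as np

# Toy sketch of familiarity-based route navigation: no map, no place
# recognition, no explicit recall -- just "does this view look familiar?".

HEADINGS = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # east, west, north, south

def view(pos, heading):
    """Deterministic toy 'retina': an 8-pixel view per (place, heading)."""
    seed = (pos[0] * 97 + pos[1] * 31 + HEADINGS.index(heading)) % (2**32)
    return np.random.default_rng(seed).normal(size=8)

def familiarity(v, memory):
    """Higher = closer to the nearest remembered view."""
    return -min(np.linalg.norm(v - m) for m in memory)

# Training: one eastward run, storing only the forward-facing views.
route = [(x, 0) for x in range(5)]
memory = [view(p, (1, 0)) for p in route]

# Recall: at each place, scan all headings and step toward the most
# familiar-looking view; actions come from sensory familiarity alone.
pos = (0, 0)
for _ in range(5):
    best = max(HEADINGS, key=lambda h: familiarity(view(pos, h), memory))
    pos = (pos[0] + best[0], pos[1] + best[1])
print(pos)  # the agent retraces the route eastward to (5, 0)
```

Because off-route headings produce views unlike anything in memory, the agent keeps turning toward the trained direction, which is the essence of recasting route navigation as a search for familiar views.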

Panel Speakers: 
  • Rupert Young, Director, Perceptual Robots 
  • Patrick Levy Rosenthal, CEO, Emoshape LLC
  • Helmut Hauser, Lecturer in Robotics, University of Bristol 

Topic: Future of Bio-Inspired Robotics & Industry Adoption 
Many questions were raised around the ethics of AI, in particular the need for a universal agreement on what prompts positive and negative states: 
  • Flexible robotics is crucial for building machines that can work in unpredictable environments; it is important to build robots that can adapt and survive. – Rupert Young 
  • Emotional robots can be widely used in care, hospitality and retail: stronger emotional bonds between humans and robots mean greater customer engagement and therefore higher spending. – Patrick Levy Rosenthal
  • Robotics as an industry should move away from pure engineering and towards biology, bringing more similarities between humans and robots, with the software controlling the machines rapidly advancing towards brain bio-mimicry, e.g. deep learning. – Helmut Hauser 

Our key take-away from the evening: the bio-inspired robotics field is advancing rapidly on both the hardware and software sides. Industry adoption is slower than in other fields, but this is largely due to the "hidden" biomimicry roots of certain technologies and products. Europe is definitely a hub for this field of research, and we are excited to see how the industry develops over the coming years.

You can view a social media summary of the meetup here.

"That was a great experience! #reworkbio. Great questions, very interesting people, and a cool place. Thank you @teamrework!" - @Morphcomp

"Thanks to all @TeamReWork for the great #ReWorkBio @WeWork Aldgate! Fantastic & stimulating #AI & #BioRobotics discussion!" - @AlexPotterLDN

Future AI, Deep Learning and Machine Learning events include the Deep Learning Summit in London in September and the Machine Intelligence Summit in New York in November.

You can view all upcoming events here.
