
Insects to Automation: Learning robust, efficient autonomy from nature


Most of the talk these days around how to improve AI systems is really about data, and how to get as much of it as possible.

The question usually asked is: how can the creators of large language models or autonomous driving systems gather enough training data to make their systems smarter and more adaptable to different situations?

It’s the same in robotics automation, where collecting enough training data to teach systems to react to any scenario is often the name of the game.

This can be slow and painstaking work, and it requires a lot of computing power to process all of the data the system learns from.

A better solution comes from the simplicity and efficiency of insects.

Consider how migrating butterflies travel thousands of miles to specific locations using their inherent navigation capabilities; how bees and termites build complex structures; how ants and bees live in sophisticated social hierarchies; or how swarming insects manage to avoid colliding with one another.

They can be incredible at learning, too.

A recent study found that ants can learn a complex route after experiencing it just once. This rapid learning seems to be helped by inbuilt behaviors such as the learning flights we see in bees, where they loop around places of importance. Instead of flying around and creating loads of data at random, these behaviors make the insects far more efficient learners.

Insects behave in incredibly intelligent ways despite having very simple brain structures.

Breakthroughs in neuroscience in recent years mean it’s possible to map the entire brain and nervous system of an insect in incredible detail, something that is just not possible yet in larger animals that are usually considered to be ‘smarter’.

If we want to develop robots that can be automated in complex ways while using less power and less data, insects are a great place to look for inspiration.

Computer vision, inspired by insects

Think about how hard it is to swat a fly that just landed so gracefully on your favorite cake. Insects’ incredible spatial awareness allows them to navigate over long ranges and across varying conditions, avoid collisions, and identify and interact with complex objects.

Look deeply into the way insects process what they see, and it’s clear they have developed a ‘shortcut’ that lets them process information from their eyes more quickly and efficiently than humans or other animals.

And because they process information so quickly, they can make decisions faster, which is what helps insects in a swarm avoid one another.

Understanding this shortcut in insects’ brains can also help us produce far more efficient depth perception.

It’s important for autonomous robots to know how far away things are, which has led to the increasing popularity of expensive and bulky technologies like LiDAR.

But if we understand an insect’s ‘visual flow’, it opens the door to using low-cost cameras instead of having to install lasers in robots to help them keep their distance from other objects.
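
To give a flavor of the idea, here is a minimal Python sketch of depth-from-flow, assuming a camera that translates sideways at a known speed with a known focal length. The function name, the OpenCV tuning values, and parameters like focal_px and speed_mps are purely illustrative, not a description of Opteran's implementation:

```python
import cv2
import numpy as np

def depth_from_flow(prev_gray, next_gray, focal_px, speed_mps, dt):
    """Rough per-pixel depth from how fast the world slides across the image.

    Assumes pure sideways translation at a known speed between frames
    (no rotation), so depth is roughly Z = f * v * dt / |flow|.
    """
    # Dense optical flow between two grayscale frames (Farneback method).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)      # pixels moved per frame
    magnitude = np.maximum(magnitude, 1e-3)       # avoid division by zero
    return focal_px * speed_mps * dt / magnitude  # approximate depth in meters
```

Pixels that race across the image are close; pixels that barely move are far away – exactly the cue that lets a fly judge distance without a single laser.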

These cameras don’t need to be incredibly high-resolution, because neither are insects’ eyes. They can’t see in great detail, but they make up for it with a very wide field of view.

To understand why this is important, imagine looking at a tree in the summer. If you can see it in detail, you’ll know that it’s covered in green leaves. But look at it in winter and it might have no leaves at all, be covered in snow, and appear to be a completely different shape.

To an insect, however, it might just look like a big blob in the same place no matter the time of year.

If you’re a human wanting to have a rich understanding of the world around you, it’s good to know how different the tree looks. But if you want a robot simply to recognize where it is based on what it sees, the insect’s view can be far more clear and useful, avoiding the need to process irrelevant information.
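
As a rough illustration of how far things can be stripped back, here is a small Python sketch that turns a wide-angle camera frame into a tiny 'blob view' and asks whether two views look like the same place. The thumbnail size and matching threshold are arbitrary choices for illustration, not Opteran's method:

```python
import cv2
import numpy as np

def place_signature(frame_bgr, size=(32, 8)):
    """Shrink a wide-angle camera frame down to a tiny, blurry thumbnail.

    Leaves, snow and other fine detail vanish; the coarse pattern of
    light and dark that identifies the place is what survives.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    tiny = cv2.resize(gray, size, interpolation=cv2.INTER_AREA)
    return tiny.astype(np.float32) / 255.0

def looks_like_same_place(sig_a, sig_b, threshold=0.1):
    """Compare two signatures by their mean absolute pixel difference."""
    return np.mean(np.abs(sig_a - sig_b)) < threshold
```

A 32-by-8 signature is just 256 numbers, so thousands of remembered places can be stored and matched on a tiny processor rather than in a data center.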

Similarly, changing light conditions can make a location look different as the day goes on. Build computer vision to think too much like a human and you might find yourself creating complex computations to account for all the different ways one place can look.

But if you strip things back to the simplicity of an insect brain, you can find ways to make locations look the same to a robot, no matter what the current light conditions are.
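
One simple way to do that – again only a sketch, not a description of how Opteran's system works – is to throw away absolute brightness and keep just the relative pattern of light and dark within each small patch of the image, so the same place produces roughly the same picture at noon or at dusk:

```python
import numpy as np

def light_invariant_view(gray, patch=8):
    """Normalize each small patch to zero mean and unit contrast.

    Absolute brightness (noon vs. dusk) is discarded; only the relative
    pattern within each patch is kept, so lighting changes largely cancel out.
    """
    img = gray.astype(np.float32)
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(0, h - h % patch, patch):
        for x in range(0, w - w % patch, patch):
            block = img[y:y + patch, x:x + patch]
            out[y:y + patch, x:x + patch] = (block - block.mean()) / (block.std() + 1e-6)
    return out
```

Patch normalization of this kind is a long-standing trick in visual place recognition; the point here is simply how little computation is needed once you stop trying to model every lighting condition explicitly.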

 

Why insect-inspired automation works so well

By drawing inspiration from insects, automation can also use far less data and processing power than alternative approaches.

Current warehouse robots typically rely on laser scanners and high-resolution cameras to map their surroundings, which requires a lot of processing and storage – often done offline in a data center. In contrast, insects navigate over large distances using a brain smaller than a pinhead.

The secret to that efficiency is the combination of sensing and processing that has evolved in nature. Rather than capturing as much data as possible, evolution has tuned animal visual systems to filter out all but the most useful information. This means that deeper brain areas don’t have to wade through masses of data and can instead focus on their own task – making smart decisions. It’s an approach that is far more efficient than today’s data-centric solutions, and it’s why insects still outperform today’s robots.

The key lesson is that collecting the right data, rather than more data, is far simpler and more efficient.

 

Unlocking the future of automation

From walking, to eating, to work and play, movement plays a huge part in almost everything humans do. When automated movement is easy to achieve at low cost, the ways that movement can be deployed to solve problems and open up new opportunities could be near limitless.

At Opteran, that’s the future we’re working towards as we continue to draw inspiration from some of nature’s greatest biological feats to push forward what robots are capable of.

 

Dr Michael Mangan
VP of Research, Opteran

Further reading

News, 04.12.23: Dr Mike Mangan has been awarded a prestigious UKRI Future Leaders Fellowship to advance our research into Natural Intelligence brain algorithms.