Embodiment

The Green Brain robots and lab were used for the embodiment of animal behaviour and for the development of biomimetic algorithms for the navigation and control of UAVs.


Watch this video to learn more about our lab and our robots:

Work

The GB team set up the lab, built and tested our robots, and demonstrated sophisticated vision-based navigation and cognitive functions through embodiment. Projects included navigation using the Green Brain models, flight tests of our models on the BeeBot quadcopter, development of a Ground Control Station, initial testing with the chemosensors, and more.


Robots

As platforms for the embodiment of the Green Brain, our robot fleet comprised three quadcopters and one ground robot, each with unique abilities and configurations. All of the robots were used for testing and further development of the models, and all of these platforms are supported by large open-source development communities.


At this stage, we had three different quadcopters: a Parrot AR Drone, a Parrot Bebop Drone, and the BeeBot, a custom-built quadcopter.

The BeeBot, our primary robot for Green Brain development, was built from an AeroQuad frame, an ArduPilot autopilot, and custom sensors. Its reasonably small frame made it well suited to flying indoors while still carrying the sensor payload the project required. Its unique configuration of sensors included two wide-angle cameras and chemosensors, to mimic honeybee vision and olfaction.

The Parrot drones were pre-configured, ready-to-fly quadcopters used for initial testing and demonstration of our algorithms and the Green Brain model. They were also used by our undergraduates for their senior research projects.


Our ground robot, the BoeBot, was a simple two-wheeled robot controlled by an Arduino microcontroller. It carried chemosensors as its primary sensors and was used for initial testing and the development of algorithms using them.
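To give a flavour of this kind of early chemosensor testing, here is a minimal, hypothetical Arduino sketch: it reads an analogue gas-sensor value and stops the wheels when the reading crosses a threshold. The pin assignments, the sensor, the threshold, and the servo-drive wheels are assumptions for illustration, not the BoeBot's actual configuration.

```cpp
// Hypothetical sketch, not the project's actual code: read an analogue
// chemosensor and stop the two drive servos when an odour is detected.
#include <Servo.h>

Servo leftWheel;            // continuous-rotation servos assumed for the two wheels
Servo rightWheel;

const int CHEMO_PIN = A0;   // analogue output of a gas sensor (assumed wiring)
const int THRESHOLD = 400;  // raw ADC value treated as "odour detected" (assumed)

void setup() {
  Serial.begin(9600);
  leftWheel.attach(12);     // assumed servo pins
  rightWheel.attach(13);
}

void loop() {
  int reading = analogRead(CHEMO_PIN);   // 0-1023 proxy for odour concentration
  Serial.println(reading);               // log the reading for later analysis

  if (reading > THRESHOLD) {
    // Odour detected: stop so the reading can be recorded against position.
    leftWheel.write(90);
    rightWheel.write(90);
  } else {
    // Otherwise keep driving forward slowly.
    leftWheel.write(100);   // slightly above the stop point = forward
    rightWheel.write(80);   // mirrored because the opposite servo faces the other way
  }
  delay(100);               // roughly 10 Hz sensing loop
}
```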

This great robot was easy to put together, with no soldering involved, which made it perfect for hobbyists and beginner roboticists. For this reason, you would often see the BoeBot at our outreach and public engagement events.


The Green Brain lab

We used a dedicated flying room in the Kroto Research Institute at the University of Sheffield.

All experiments were set up and managed from our off-board Ground Control Station. While the honeybee brain “only” has about 1 million neurons, simulating it fast enough to control a robot in real time still took considerable computing power, so all brain computation was also done off-board, on NVIDIA Tesla GPUs.
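As a rough illustration only, the sketch below shows the shape of such an off-board control loop: read the latest sensor data, step the brain model, send a command back to the robot, and keep to a fixed real-time cadence. The data structures, the 50 Hz rate, and the placeholder functions are assumptions; in the real system the model ran on the GPUs and the sensor and command traffic went over the network to the robot.

```cpp
// Illustrative off-board control loop, not the project's actual code.
#include <chrono>
#include <thread>
#include <iostream>

struct SensorFrame { double odour; double optic_flow; };   // assumed inputs
struct Command     { double yaw_rate; double thrust; };    // assumed outputs

// Placeholder for grabbing the latest telemetry from the robot.
SensorFrame read_sensors() { return {0.2, 0.05}; }

// Placeholder for one step of the honeybee brain model (GPU-side in reality).
Command step_brain_model(const SensorFrame& s) {
  return {0.5 * s.optic_flow, 0.4 + 0.1 * s.odour};
}

// Placeholder for sending the command back to the autopilot.
void send_command(const Command& c) {
  std::cout << "yaw_rate=" << c.yaw_rate << " thrust=" << c.thrust << "\n";
}

int main() {
  using clock = std::chrono::steady_clock;
  const auto period = std::chrono::milliseconds(20);   // 50 Hz control rate (assumed)

  auto next = clock::now();
  for (int i = 0; i < 250; ++i) {                      // ~5 s of closed-loop control
    Command cmd = step_brain_model(read_sensors());
    send_command(cmd);

    next += period;                                    // hold a fixed real-time cadence
    std::this_thread::sleep_until(next);
  }
}
```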

We also installed a Vicon motion tracking system in the lab, both to validate our algorithms and models and as a safety measure.
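One way such tracking can serve as a safety measure is a simple geofence: if the tracked position leaves the safe flying volume, the operator or the Ground Control Station can command a hover or landing. The sketch below is purely illustrative, with made-up room bounds, and is not the project's actual safety logic.

```cpp
// Illustrative geofence check on a motion-capture position (assumed values).
#include <iostream>

struct Position { double x, y, z; };   // metres, in the tracking system's frame

// Room bounds are invented numbers for illustration only.
bool inside_safe_volume(const Position& p) {
  return p.x > -2.0 && p.x < 2.0 &&
         p.y > -2.0 && p.y < 2.0 &&
         p.z >  0.0 && p.z < 2.5;
}

int main() {
  Position tracked{1.2, -0.4, 1.1};    // would come from the tracking system
  if (!inside_safe_volume(tracked)) {
    std::cout << "Out of bounds: command hover/land\n";
  } else {
    std::cout << "Within safe volume\n";
  }
}
```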

You can get an inside look at the lab by visiting the ‘Green Brain Lab’ photo album on our Facebook page.