The ‘Green Brain Project’ combines computational neuroscience modelling, learning and decision theory, modern parallel computing methods, and robotics with data from state-of-the-art neurobiological experiments on cognition in the honeybee Apis mellifera.
These various methodologies are used to build and deploy a modular model of the honeybee brain describing detection, classification, and learning in the olfactory and optic pathways as well as multi-sensory integration across these sensory modalities.
Simply put, the goal of the Green Brain Project is to create a robot that thinks, senses, and acts like a honeybee! This is achieved by creating neuromimetic models and embodying them in flying robots.
Better Understanding of Cognitive Functions in Animals
It is well established that the honeybee Apis mellifera exhibits surprisingly advanced cognitive behaviours despite the relative simplicity of its brain compared to vertebrates: the honeybee brain contains on the order of 10^6 neurons, whereas even rats and mice have brains on the order of 10^8 neurons. This greatly reduced scale, together with the experimental accessibility of the honeybee brain, makes thorough neurobiological understanding and subsequent biomimetic exploitation far more practical than with even the simplest vertebrate brain.
Modelling of the honeybee brain will focus on three brain regions:
- The system for olfactory sensing
- The system for visual sensing
- The mushroom bodies for multi-modal sensory integration
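The modularity described above can be illustrated with a toy sketch: separate populations for the olfactory and optic pathways whose outputs converge on a mushroom-body stage. This is purely an assumption-laden illustration of the architecture, not the project's actual model; the class names, neuron counts, and the use of simple rate-coded units are all invented for the example.

```python
import math
import random

class RateNeuronPopulation:
    """A toy population of rate-coded neurons: each neuron outputs a
    logistic squashing of a random weighted sum of its inputs
    (a deliberate simplification, not the project's neuron model)."""

    def __init__(self, n_inputs, n_neurons, seed=0):
        rng = random.Random(seed)
        self.weights = [[rng.uniform(-1.0, 1.0) for _ in range(n_inputs)]
                        for _ in range(n_neurons)]

    def respond(self, stimulus):
        # One output value in (0, 1) per neuron in the population.
        return [1.0 / (1.0 + math.exp(-sum(w * s for w, s in zip(row, stimulus))))
                for row in self.weights]

class BeeBrainModel:
    """Three modules mirroring the regions listed above: an olfactory
    pathway, an optic pathway, and a mushroom-body stage that
    integrates the two sensory codes."""

    def __init__(self):
        self.olfactory = RateNeuronPopulation(n_inputs=4, n_neurons=8, seed=1)
        self.optic = RateNeuronPopulation(n_inputs=6, n_neurons=8, seed=2)
        # The mushroom-body stage sees both 8-neuron sensory codes.
        self.mushroom_body = RateNeuronPopulation(n_inputs=16, n_neurons=5, seed=3)

    def step(self, odour, image):
        odour_code = self.olfactory.respond(odour)
        visual_code = self.optic.respond(image)
        # Multi-modal integration: concatenate the two sensory codes.
        return self.mushroom_body.respond(odour_code + visual_code)

model = BeeBrainModel()
decision = model.step(odour=[0.2, 0.9, 0.1, 0.0],
                      image=[0.5, 0.5, 0.1, 0.9, 0.3, 0.2])
print(decision)
```

The point of the sketch is the interface, not the neurons: each sensory module can be developed and replaced independently as long as it emits a code the integration stage can consume.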
These systems were chosen because they support complex cognitive behaviours that are essential to autonomous agents but are not yet well understood.
As the use of UAVs continues to grow, it becomes necessary to allocate and control them effectively. One of the biggest challenges in the field is developing algorithms that not only perform well but are also adaptive and robust. This is essential for UAVs that must operate in a real world that is ever-changing and rarely certain. By extending existing models and techniques drawn from honeybee research, we plan to demonstrate sophisticated vision-based navigation and cognitive functions on a flying robotic platform.