We are interested in all areas of visual processing, but our current projects focus on mechanisms for attention and tracking of moving features, and on detection of moving patterns in dim light. Despite their small size, insect brains are surprisingly sophisticated. We are often surprised by our own results, having found several properties we assumed were unique to the human brain right there inside the tiny brains of dragonflies and other insects!
An implicit assumption of our research is this: insects invest more in vision than most other animals, and they have been around for hundreds of millions of years, so the solutions their brains have evolved may be among the most efficient for solving complex visual problems. What happens to the abundance of information collected by their huge eyes? How does the brain extract the relevant features from complex scenes?
We answer these questions by studying the physiology of the brain directly, using in vivo recording with 'nano-electrodes' around 50 nm wide at the tip (1500 times thinner than a human hair). We place these deep into the brain of large insects (dragonflies, hawkmoths or hoverflies). This lets us record the physiological responses of single neurons while displaying moving objects on a high-speed computer display (at up to 360 frames per second). By varying the patterns, we work out what each neuron responds to and test ideas about how it works. We can then inject neurons with fluorescent tracers to see precisely how and where they sit in the brain, and trace their numerous connections with other neurons.
Using all of this information allows us to reconstruct visual pathways from the eye right the way through to higher-order parts of the brain. We can then construct conceptual models for the way the brain processes moving features. We also collaborate with engineers in high-tech industry and with computer scientists to develop robust artificial vision systems. We have successfully developed novel computational systems, and even silicon chips, that mimic motion processing by insect brains and their ability to track moving features. Applications for this technology include guidance systems for robots and collision-avoidance sensors for smart cars. We have also been collaborating with neural stem cell researchers to develop an interface between computer chips and living neurons that might one day allow us to develop bionic and prosthetic devices connecting our motion sensors directly to the brain of blind patients!
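To give a flavour of what an insect-inspired motion model can look like, here is a minimal sketch of the classic Hassenstein–Reichardt elementary motion detector, a standard textbook model of insect motion processing. This is purely illustrative and assumes nothing about our actual implementations: two neighbouring photoreceptor signals are correlated, one arm delayed by a simple low-pass filter, and the two mirror-symmetric arms are subtracted so that the sign of the output encodes the direction of motion.

```python
# Illustrative Hassenstein-Reichardt elementary motion detector (EMD).
# A sketch only, not our lab's implementation: filter parameters and
# the test stimulus below are arbitrary assumptions.

def low_pass(signal, alpha=0.3):
    """First-order low-pass filter, acting as the EMD's delay line."""
    out, state = [], 0.0
    for x in signal:
        state += alpha * (x - state)
        out.append(state)
    return out

def reichardt(left, right, alpha=0.3):
    """Directional motion signal from two photoreceptor traces."""
    delayed_left = low_pass(left, alpha)
    delayed_right = low_pass(right, alpha)
    # Correlate each delayed signal with the opposite undelayed one,
    # then subtract the mirror-symmetric arm.
    return [dl * r - dr * l
            for dl, r, dr, l in zip(delayed_left, right,
                                    delayed_right, left)]

# A bright bar moving left-to-right: the right receptor sees it one
# time step after the left, so the summed output is positive.
left = [0, 0, 1, 1, 1, 0, 0, 0]
right = [0, 0, 0, 1, 1, 1, 0, 0]
print(sum(reichardt(left, right)) > 0)  # rightward motion -> positive
```

Reversing the stimulus (swapping the two traces) flips the sign of the output, which is the defining property of a direction-selective detector.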