Visual neurophysiology

The insect brain as a model for understanding physiological mechanisms of motion processing
Some flying insects have a visual system that accounts for more than 20% of their body weight: they invest more in vision than any other animal. This investment provides sophisticated abilities to discriminate moving objects and features. Combined with the accessibility of their brains for physiological recordings, it makes insects an ideal model for understanding problems common to many visual systems, natural and man-made.

We are interested in all areas of visual processing, but our current projects focus on mechanisms for attention and tracking of moving features, and on detection of moving patterns in dim light. Despite their small size, insect brains are remarkably sophisticated. We are often surprised by our own results and have found several properties that we had assumed were unique to the human brain right there inside the tiny brains of dragonflies and other insects!

A picture of David O'Carroll in the lab with a dragonfly

An implicit assumption of our research is this: insects invest more in vision than other animals, and they have been around for hundreds of millions of years, so the solutions their brains have evolved may be among the most efficient available for solving complex visual problems. What happens to the abundance of information collected by their huge eyes? How does the brain extract the relevant features from complex scenes?

We answer these questions by studying the physiology of the brain directly, using in-vivo recordings with 'nano-electrodes' around 50 nm wide at the tip (1500 times thinner than a human hair). We place these deep into the brain of large insects (dragonflies, hawkmoths or hoverflies). This lets us record the physiological responses of single neurons while displaying moving objects on a high-speed computer display (at up to 360 frames per second). By varying the patterns, we figure out what each neuron responds to and test ideas about how it works. We can then inject neurons with fluorescent tracers to see precisely how and where they sit in the brain, and trace their numerous connections with other neurons.

Small-field neurons in the medulla of the dragonfly brain labelled with fluorescent dextran
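
To give a concrete flavour of what 'varying the patterns' involves, here is a minimal Python sketch that generates frames of a drifting sinusoidal grating whose direction, speed and spatial frequency can be swept while a neuron's spike rate is recorded. This is an illustrative example only: the function drifting_grating, the hypothetical record_spike_rate call and all parameter values are our own assumptions, not the lab's actual stimulus software.

    import numpy as np

    def drifting_grating(direction_deg, temporal_freq_hz, spatial_freq_cpd,
                         size_px=256, deg_per_px=0.1, refresh_hz=360, duration_s=1.0):
        """Frames of a drifting sinusoidal grating, values in [0, 1],
        returned with shape (n_frames, size_px, size_px)."""
        n_frames = int(round(refresh_hz * duration_s))
        y, x = np.mgrid[0:size_px, 0:size_px] * deg_per_px   # pixel grid in visual degrees
        theta = np.deg2rad(direction_deg)
        # spatial phase (in cycles) along the drift direction
        spatial_phase = spatial_freq_cpd * (x * np.cos(theta) + y * np.sin(theta))
        frames = np.empty((n_frames, size_px, size_px))
        for i in range(n_frames):
            t = i / refresh_hz
            frames[i] = 0.5 + 0.5 * np.sin(2 * np.pi * (spatial_phase - temporal_freq_hz * t))
        return frames

    # Sweep drift direction to map a neuron's direction tuning.
    # 'record_spike_rate' is a hypothetical stand-in for the acquisition call that
    # drives the rig and returns the measured response.
    # for direction in range(0, 360, 30):
    #     stim = drifting_grating(direction, temporal_freq_hz=5.0, spatial_freq_cpd=0.1)
    #     rate = record_spike_rate(stim)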

Using all of this information allows us to reconstruct visual pathways from the eye right the way through to higher-order parts of the brain. We can then construct conceptual models for the way the brain processes moving features. We also collaborate with engineers in hi-tech industry and with computer scientists to develop robust artificial vision systems. We have successfully developed novel computational systems and even silicon chips that mimic motion processing by insect brains and their ability to track moving features. Applications for this technology include guidance systems for robots and collision-avoidance sensors for super-smart cars. We've also been collaborating with neural stem cell researchers to develop an interface between computer chips and living neurons that might one day allow us to develop bionic and prosthetic devices that connect our motion sensors directly to the brains of blind patients!

Models for motion processing by insects, from those based only on anatomy to silicon hardware in robots
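
As an illustration of the kind of computational model this work builds on, the sketch below implements the classic Hassenstein-Reichardt 'elementary motion detector', a standard textbook model of insect motion processing: each photoreceptor signal is delayed and correlated with the undelayed signal from its neighbour, and the two mirror-symmetric products are subtracted to give a direction-selective output. This is a simplified, generic example of the approach, not the specific models or silicon hardware described above; all names and parameter values are illustrative.

    import numpy as np

    def reichardt_correlator(signal_left, signal_right, dt, tau=0.02):
        """Hassenstein-Reichardt elementary motion detector (EMD).

        Each photoreceptor signal is low-pass filtered (a simple stand-in for a
        neural delay) and multiplied with the undelayed signal from the
        neighbouring photoreceptor; subtracting the two mirror-symmetric
        products gives a direction-selective output over time."""
        def lowpass(sig):
            out = np.zeros_like(sig, dtype=float)
            alpha = dt / (tau + dt)                 # first-order low-pass coefficient
            for i in range(1, len(sig)):
                out[i] = out[i - 1] + alpha * (sig[i] - out[i - 1])
            return out

        delayed_left, delayed_right = lowpass(signal_left), lowpass(signal_right)
        # positive for left-to-right motion, negative for right-to-left
        return delayed_left * signal_right - delayed_right * signal_left

    # Example: a brightness edge that reaches the left photoreceptor 50 ms before
    # the right one produces a net positive (rightward) response.
    dt = 1.0 / 360.0                                # one sample per display frame
    t = np.arange(0.0, 0.5, dt)
    left = (t > 0.10).astype(float)
    right = (t > 0.15).astype(float)
    print(reichardt_correlator(left, right, dt).sum() > 0)   # True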
