Social Brain Scanning
(August 2nd, 2016) Laboratory mice, rats or flies rarely stay still when researchers want to see their brains at work. Steven Buckingham reports on several methods that have been designed to overcome this challenge.
The rise of brain imaging reflects an increasing appreciation that the functioning of a nervous system can only be fully understood when whole networks are examined together. But one problem has severely limited all brain-imaging experiments – the need to keep the subject immobile. Often this is because it takes time to do the scans, and if the subject moves while data are being collected, this will introduce noise into the image, reducing its quality. It is like being in the early days of photography, when subjects had to stay as still as they could while the photographic film slowly responded to the light.
But there are many questions that can only really be properly addressed by allowing the subject to move while it is being scanned. What happens in brain structures when animals move around, feed or groom? What happens when animals interact with each other? The assumption is that you can get around these questions with carefully designed experiments, but as experimentalists we are uncomfortable with inferring when we should be observing. Happily, the past few years have seen some novel approaches that allow brains to be imaged while animals are actively engaged in doing what animals do.
One solution has been to get the animal to carry the imaging camera around with it. In 2008, Mark Schnitzer’s team at Stanford published a paper describing how they had built an epifluorescence microscope weighing 1.1 g that can be mounted on a mouse’s head. The microscope contains three lenses and miniature gears for focusing; the illuminating light and the resulting image are both delivered through optical fibres. The field of view is about 300 µm in diameter, with a resolution of about 3 µm. Schnitzer’s team then injected a calcium-sensitive dye into the cerebellum of the mice and recorded them while they were freely moving around an arena. Action potentials could be seen working their way along the long axonal branches of Purkinje cells, and the team was able to show that these spikes increased in frequency while the mice were moving around, and that the correlation between spikes in different cells increased when the mice exhibited tongue-licking. Both of these were controversial questions that could not be resolved using existing methods.
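The correlation finding can be made concrete with a small sketch. This is not the authors’ analysis code; it is a minimal illustration, using hypothetical toy data, of how pairwise correlation between binned spike trains from simultaneously imaged cells might be computed.

```python
import numpy as np

def pairwise_spike_correlation(spike_counts):
    """Mean pairwise Pearson correlation between cells.

    spike_counts: array of shape (n_cells, n_time_bins) holding
    spike counts per time bin for each imaged cell.
    """
    r = np.corrcoef(spike_counts)            # n_cells x n_cells matrix
    upper = r[np.triu_indices_from(r, k=1)]  # unique cell pairs only
    return upper.mean()

# Toy data (illustrative, not from the study): two cells that fire
# together should score higher than two that fire in opposition.
correlated = np.array([[0, 1, 1, 0, 1, 0, 1, 1],
                       [0, 1, 1, 0, 1, 0, 1, 0]])
independent = np.array([[0, 1, 1, 0, 1, 0, 1, 1],
                        [1, 0, 0, 1, 0, 1, 0, 0]])
print(pairwise_spike_correlation(correlated) >
      pairwise_spike_correlation(independent))  # True
```

A rise in this pairwise measure during a behaviour such as tongue-licking is the kind of population-level observation that only simultaneous recording from many cells makes possible.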
The power of imaging freely-moving animals is multiplied when it is combined with optogenetics. Garret Stuber at the University of North Carolina at Chapel Hill combined optogenetics with an imaging approach similar to Mark Schnitzer’s. Stuber’s team was looking at the role of GABA-ergic neurones in the lateral hypothalamus and had shown, using optogenetic stimulation techniques, that these cells played a causal role in feeding behaviour. But they wanted to get a feel for the functional diversity between the cells, and to do this they needed to be able to record from many of them while the mice were feeding. So they injected an inducible construct that directs expression of a calcium-sensitive indicator in these neurones and recorded their activity while the animals fed freely. They were able to show that pools of neurones specifically encoded locations associated with food, and that subpopulations encode either appetitive or consummatory behaviours, but not both.
But the Prize for Overcoming Challenges in Imaging Freely-Moving Animals (we may need a snappier name – Ed) goes to Takeo Katsuki at the University of California, San Diego, who has found a way of brain imaging freely-moving fruit flies. Katsuki’s approach was to remove the cuticle from the dorsal side of the flies’ heads and replace it with a transparent window. The flies were then allowed to wander around an arena while being tracked by a camera. This tracking camera feeds a closed-loop feedback system that directs an imaging camera where to look, so that the image of the fly is stabilised with respect to the imaging camera. Katsuki used flies genetically modified to express a calcium-sensitive indicator in certain olfactory neurones, allowing the team to observe activity in these neurones in response to odours. But Drosophila is famous for its very convenient genetic toolkit and a wealth of forward and reverse genetic data. Katsuki showed how the method can cash in on that wealth by using the fru mutant, in which male flies exhibit defective mating behaviour. The expression of the optical marker was directed to fru-expressing neurones, and Katsuki showed that activity in these cells is reduced during mating.
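The closed-loop idea at the heart of this setup is a simple feedback controller. The sketch below is an assumption-laden toy, not Katsuki’s actual control software: it assumes the tracking camera reports the fly’s (x, y) position and that the imaging view rides on a stage we can command, and the proportional gain is purely illustrative.

```python
# Minimal proportional-feedback sketch of closed-loop tracking.
# fly_pos and stage_pos are hypothetical (x, y) coordinates; a real
# system would read fly_pos from the tracking camera each frame.

def track_step(fly_pos, stage_pos, gain=0.5):
    """One feedback iteration: move the stage by a fraction of the
    current positional error, keeping the fly centred in the frame."""
    error = (fly_pos[0] - stage_pos[0], fly_pos[1] - stage_pos[1])
    return (stage_pos[0] + gain * error[0],
            stage_pos[1] + gain * error[1])

# Over repeated iterations the stage converges on the fly's position.
stage = (0.0, 0.0)
fly = (10.0, -4.0)
for _ in range(20):
    stage = track_step(fly, stage)
print(round(stage[0], 3), round(stage[1], 3))  # 10.0 -4.0
```

Run fast enough relative to the fly’s movement, a loop like this keeps the residual error small, which is what allows the imaging camera to collect a stable picture of a walking animal.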
Whole-brain imaging is a high-investment approach, whether it involves building special micro-cameras or complicated multi-animal arenas. What is important is that it opens up new ways of designing experiments and, consequently, new ways of thinking about brain function. We have been imaging single, restrained animals partly because we had to (methodological constraints) but also because of reductionist ways of thinking. By solving the first, we can now re-think the second.