A new study from MIT and Harvard University suggests that the brains of the seemingly simple zebrafish are more sophisticated than previously thought. The researchers found that larval zebrafish can use visual information to create three-dimensional maps of their physical surroundings—a feat that scientists didn’t think was possible.
In the new study, the researchers discovered that zebrafish can move around environmental barriers while escaping predators. The findings suggest that zebrafish are “much smarter than we thought,” and could be used as a model to explore many aspects of human visual perception, the researchers say.
“These results show you can study one of the most fundamental computational problems faced by animals, which is perceiving a 3D model of the environment, in larval zebrafish,” says Vikash Mansinghka, a principal research scientist in MIT’s Department of Brain and Cognitive Sciences and an author of the new study.
Andrew Bolton, an MIT research scientist and a research associate at Harvard University, is the senior author of the new study, which appears today in Current Biology. Hanna Zwaka, a Harvard postdoc, and Olivia McGinnis, a recent Harvard graduate who is now a graduate student at the University of Oxford, are the paper’s lead authors.
Mapping the environment
Since the 1970s, zebrafish have been used to study a variety of human diseases, including cancer, cardiovascular disease, and diabetes. One of the early pioneers of zebrafish research was Nancy Hopkins, currently an MIT professor emerita of biology, who discovered many of the genes involved in zebrafish embryonic development.
More recently, scientists have begun to explore the possibility of using zebrafish as a model of behaviors that involve sensory perception. Three years ago, Bolton led a study showing that zebrafish can accurately predict the trajectories of their prey based on the prey’s position and velocity.
During that study, Bolton accidentally dropped one of the dishes containing larval zebrafish and noticed that the fish immediately scattered in all directions. That led him to wonder: Was their choice of escape path totally random, and would it change if obstacles were in the way?
The ability to detect obstacles requires integration of multiple types of sensory input, and the ability to use that information to calculate the position of the obstacle relative to one’s own position in space. Humans and many other animals can do this, but it wasn’t thought that simpler organisms such as zebrafish could do it.
Instead, many neuroscientists believed that the visual perception of zebrafish was similar to that of organisms such as the simple worm C. elegans. In those worms, light detected by photosensitive cells can trigger reflexive responses such as moving toward or away from the light.
To explore the question of whether zebrafish can create mental representations of their 3D environment, Bolton created an experimental setup where the fish would need to try to avoid an obstacle blocking one of their possible escape paths. The experiments were done in the lab of Florian Engert, a Harvard professor of molecular and cellular biology, who is also an author of the study.
Each fish was placed in a circular dish about 12 centimeters in diameter, where it could swim freely. When a metal rod was dropped onto the dish, creating a loud bang, the fish would immediately flee. The researchers first showed that if no barriers were present, the fish chose randomly between escaping to the left or to the right.
Then, the researchers placed a 12-millimeter plastic barrier blocking one of the escape routes. With a barrier in place, the fish almost always chose to escape in the direction with no barrier, as long as there was enough light for them to see it. Furthermore, the fish were more likely to avoid barriers that were closer to them, suggesting that they can also gauge the distance to the barriers.
The zebrafish’s quick reaction time of about 10 milliseconds suggests that the animals must “pre-compute” a map of the barrier’s location before they hear the sound. Conduction of visual information from the retina to the brain takes about 60 milliseconds in zebrafish, ruling out the possibility that the fish check for obstacles only after hearing the loud bang.
“They can’t do the mapping in real-time, because the escape is too fast relative to the tap,” Bolton says. “They need to pre-map the environment before, just in case a predator or something mimicking a predator shows up.”
Modeling the brain
This kind of pre-mapping behavior has been seen in rodents and other mammals, but not in simpler vertebrates. The findings in zebrafish open up a new way to explore questions of how the brain creates models of the world, says Misha Ahrens, a senior group leader at the Howard Hughes Medical Institute Janelia Research Campus.
“This work shows beautifully how a small and deceptively simple-looking animal possesses remarkable behavioral and computational capabilities. They are not just input-output machines; instead, they possess a model of the world around them that is invisible to us until we carefully probe those internal models with a carefully designed trigger,” says Ahrens, who was not involved in the study.
Because the zebrafish brain is smaller and simpler than the mammalian brain, it can be more easily imaged and manipulated, down to the level of individual neurons. Previous research identified a single pair of neurons, known as Mauthner neurons, which appear to mediate the zebrafish escape response to the sound. In neural circuit experiments, the new study found that visual input from the barrier excites the Mauthner neuron that drives escape away from the barrier.
The researchers now plan to explore which part of the zebrafish brain encodes representations of depth. Neuroscientists already have a good idea of how and where the mammalian brain maps two-dimensional space (in the superior colliculus, which is analogous to a zebrafish brain region called the optic tectum), but how the third dimension of depth is added is not well understood.
“If, for example, we find the 3D representation in the larval zebrafish optic tectum, that would be a guide to where it might be in the superior colliculus or the visual pathways of mammals, including humans,” says Mansinghka, who leads the Probabilistic Computing Project at MIT’s Computer Science and Artificial Intelligence Laboratory.
Mansinghka also hopes that the new findings will persuade cognitive and systems neuroscientists who view zebrafish as too simple to be useful for their work to consider the fish as a model with the potential to integrate the many different approaches that scientists now use to study the brain.
“Historically, there has been a lot of divergence between people who study cells, people who study brain circuits, people who do imaging, people who study behavior, people who study cognition, and people who study computation,” he says. “It’s hard to do integrative research that addresses all those levels simultaneously, but here we may have shown that there is an organism that could be used to study perceptual computations at many different levels and connect it to the underlying neurons.”
More information: Andrew D. Bolton et al., Visual object detection biases escape trajectories following acoustic startle in larval zebrafish, Current Biology (2022). DOI: 10.1016/j.cub.2022.10.050
Provided by Massachusetts Institute of Technology
This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.