Watch the world through different animals' eyes in this stunning high-tech footage
Cameras recorded footage in red, blue, green and UV channels simultaneously, with openly available software processing the raw footage and converting it into different kinds of "animal vision," showing us how bees, birds, mice and dogs might see the world.
A zebra swallowtail butterfly foraging on flowers as a honeybee would see it. (Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0)
Scientists have combined a new camera system with open-source software to generate stunning video clips of the world as different animals see it — including the specific colors they perceive.
From more intense reds to streaks of ultraviolet, the footage shows various settings in and around a garden environment, with some colors accentuated and others dulled depending on which animal's vision is being emulated.
One clip shows a zebra swallowtail butterfly (Protographium marcellus) foraging on flowers as a honeybee (Apis mellifera) would see it. The scientists published 12 videos in total showing how birds, bees, mice and dogs see the world.
To produce the videos, the researchers set up cameras to capture raw footage and later applied post-processing software on top to predict perceived colors in different species. This method, which they outlined in a paper published Jan. 23 in the journal PLOS Biology, is 92% accurate based on testing against conventional spectrophotometry techniques.
"We've long been fascinated by how animals see the world," Daniel Hanley, senior author of the study and an assistant professor of biology at George Mason University in Virginia, said in a statement. "Modern techniques in sensory ecology allow us to infer how static scenes might appear to an animal; however, animals often make crucial decisions on moving targets (e.g., detecting food items, evaluating a potential mate's display, etc.). Here, we introduce hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colors in motion."
Species see the world differently in part because of the photoreceptors in their eyes and the neural architecture of their brains. The eyes of dogs, for example, are structured similarly to those of people with red-green color blindness. Insects like honeybees, meanwhile, can see ultraviolet light, the scientists said in their paper.
To better understand how animals see the world, researchers have devised various methods to accurately reproduce the colors the animals see, but these techniques have only been capable of generating still images.
Spectrophotometry, for example, uses object-reflected light to estimate what an animal's photoreceptors detect. These methods have produced only still images so far; they can't capture spatial information, and they are highly time-consuming, the scientists said. Meanwhile, multispectral photography, which relies on taking a series of photos in several wavelength ranges, trades accuracy for more spatial information — but this method works only on still objects.
To get around these limitations, researchers created this new system by acquiring commercially available Sony a6400 cameras and configuring them to record in four color channels — red, green, blue and ultraviolet — simultaneously.
Next, they affixed the cameras to a 3D-printed housing that held various pieces of photography equipment, including a modular cage, mounts for a beam-splitter mirror and cone baffles (which minimize stray light reaching the sensors).
This was the first step in a pipeline that began with capturing raw footage and ended in rendering the finished clips. To render the video in animal-perceived colors, the researchers applied the video2vision software — a set of transformation functions — to the raw footage. Then, they processed the data into "perceptual units," akin to photo filters, and fine-tuned each one based on our existing knowledge of the respective species' photoreceptors to accurately predict what each animal might be seeing.
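Conceptually, the conversion step maps the four camera channels at each pixel onto the target species' photoreceptor responses. One common way to express such a mapping is a fitted linear transformation; the sketch below uses an invented 3x4 matrix for a bee-like UV/blue/green receptor set, purely to illustrate the shape of the computation, not the study's actual fitted values.

```python
import numpy as np

# Toy "footage": height x width x 4 camera channels (R, G, B, UV),
# with pixel values in [0, 1]
rng = np.random.default_rng(0)
frame = rng.random((2, 2, 4))

# Hypothetical linear mapping from camera channels to a trichromatic
# bee-like receptor set (UV, blue, green) -- illustrative numbers only.
# Each row sums to 1, so outputs stay in [0, 1].
camera_to_receptor = np.array([
    [0.05, 0.05, 0.10, 0.80],  # UV receptor draws mostly on the UV channel
    [0.05, 0.15, 0.75, 0.05],  # blue receptor draws mostly on blue
    [0.15, 0.70, 0.10, 0.05],  # green receptor draws mostly on green
])

# Apply per pixel: (h, w, 4) @ (4, 3) -> (h, w, 3) receptor image
receptor_image = frame @ camera_to_receptor.T
```

Running the same transformation frame by frame, then tuning the mapping to each species' known photoreceptors, is what lets the pipeline render moving footage in "animal-perceived" colors rather than one still image at a time.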
Scientists and filmmakers who study animals can use this setup to capture and process their own footage, the researchers said. In particular, watching footage with these animal-vision filters applied can tell us more about how particular species interact with their environment and respond to stimuli.
Keumars is the technology editor at Live Science. He has written for a variety of publications including ITPro, The Week Digital, ComputerActive, The Independent, The Observer, Metro and TechRadar Pro. He has worked as a technology journalist for more than five years, having previously held the role of features editor with ITPro. He is an NCTJ-qualified journalist and has a degree in biomedical sciences from Queen Mary, University of London. He's also registered as a foundational chartered manager with the Chartered Management Institute (CMI), having qualified as a Level 3 Team leader with distinction in 2023.