Ever wondered what your dog sees? New tech brings us a step closer

3 Dec 2019


Image: © Lux2008/Stock.adobe.com


If you’ve ever wondered how an animal sees the world through its own eyes, technology could soon give us an answer.

It might come as no surprise that animals see the world very differently from humans, not just figuratively but literally. Now, scientists from the University of Queensland and the University of Exeter have developed software that brings us closer to ‘seeing’ through the eyes of animals.

Such an ability has long been sought after in the scientific community, and now the Quantitative Colour Pattern Analysis (QCPA) framework is helping to bridge the gap in our understanding. The researchers’ findings have been published in Methods in Ecology and Evolution.

“Most animals have completely different visual systems to humans, so – for many species – it is unclear how they see complex visual information or colour patterns in nature, and how this drives their behaviour,” said PhD candidate Cedric van den Berg, from the University of Queensland.

“The QCPA framework is a collection of innovative digital image processing techniques and analytical tools designed to solve this problem. Collectively, these tools greatly improve our ability to analyse complex visual information through the eyes of animals.”

Colour patterns have been key to understanding many fundamental evolutionary problems, the researchers said, such as how animals communicate with each other or hide from predators. While the importance of combined colour and pattern information has been recognised for years, the techniques available to analyse it have been lacking, until now.

A field of bluebells from the perspective of a bee (left) and a human (right). Image: Jolyon Troscianko

Crossing boundaries

Dr Jolyon Troscianko of the University of Exeter, the study’s co-lead, said that because the QCPA uses digital photos, it can replicate what an animal may see in any habitat, including underwater.

“You can even access most of its capabilities by using a cheap smartphone to capture photos,” he said.
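The QCPA framework itself models animal photoreceptors in far more detail, but the basic idea of re-mapping a digital photo into another visual system can be illustrated with a much cruder sketch. The snippet below, which is an assumption for illustration and not the researchers’ method, approximates dichromatic vision (such as a dog’s two-cone colour system) by merging the red and green channels of an image, which a dichromat cannot tell apart:

```python
import numpy as np

def simulate_dichromat(rgb):
    """Crude illustrative sketch of dichromatic (e.g. dog-like) vision.

    Collapses the red and green channels, which a dichromat cannot
    distinguish, into one shared channel, while leaving blue intact.
    This is NOT the QCPA algorithm -- just a toy approximation.
    """
    rgb = np.asarray(rgb, dtype=float)
    merged = 0.5 * (rgb[..., 0] + rgb[..., 1])  # red and green become one
    out = np.empty_like(rgb)
    out[..., 0] = merged
    out[..., 1] = merged
    out[..., 2] = rgb[..., 2]  # blue channel is preserved
    return out

# A pure-red pixel and a pure-green pixel end up identical,
# mimicking how a dichromat would confuse them:
img = np.array([[[255, 0, 0], [0, 255, 0]]])
sim = simulate_dichromat(img)
```

In practice, tools like QCPA go far beyond this: they account for the spectral sensitivities of each receptor type, lighting conditions and spatial acuity, which is why calibrated photos, even from a smartphone, can be turned into species-specific views.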

The technique was developed and tested over the course of four years, including the development of an online platform to provide researchers, teachers and students with user guides, tutorials and worked examples of how to use the tools.

Speaking of its importance, Dr Karen Cheney said: “The flexibility of the framework allows researchers to investigate the colour patterns and natural surroundings of a wide range of organisms, such as insects, birds, fish and flowering plants.”

“For example, we can now truly understand the impacts of coral bleaching for camouflaged reef creatures in a new and informative way. We’re helping people – wherever they are – to cross the boundaries between human and animal visual perception. It’s really a platform that anyone can build on, so we’re keen to see what future breakthroughs are ahead.”

Colm Gorey is a senior journalist with Siliconrepublic.com

editorial@siliconrepublic.com