Convolutional neural networks can be tricked by the same visual illusions as people

A convolutional neural network is a type of artificial neural network in which neurons are organized into local receptive fields, much like neurons in the visual cortex of a biological brain. Today, convolutional neural networks (CNNs) are found in a variety of autonomous systems (for example, face detection and recognition, or autonomous vehicles). This type of network is highly effective in many artificial vision tasks, such as image segmentation and classification, among other applications.

Convolutional networks were inspired by the behavior of the human visual system, particularly its basic structure of concatenated modules, each comprising a linear operation followed by a non-linear operation. A study published in the advanced online edition of the journal Vision Research examines the phenomenon of visual illusions in convolutional networks and compares it with their effect on human vision. The study was carried out by Alexander Gómez Villa, Adrian Martín, Javier Vázquez-Corral and Marcelo Bertalmío, members of the Department of Information and Communication Technologies (DTIC), with the participation of the researcher Jesús Malo of the University of Valencia.
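As a rough illustration of that "linear operation followed by a non-linear operation" structure, the sketch below stacks a few convolutional layers (linear) with ReLU activations (non-linear). It is a generic PyTorch example for intuition only; the layer sizes, channel widths and choice of framework are assumptions, not details taken from the study.

```python
import torch
import torch.nn as nn

# A generic CNN built by concatenating "linear + non-linear" modules:
# each convolution is the linear operation, each ReLU the non-linearity.
# Layer count and channel widths are illustrative, not the paper's model.
class SimpleCNN(nn.Module):
    def __init__(self, channels=3, width=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, width, kernel_size=3, padding=1),  # linear
            nn.ReLU(),                                             # non-linear
            nn.Conv2d(width, width, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(width, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

# Each output unit only "sees" a local patch of the input (its receptive
# field), loosely analogous to receptive fields in the visual cortex.
net = SimpleCNN()
image = torch.rand(1, 3, 64, 64)   # a dummy RGB image
output = net(image)
print(output.shape)                # torch.Size([1, 3, 64, 64])
```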

“Because of this connection of CNNs with our visual system, in this paper we wanted to see if convolutional networks suffer from similar problems to our visual system. Hence, we focused on visual illusions. Visual illusions are images that our brain perceives differently from how they actually are,” explains Gómez Villa, first author of the study.

In their study, the authors trained CNNs for simple tasks also performed by human vision, such as denoising and deblurring. They observed that CNNs trained under these experimental conditions are also “deceived” by brightness and color visual illusions, in the same way that these illusions deceive humans.
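One way to picture this kind of test (a hedged sketch, not the authors' actual protocol or data): train a small CNN to denoise images, then feed it a simultaneous-contrast stimulus in which two physically identical gray patches sit on dark and light backgrounds, and check whether the network's outputs for the two patches diverge the way human perception does. The architecture, the random stand-in training images and all parameters below are illustrative assumptions, and a toy setup like this may not reproduce the published effect.

```python
import torch
import torch.nn as nn

# Toy denoiser (illustrative architecture, not the paper's network).
denoiser = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)

# Brief training on synthetic data: learn to remove Gaussian noise.
# (The study trained on natural images; random images are only a stand-in.)
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)
for step in range(200):
    clean = torch.rand(8, 1, 32, 32)
    noisy = clean + 0.1 * torch.randn_like(clean)
    loss = ((denoiser(noisy) - clean) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Simultaneous-contrast probe: two identical gray squares on dark/light halves.
stim = torch.zeros(1, 1, 64, 128)
stim[..., :, :64] = 0.2          # dark background (left half)
stim[..., :, 64:] = 0.8          # light background (right half)
stim[..., 24:40, 24:40] = 0.5    # gray patch on the dark side
stim[..., 24:40, 88:104] = 0.5   # identical gray patch on the light side

with torch.no_grad():
    out = denoiser(stim)

# If the network behaved like human vision, the patch on the dark background
# would come out brighter than the identical patch on the light background.
print(out[..., 24:40, 24:40].mean().item(),
      out[..., 24:40, 88:104].mean().item())
```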

Gómez Villa says, “For our work we also analyze when such illusions cause responses in the network that are not as physically expected, but neither do they match with human perception,” that is to say, cases in which the CNNs are subject to a different illusion from the one humans would perceive.

The results of this study are consistent with the longstanding hypothesis that low-level visual illusions are a by-product of the visual system's optimization to the natural environments humans see every day. At the same time, these results highlight the limitations of, and the differences between, the human visual system and CNNs.

Artificial Intelligence also has illusory perceptions

More information:
A. Gomez-Villa et al, Color illusions also deceive CNNs for low-level vision tasks: Analysis and implications, Vision Research (2020). DOI: 10.1016/j.visres.2020.07.010

Provided by
Universitat Pompeu Fabra – Barcelona

Citation:
Convolutional neural networks can be tricked by the same visual illusions as people (2020, November 23)
retrieved 23 November 2020
from https://techxplore.com/news/2020-11-convolutional-neural-networks-visual-illusions.html

