How do we recall known faces?

Visual recognition requires inferring the similarity between a perceived object and a mental target. However, similarity is difficult to measure for complex stimuli such as faces. People may notice that someone “looks like” a familiar face, yet find it hard to articulate which features the comparison rests on. Previous work shows that the number of visual elements shared between a face pictogram and a memorized target correlates with the amplitude of the P300 component of the visual evoked potential. Here, we redefine similarity as the distance inferred from a latent space learned with a state-of-the-art generative adversarial network (GAN). We conducted a rapid serial visual presentation (RSVP) experiment with oddball images generated at varying distances from the target to determine how P300 amplitude relates to these GAN-derived distances.
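To make the setup concrete, below is a minimal sketch of how stimuli at controlled latent distances from a target could be constructed, assuming Euclidean distance in the GAN latent space and simple linear interpolation; the generator, the latent dimensionality, and the exact distance measure used in the study may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 512  # assumed latent dimensionality; the study's GAN may differ

def latent_distance(z_a, z_b):
    """Euclidean distance between two latent codes (one possible similarity measure)."""
    return float(np.linalg.norm(z_a - z_b))

def oddball_latent(z_target, z_other, alpha):
    """Interpolate from a non-target latent code toward the target.

    alpha = 0 returns the non-target code, alpha = 1 returns the target,
    so the decoded image lies at a controlled latent distance from the target.
    """
    return (1.0 - alpha) * z_other + alpha * z_target

# Sample a target and a distractor, then build oddballs at graded distances.
z_target = rng.standard_normal(LATENT_DIM)
z_other = rng.standard_normal(LATENT_DIM)
for alpha in (0.25, 0.5, 0.75):
    z_odd = oddball_latent(z_target, z_other, alpha)
    print(f"alpha={alpha:.2f}  distance-to-target={latent_distance(z_odd, z_target):.2f}")

# Each z_odd would be decoded to an image with the trained GAN generator,
# e.g. image = G(z_odd); the generator G itself is not shown here.
```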

The results showed that P300 amplitude was monotonically related to distance-to-target, indicating that perceptual identification tracks smooth, gradual changes in image similarity. Furthermore, regression modeling indicated that although the P3a and P3b sub-components differed in scalp location, latency, and amplitude, they were similarly related to target distance. The work demonstrates that the P300 indexes the distance between the perceived and the target image for smoothly varying, natural, and complex visual stimuli, and shows that GANs offer a novel modeling methodology for studying the relationships between stimuli, perception, and recognition.
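As an illustration of the kind of regression analysis described above, the following sketch fits single-trial P300 amplitude as a linear function of latent distance-to-target; the data are simulated and all variable names are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated single-trial data: latent distance-to-target and P300 amplitude (µV).
distance = rng.uniform(0.0, 30.0, size=200)
p300_amplitude = 8.0 - 0.15 * distance + rng.normal(0.0, 1.5, size=200)

# Ordinary least squares: amplitude ~ intercept + slope * distance.
X = np.column_stack([np.ones_like(distance), distance])
beta, *_ = np.linalg.lstsq(X, p300_amplitude, rcond=None)
print(f"intercept = {beta[0]:.2f} µV, slope = {beta[1]:.3f} µV per unit distance")

# The same design could be fit separately to P3a and P3b amplitudes to compare
# how each sub-component relates to target distance.
```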

REFERENCE

de la Torre-Ortiz, C., Spapé, M., & Ruotsalo, T. (2023). The P3 indexes the distance between perceived and target image. Psychophysiology, e14225. https://doi.org/10.1111/psyp.14225

Contradicted by the Brain

We investigated whether individual preferences, and contradictions between individual and group preferences, can be inferred directly from measurements of brain activity. We report an experiment in which brain activity recorded from 31 participants while they viewed images is associated with their self-reported preferences. First, we show that brain responses vary in a graded fashion with preference, and that brain responses alone can be used to train classifiers that reliably estimate preferences. Second, we show that brain responses reveal additional preference information that correlates with the group preference, even when participants reported having no such preference.
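As a rough illustration of the classification step (not the paper's exact pipeline), the sketch below trains a regularized LDA classifier, a common choice for ERP data, on simulated single-trial features and evaluates it with cross-validation; the features, labels, and hyperparameters are assumptions made for the example.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Hypothetical single-trial features, e.g. ERP amplitudes over electrodes and time windows.
n_trials, n_features = 300, 32
X = rng.standard_normal((n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)  # 1 = preferred image, 0 = not preferred
X[y == 1] += 0.3  # inject a weak class difference so the simulated example is informative

# Shrinkage-regularized LDA, evaluated with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(),
                    LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"))
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"mean cross-validated AUC: {scores.mean():.2f}")
```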

Our analysis of brain responses carries broad implications for researchers, as it suggests that an individual’s explicit preferences are not always aligned with the preferences inferred from their brain responses. These findings call into question the reliability of explicit and behavioral signals, and imply that additional, multimodal sources of information may be necessary to infer preferences reliably.

REFERENCE

Davis, K. M., Spapé, M., & Ruotsalo, T. Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing. IEEE Transactions on Affective Computing. https://doi.org/10.1109/TAFFC.2022.3225885

BANANA presents new work at IEEE CVPR

The BANANA team presents a new research paper on editing images by steering GAN models directly from human brain activity measured via EEG. More information is available here: https://openaccess.thecvf.com/content/CVPR2022/html/Davis_Brain-Supervised_Image_Editing_CVPR_2022_paper.html
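For readers unfamiliar with the general idea, here is an illustrative sketch, not the method described in the paper, of steering a GAN latent code along a direction estimated from brain-derived relevance scores; the variable names, the weighting scheme, and the step size are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(3)
LATENT_DIM = 512  # assumed latent dimensionality

# Hypothetical inputs: latent codes of displayed images and a per-image relevance
# score decoded from EEG (e.g. classifier confidence that the brain response
# marked the image as matching the intended edit).
latents = rng.standard_normal((100, LATENT_DIM))
brain_relevance = rng.uniform(0.0, 1.0, size=100)

# Estimate an editing direction: relevance-weighted mean latent contrasted
# against the unweighted mean latent.
weighted_mean = (brain_relevance[:, None] * latents).sum(axis=0) / brain_relevance.sum()
direction = weighted_mean - latents.mean(axis=0)
direction /= np.linalg.norm(direction)

# Steer an image's latent code along the brain-derived direction.
z = rng.standard_normal(LATENT_DIM)
step = 3.0  # edit strength, chosen arbitrarily here
z_edited = z + step * direction
# z_edited would be decoded with the GAN generator to produce the edited image.
```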