Inferring Emotional Responses via fNIRS Neuroimaging

Information retrieval (IR) relies on a general notion of relevance, which serves as the principal foundation for ranking and evaluation methods. However, IR does not account for a more nuanced affective experience. In a recently published paper, we consider the emotional response decoded directly from the human brain as an alternative dimension of relevance.

We report an experiment covering seven different scenarios in which we measure and predict how users emotionally respond to visual image content, using functional near-infrared spectroscopy (fNIRS) neuroimaging, along two commonly used affective dimensions: valence (negativity vs. positivity) and arousal (boredom vs. excitement). Our results show that affective states can be successfully decoded with fNIRS and used to complement the present notion of relevance in IR studies.
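For intuition, decoding an affective state from fNIRS measurements can be framed as a standard classification problem over per-trial feature vectors. The sketch below is purely illustrative and is not the paper's actual method: it uses a toy nearest-centroid classifier on made-up feature vectors (imagined here as the mean oxygenated-hemoglobin change on three channels), and the function names are our own.

```python
import math
from statistics import mean

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    return [mean(col) for col in zip(*vectors)]

def nearest_centroid_fit(X, y):
    """Compute one centroid per class label."""
    by_label = {}
    for features, label in zip(X, y):
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def nearest_centroid_predict(model, features):
    """Assign the label whose centroid is closest in Euclidean distance."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Toy fNIRS-like features (entirely made up): mean HbO change per channel.
X_train = [[0.8, 0.6, 0.7], [0.9, 0.7, 0.6],   # "high arousal" trials
           [0.1, 0.2, 0.1], [0.2, 0.1, 0.2]]   # "low arousal" trials
y_train = ["high", "high", "low", "low"]

model = nearest_centroid_fit(X_train, y_train)
print(nearest_centroid_predict(model, [0.7, 0.6, 0.8]))  # → high
```

In practice, one would use properly preprocessed fNIRS signals and a stronger classifier, but the framing is the same: features in, affective label out.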

This work will be presented at SIGIR’23 in Taipei, Taiwan. SIGIR is the flagship conference on Information Retrieval.


Tuukka Ruotsalo, Kalle Mäkelä, Michiel Spapé and Luis A. Leiva. Affective Relevance: Inferring Emotional Responses via fNIRS Neuroimaging. Proc. SIGIR’23.


Prof. Jacek Gwizdka in Luxembourg

This week we hosted Prof. Jacek Gwizdka from The University of Texas at Austin, USA, where he directs the Information eXperience Lab (IX Lab).

Prof. Gwizdka is one of the pioneers of Neuro-Information Science. He studies human-information interaction and retrieval and applies cognitive psychology and neuro-physiological methods to understand information search and improve search experience.

He gave an interesting seminar talk titled “Neuro-physiological evidence as a basis for understanding human-information interaction”, overviewing several of his projects that demonstrate the use of eye-tracking and EEG for inferring information relevance.

We had very productive meetings! We hope to materialize our research ideas in one or more future academic papers.


We hosted Prof. Aleksandra Kawala in Luxembourg

This week the Luxembourg team hosted our Polish partner, Prof. Aleksandra Kawala-Sterniuk from Opole University of Technology. We made good progress on the project and discussed lots of ideas to be developed in the upcoming months.

Looking forward to visiting Poland!


Gustav software

Temporal synchronization of behavioral and physiological signals collected through different devices (and sometimes through different computers) is a longstanding challenge in HCI, neuroscience, psychology, and related areas. Previous research has proposed to synchronize sensory signals using (1) dedicated hardware; (2) dedicated software; or (3) alignment algorithms. All these approaches are either vendor-locked, non-generalizable, or difficult to adopt in practice.

We propose a simple but highly efficient alternative: instrument the stimulus presentation software by injecting supervisory event-related timestamps, followed by a post-processing step over the recorded log files. Armed with this information, we introduce Gustav, our approach to orchestrate the recording of sensory signals across devices and computers. Gustav ensures that all signals coincide exactly with the duration of each experiment condition, with millisecond precision.

Gustav injects a supervisory timing signal that helps orchestrate the experiment conditions across devices and computers, from simple (a) to complex (c) setups.

Gustav is publicly available as open source software:


Kayhan Latifzadeh, Luis A. Leiva. Gustav: Cross-device Cross-computer Synchronization of Sensory Signals. In Adjunct Proc. UIST, 2022.


Hello world!

We are happy to announce that project BANANA has officially started today. We look forward to advancing basic and applied research on Brain-Computer Interfaces.

You can follow this blog to keep an eye on our progress and see how the project evolves. Stay tuned!