
BANANA visit in Torun (PL)

Prof. Aleksandra Kawala-Sterniuk, from the BANANA team's Polish partner, visited Toruń on 14–16 June 2023.

She met the Polish PI of BITSCOPE (Brain Integrated Tagging for Socially Curated Online Personalised Experiences), prof. Veslava Osinska from the Nicolaus Copernicus University in Toruń (Poland), and her team.

BITSCOPE introduces a vision for enhancing social relationships through brain-computer interfaces (BCIs) in virtual environments. Instead of relying on explicit feedback like “likes,” our improved BCI technology captures attention, memorability, and curiosity by passively collecting neural data signatures. This data, refined through machine learning, can be used by recommender systems to create better online experiences. Our work focuses on developing a passive hybrid BCI (phBCI) that combines electroencephalography, eye tracking, galvanic skin response, heart rate, and movement to estimate the user’s mental state without interrupting their immersion. This approach improves signal quality, denoising capabilities, and adaptability to home environments. We leverage deep learning, geometrical approaches, and large datasets to address user state classification, including attention, curiosity, and memorability. These advancements are achieved through co-designed user-centered experiments.
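To make the fusion idea concrete, here is a minimal sketch (with invented feature names, toy data, and a deliberately simple classifier, not the project's actual pipeline) of how per-modality features could be concatenated into one representation and mapped to a mental state:

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_features(eeg, gaze, gsr, hr, motion):
    """Concatenate per-modality feature vectors into one representation."""
    return np.concatenate([np.asarray(m, dtype=float).ravel()
                           for m in (eeg, gaze, gsr, hr, motion)])

class NearestCentroid:
    """Minimal classifier: predict the class whose training mean is closest."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return self.labels_[d.argmin(axis=1)]

# Toy data: 40 trials of two simulated mental states (0 = idle, 1 = attentive).
X = np.stack([
    fuse_features(rng.normal(c, 1, 8),   # EEG band powers
                  rng.normal(c, 1, 4),   # eye-tracking features
                  rng.normal(c, 1, 2),   # galvanic skin response features
                  rng.normal(c, 1, 2),   # heart-rate features
                  rng.normal(c, 1, 3))   # movement features
    for c in [0] * 20 + [3] * 20
])
y = np.array([0] * 20 + [1] * 20)

clf = NearestCentroid().fit(X, y)
accuracy = (clf.predict(X) == y).mean()
```

In practice the fusion step and the user-state classifier would be learned (e.g. with deep networks, as mentioned above); the sketch only illustrates the structure of combining modalities into a single estimate.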

BITSCOPE partners:

  • Dublin City University – Ireland (Coordinator)
  • Universitat Politècnica de València – Spain
  • Centre de Recherche Inria Bordeaux – Sud-Ouest – France
  • Nicolaus Copernicus University – Poland

She and Dariusz Mikolajewski (Co-PI, OUT) also had the pleasure of meeting prof. Dean J. Krusienski from VCU, leader of the ASPEN Lab.

Dean J. Krusienski, a Senior Member of IEEE, obtained his B.S., M.S., and Ph.D. degrees in electrical engineering from The Pennsylvania State University, University Park, PA, USA. He conducted his Postdoctoral Research at the Brain-Computer Interface Laboratory, Wadsworth Center of the New York State Department of Health. Currently, he holds the position of Professor and Graduate Program Director of biomedical engineering at Virginia Commonwealth University (VCU), Richmond, VA, USA. Additionally, he directs the Advanced Signal Processing in Engineering and Neuroscience (ASPEN) Laboratory at VCU. His research interests encompass biomedical signal processing, machine learning, brain-computer interfaces, and neural engineering.

Aleksandra and Dariusz also spoke to prof. Wlodzislaw Duch.

Wlodzislaw Duch is a distinguished figure in neuroinformatics and artificial intelligence. He heads the Neurocognitive Laboratory at the Center of Modern Interdisciplinary Technologies and leads the Neuroinformatics and Artificial Intelligence group at the University Centre of Excellence Dynamics, Mathematical Analysis, and Artificial Intelligence. With a Ph.D. in theoretical physics/quantum chemistry and a D.Sc. in applied math, Duch has made significant contributions to the field. He has held prestigious positions, including the President of the European Neural Networks Society executive committee and fellowships in renowned international associations. Duch has an extensive publication record, authored books, and served on the editorial boards of numerous journals. Additionally, he has held visiting professor positions at esteemed institutions worldwide.

We are looking forward to further cooperation! 🙂


AI- and ML-based methods in BCI systems


Brain-computer interfaces (BCIs) enable direct and bidirectional communication between the human brain and computers. The analysis and interpretation of brain signals, which provide valuable information about mental state and brain activity, pose challenges due to their non-stationarity and vulnerability to various interferences. Consequently, research in the BCI field emphasizes the integration of artificial intelligence (AI), particularly in five key areas: calibration, noise reduction, communication, mental state estimation, and motor imagery. The utilization of AI algorithms and machine learning has shown great promise in these applications, primarily because of their capacity to predict and learn from past experiences. As a result, implementing these technologies within medical contexts can provide more accurate insights into the mental state of individuals, mitigate the effects of severe illnesses, and enhance the quality of life for disabled patients.
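As a hedged illustration of one of these areas (mental state estimation), the sketch below shows a classic ML-for-BCI recipe on simulated data: turn raw EEG epochs into alpha-band (8–12 Hz) spectral power features, then separate two states with a threshold learned from examples. All signals and parameters here are invented, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                      # sampling rate in Hz (assumed)
t = np.arange(fs * 2) / fs    # 2-second epochs

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Mean spectral power of `signal` within [lo, hi] Hz."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spec[mask].mean()

def make_epoch(alpha_amp):
    """Synthetic EEG: a 10 Hz alpha rhythm of given amplitude plus noise."""
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

# "Relaxed" epochs have strong alpha; "engaged" epochs have weak alpha.
relaxed = [band_power(make_epoch(3.0), fs) for _ in range(30)]
engaged = [band_power(make_epoch(0.5), fs) for _ in range(30)]

# Learn a decision threshold from the data: midpoint between class means.
threshold = (np.mean(relaxed) + np.mean(engaged)) / 2
accuracy = (np.mean([p > threshold for p in relaxed]) +
            np.mean([p <= threshold for p in engaged])) / 2
```

Real BCI pipelines replace the threshold with trained classifiers and add the calibration and noise-reduction stages discussed above, but the feature-extraction-then-learning structure is the same.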

New paper regarding Brain-Computer Interfaces published:

Barnova, K., Mikolasova, M., Kahankova, R. V., Jaros, R., Kawala-Sterniuk, A., Snasel, V., … & Martinek, R. (2023). Implementation of artificial intelligence and machine learning-based methods in brain-computer interaction. Computers in Biology and Medicine, 107135.

https://www.sciencedirect.com/science/article/pii/S0010482523006005


Inferring Emotional Responses via fNIRS Neuroimaging

Information retrieval (IR) relies on a general notion of relevance, which is used as the principal foundation for ranking and evaluation methods. However, IR does not account for a more nuanced affective experience. In a recently published paper, we consider the emotional response decoded directly from the human brain as an alternative dimension of relevance.

We report an experiment covering seven different scenarios in which we measure and predict how users emotionally respond to visual image contents by using functional near-infrared spectroscopy (fNIRS) neuroimaging on two commonly used affective dimensions: valence (negativity and positivity) and arousal (boredom and excitement). Our results show that affective states can be successfully decoded using fNIRS, and utilized to complement the present notion of relevance in IR studies.
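A minimal sketch of the decoding setup, on simulated data: fNIRS yields per-trial hemodynamic features (here, invented per-channel mean oxy-Hb changes), and a linear model predicts a binary valence label. The feature sizes, class structure, and model below are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(2)
n_channels = 16   # number of fNIRS channels (assumed)

def simulate_trials(n, shift):
    """Per-trial mean oxy-Hb features with a class-dependent offset."""
    return rng.normal(shift, 1.0, size=(n, n_channels))

X = np.vstack([simulate_trials(50, -0.5),   # negative-valence trials
               simulate_trials(50, +0.5)])  # positive-valence trials
y = np.array([0] * 50 + [1] * 50)

# Logistic regression trained by plain gradient descent.
w, b = np.zeros(n_channels), 0.0
for _ in range(300):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted P(positive valence)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

accuracy = (((X @ w + b) > 0).astype(int) == y).mean()
```

The same template applies to the arousal dimension by swapping the labels; decoding each affective dimension separately is what lets the scores complement a relevance judgment.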

This work will be presented at SIGIR’23 in Taipei, Taiwan. SIGIR is the flagship conference on Information Retrieval.

REFERENCE

Tuukka Ruotsalo, Kalle Mäkelä, Michiel Spapé and Luis A. Leiva. Affective Relevance: Inferring Emotional Responses via fNIRS Neuroimaging. Proc. SIGIR’23. https://doi.org/10.1145/3539618.3591946


Prof. Jacek Gwizdka in Luxembourg

This week we hosted Prof. Jacek Gwizdka from The University of Texas at Austin, USA, where he directs the Information eXperience Lab (IX Lab).

Prof. Gwizdka is one of the pioneers of Neuro-Information Science. He studies human-information interaction and retrieval and applies cognitive psychology and neuro-physiological methods to understand information search and improve search experience.

He gave an interesting seminar talk titled “Neuro-physiological evidence as a basis for understanding human-information interaction”, presenting an overview of several projects he has worked on that demonstrate the use of eye-tracking and EEG for inferring information relevance.

We had very productive meetings! We hope to materialize our research ideas in future academic papers.


How do we recall known faces?

Visual recognition requires inferring the similarity between a perceived object and a mental target. However, a measure of similarity is difficult to determine when it comes to complex stimuli such as faces. Indeed, people may notice someone “looks like” a familiar face, but find it hard to describe which features that comparison is based on. Previous work shows that the number of similar visual elements between a face pictogram and a memorized target correlates with the P300 amplitude in the visual evoked potential. Here, we redefine similarity as the distance inferred from a latent space learned using a state-of-the-art generative adversarial neural network (GAN). A rapid serial visual presentation experiment was conducted with oddball images generated at varying distances from the target to determine how P300 amplitude related to GAN-derived distances.

The results showed that distance-to-target was monotonically related to the P300, showing perceptual identification was associated with smooth, drifting image similarity. Furthermore, regression modeling indicated that while the P3a and P3b sub-components had distinct responses in location, time, and amplitude, they were similarly related to target distance. The work demonstrates that the P300 indexes the distance between perceived and target image in smooth, natural, and complex visual stimuli and shows that GANs present a novel modeling methodology for studying the relationships between stimuli, perception, and recognition.
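The core analysis idea can be sketched as follows, with simulated data rather than the study's recordings: generate oddball latent vectors at increasing distances from the target's latent vector, then test whether single-trial P300 amplitude varies monotonically with that distance via a linear fit. The latent dimensionality and the simulated amplitude model are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
latent_dim = 512   # typical GAN latent size (assumption)

target = rng.normal(size=latent_dim)

# Oddball latents placed at increasing distances from the target,
# each along a random unit direction in latent space.
steps = np.linspace(0.5, 5.0, 100)
directions = rng.normal(size=(100, latent_dim))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
oddballs = target + steps[:, None] * directions

distances = np.linalg.norm(oddballs - target, axis=1)

# Simulated single-trial P300: amplitude shrinks as the image
# drifts away from the target, plus measurement noise.
p300 = 10.0 - 1.5 * distances + rng.normal(0, 0.5, size=100)

slope, intercept = np.polyfit(distances, p300, 1)
r = np.corrcoef(distances, p300)[0, 1]
```

A reliably negative slope in such a fit is what a monotonic distance-to-P300 relationship would look like; the paper's regression modeling additionally separates the P3a and P3b sub-components.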

REFERENCE

de la Torre‐Ortiz, C.; Spapé, M.; Ruotsalo, T. (2023). The P3 indexes the distance between perceived and target image. Psychophysiology, e14225. https://doi.org/10.1111/psyp.14225


We hosted Prof. Aleksandra Kawala in Luxembourg

This week the Luxembourg team hosted our Polish partner, Prof. Aleksandra Kawala-Sterniuk from Opole University of Technology. We made good progress on the project and discussed lots of ideas to be developed in the upcoming months.

Looking forward to visiting Poland!


CHIST-ERA Seminar 2023

4–5 April 2023 – Bratislava, Slovakia

All researchers funded by CHIST-ERA were brought together in Bratislava, including BANANA project team members!

The event was fantastic, and we could share our experiences with the other projects funded under the CHIST-ERA IV call “Advanced Brain-Computer Interfaces for Novel Interactions (BCI)”:

GENESIS, BITSCOPE, ReHaB and BANANA!

We presented a poster and a talk, and I (Aleksandra Kawala-Sterniuk) also co-chaired, together with Hakim Si-Mohammed, a session regarding our call!


Contradicted by the Brain

We investigated inferring individual preferences and the contradiction of individual preferences with group preferences through direct measurement of the brain. We report an experiment where brain activity collected from 31 participants produced in response to viewing images is associated with their self-reported preferences. First, we show that brain responses present a graded response to preferences, and that brain responses alone can be used to train classifiers that reliably estimate preferences. Second, we show that brain responses reveal additional preference information that correlates with group preference, even when participants self-reported having no such preference.

Our analysis of brain responses carries significant implications for researchers in general, as it suggests an individual’s explicit preferences are not always aligned with the preferences inferred from their brain responses. These findings call into question the reliability of explicit and behavioral signals. They also imply that additional, multimodal sources of information may be necessary to infer reliable preference information.
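A toy illustration of the second finding, on simulated data rather than the study's: even when a participant explicitly reports being indifferent to every item, a preference score decoded from brain responses can still correlate with the group-level preference for those items. The score model and numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n_items = 60

# Group preference: the mean rating each item receives across participants.
group_pref = rng.normal(size=n_items)

# Decoded brain score: tracks the group preference plus individual noise
# (the assumption being that neural responses carry shared preference signal).
brain_score = 0.8 * group_pref + rng.normal(0, 0.6, size=n_items)

# Explicit self-report: this participant claims indifference on every item.
self_report = np.zeros(n_items)

r_brain = np.corrcoef(brain_score, group_pref)[0, 1]
explicit_signal = self_report.var()   # flat reports carry no signal at all
```

In this toy setting the explicit signal is uninformative while the brain-derived score correlates with the group, which is the dissociation the paper reports.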

REFERENCE

K. M. Davis, M. Spapé and T. Ruotsalo. Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing. IEEE Transactions on Affective Computing, https://doi.org/10.1109/TAFFC.2022.3225885


Gustav software

Temporal synchronization of behavioral and physiological signals collected through different devices (and sometimes through different computers) is a longstanding challenge in HCI, neuroscience, psychology, and related areas. Previous research has proposed to synchronize sensory signals using (1) dedicated hardware; (2) dedicated software; or (3) alignment algorithms. All these approaches are either vendor-locked, non-generalizable, or difficult to adopt in practice.

We propose a simple but highly efficient alternative: instrument the stimulus presentation software by injecting supervisory event-related timestamps, followed by a post-processing step over the recorded log files. Armed with this information, we introduce Gustav, our approach to orchestrate the recording of sensory signals across devices and computers. Gustav ensures that all signals coincide exactly with the duration of each experiment condition, with millisecond precision.

Figure: Gustav injects a supervisory timing signal that helps orchestrate the experiment conditions across devices and computers, from simple (a) to complex (c) setups.
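The post-processing idea can be sketched as follows (this is an illustrative toy, not Gustav's actual code; the log formats below are invented): the stimulus software logs supervisory timestamps marking where each condition starts and ends, and afterwards every device's recording is cut to exactly those intervals.

```python
import bisect

def parse_markers(lines):
    """Parse 'timestamp_ms label_start/label_end' lines into label -> (t0, t1)."""
    starts, intervals = {}, {}
    for line in lines:
        t, label = line.split()
        name, kind = label.rsplit("_", 1)
        if kind == "start":
            starts[name] = float(t)
        else:
            intervals[name] = (starts[name], float(t))
    return intervals

def cut_signal(samples, t0, t1):
    """Keep only the (timestamp, value) samples falling inside [t0, t1]."""
    times = [t for t, _ in samples]
    lo = bisect.bisect_left(times, t0)
    hi = bisect.bisect_right(times, t1)
    return samples[lo:hi]

# Supervisory markers written by the stimulus presentation software.
markers = parse_markers(["1000.0 cond1_start", "3000.0 cond1_end"])

# One device's recording: timestamped samples at a 20 Hz rate (toy values).
signal = [(900.0 + 50 * i, i) for i in range(50)]

t0, t1 = markers["cond1"]
segment = cut_signal(signal, t0, t1)   # exactly the condition's duration
```

Repeating the cut for every device and condition yields recordings that coincide with each condition's boundaries, which is the guarantee described above.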

Gustav is publicly available as open source software: https://gitlab.uni.lu/coin/gustav/

Reference

Kayhan Latifzadeh, Luis A. Leiva. Gustav: Cross-device Cross-computer Synchronization of Sensory Signals. In Adjunct Proc. UIST, 2022. https://dl.acm.org/doi/10.1145/3526114.3558723


BANANA presents new work at IEEE CVPR

The BANANA team presents a new research paper on editing images by steering GAN models directly with human brain activity observed via EEG. More information is available here: https://openaccess.thecvf.com/content/CVPR2022/html/Davis_Brain-Supervised_Image_Editing_CVPR_2022_paper.html