17/Mar/2021 16:15 - 17:00

Language of event: Italian

AI OpenLabs

by Prof. Luca Turchet and Prof. Nicola Conci

The presentation will cover the main teaching and research activities conducted in the lab, along with the equipment and demos of the technologies developed by the lab's team of researchers.

Laboratory description

The Multisensory Interactions educational laboratory focuses on the study of systems that interact with users through multiple sensory modalities, such as vision, audition, and haptics.
We engage undergraduate and graduate students, as well as partner companies, in applying advanced methodologies from the fields of human-computer interaction, artificial intelligence, and human perception to develop interactive systems that understand users' behavior and emotions and communicate information to them in an effective and satisfying manner.

We teach students to design, develop, and evaluate proof-of-concept prototypes of multisensory interactive systems. Students learn how to use sensors, actuators, microcontrollers, and real-time programming languages to detect users' gestures and transform them into sound, visual, and haptic stimuli, applying knowledge of human perceptual processes.
The laboratory is equipped with infrared cameras, virtual and augmented reality systems, microphones, a surround sound system, voice-based interfaces, and wearable devices, such as bracelets, gloves, and shoes, fitted with vibration motors. We also have observational tools for the annotation and evaluation of multisensory prototypes and experiences.
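As a minimal illustration of the kind of prototype described above, the Arduino-style C++ sketch below maps the reading of an analog gesture sensor onto the intensity of a vibration motor in real time. The wiring, pin assignments, and sensor choice are assumptions made for the example, not a description of the lab's actual setup.

// Hypothetical wiring: an analog gesture sensor (e.g., a flex sensor) on pin A0
// drives a vibration motor connected through a transistor on PWM pin 9.
const int SENSOR_PIN = A0;  // assumed analog input for the gesture sensor
const int MOTOR_PIN  = 9;   // assumed PWM output driving the vibration motor

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int reading  = analogRead(SENSOR_PIN);          // 0..1023 on classic AVR boards
  int strength = map(reading, 0, 1023, 0, 255);   // rescale to the 8-bit PWM range
  analogWrite(MOTOR_PIN, strength);               // stronger gesture -> stronger vibration
  delay(10);                                      // roughly 100 Hz update loop
}

In a classroom setting, the same sensor reading could just as easily be mapped to a synthesized sound or a visual parameter; the sketch only shows the simplest gesture-to-haptics path.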

The Laboratory collaborates with several companies in the sectors of Healthcare, Creative Industries, Virtual Reality, Internet of Things, Brain-Computer Interfaces, and Sport.