I Feel the Earth Move (Under My Feet)

Haptic Interaction for Telepresence and Information Delivery

Date & Time: July 4, 2019 - 12:00-13:00
Venue: Via Sommarive 5 - Polo Ferrari 1 (Povo, TN) - Room Garda

Speaker

  • Prof. Jeremy Cooperstock - McGill University, Department of Electrical and Computer Engineering

Abstract

While telepresence technologies have largely focused on video and audio, our lab's research emphasizes the added value that haptics, our sense of touch, brings to the experience. This holds not only for hand contact, but equally for the wealth of information we gain about the world through our feet as we walk about our everyday environments. We have focused extensively on understanding the properties of haptic perception, and we use this knowledge to develop technologies that improve our ability to render information haptically to users, in particular for tasks in which visual attention is occupied. This talk illustrates several examples, including immersive telepresence exploration of a city, simulation of the multimodal experience of walking on ice and snow, and our efforts to lower the noise level of the hospital operating room (OR) and intensive care unit (ICU) by replacing audio alarms with haptic rendering of patients' vital signs.

About the Speaker

Jeremy Cooperstock (Ph.D., University of Toronto, 1996) is a professor in the Department of Electrical and Computer Engineering, a member of the Centre for Intelligent Machines, and a founding member of the Centre for Interdisciplinary Research in Music Media and Technology at McGill University. He directs the Shared Reality Lab, which focuses on computer mediation to facilitate high-fidelity human communication and the synthesis of perceptually engaging, multimodal, immersive environments.

He led the development of the Intelligent Classroom; the world's first Internet streaming demonstrations of Dolby Digital 5.1 and of multiple simultaneous streams of uncompressed high-definition video; a high-fidelity orchestra rehearsal simulator; a simulation environment that renders graphic, audio, and vibrotactile effects in response to footsteps; and a mobile game treatment for amblyopia. His work on the Ultra-Videoconferencing system was recognized with an award for Most Innovative Use of New Technology from ACM/IEEE Supercomputing and a Distinction Award from the Audio Engineering Society. The research he supervised on the Autour project earned the Hochhausen Research Award from the Canadian National Institute for the Blind and an Impact Award from the Canadian Internet Registration Authority, and his Real-Time Emergency Response project won the Gold Prize (brainstorm round) of the Mozilla Ignite Challenge.

Cooperstock has worked with IBM at its Haifa Research Center in Israel and at the T.J. Watson Research Center in Yorktown Heights, New York, and with the Sony Computer Science Laboratory in Tokyo, Japan. He was also a visiting professor at Bang & Olufsen, Denmark, where he conducted research on telepresence technologies as part of the World Opera Project. He led the Enabling Technologies theme of the Network of Centres of Excellence on Graphics, Animation, and New Media (GRAND) and is an associate editor of the Journal of the AES.

Contact: Luca Turchet, luca.turchet [at] unitn.it