Misha Sra
Massachusetts Institute of Technology
sra@media.mit.edu
Bio
Misha Sra received her PhD (2018) from the MIT Media Lab, where she worked with Prof. Pattie Maes and is currently a Research Affiliate. Her work asks the question: what if the whole body were an interface? Misha’s systems fall into two main categories: (1) systems that give users access to information by means of their proprioceptive, kinesthetic, and tactile senses, and (2) systems that increase immersion in virtual and mixed reality through implicit body-centric interfaces. Misha’s work is published at top-tier HCI and VR conferences. She has received an ACM CHI Best Paper Award, an ACM VRST Best Paper Award, a Silver Award in the annual Edison Awards global competition, and other honorable mentions. Her work has also captured the interest of media outlets such as MIT Technology Review, UploadVR, Discovery Channel, and Business Insider. From 2014 to 2015, she was a Robert Wood Johnson Foundation wellbeing research fellow at the MIT Media Lab.
Perceptual Engineering
How can new technologies allow users to interact with digital content in the most direct and personally meaningful way? This question has driven interaction design for decades. If we think about the evolution of computing, interaction has gone from punch cards to mouse and keyboard to touch and voice. Similarly, devices have become smaller and closer to the user. More recently, wearables and head-mounted devices have brought computing into constant physical contact with the user’s body. With every transition, interaction has become more direct and the things people can do have become more personal. The main question that drives my research is: how can interactions and devices become even more direct and personal?

The goal of my work is to create systems that use the entire body for input and output and automatically adapt to each user’s unique state and context. I call my concept “perceptual engineering,” i.e., immersive systems that alter the user’s perception and influence or manipulate it in subtle ways. For example, they modify a user’s sense of space, place, balance, and orientation, manipulate their visual attention, and more, all without the user’s explicit awareness, in order to guide the interactive experience in an effortless way.

My research explores the use of cognitive illusions to manage a user’s attention, breathing, and actions for direct interaction; machine learning for automatic virtual-world generation; embodiment for remote collaboration; tangible interactions for play augmentation; and galvanic vestibular stimulation for reducing nausea and guiding users in immersive experiences. My perceptual engineering approach has been shown to (1) increase the user’s sense of presence in VR/MR, (2) provide a novel eyes-, ears-, and hands-free way to communicate with the user through proprioception and other senses, and (3) serve as a platform for questioning the boundaries of our sense of agency and trust.