Online Conference Details

All paper sessions will be hosted through Zoom. Registered attendees will receive a Zoom webinar link at their registered email address in the days leading up to the conference. All Zoom presentations will be recorded and hosted on the VRST 2020 Discord server immediately after the live sessions for asynchronous viewing.
All posters and demos will be presented through Discord, which supports both synchronous live poster sessions via voice/video channels and asynchronous viewing via text channels and videos. Discord will also be used as a communication channel.
You can join the VRST 2020 Discord server here:

https://discord.gg/vrst2020

Attendees and presenters who do not already have a Discord account are encouraged to sign up in advance of the conference; clicking the server link above will prompt you to create one. Please provide your Discord user ID during registration so we can add you to the server when it opens.
Guide to Getting Started with Discord

The VRST 2020 Discord server is now open. Registered attendees will be automatically added. If you have registered but are unable to access the Discord server, please email chairs2020@vrst.acm.org.

VRST 2020 Proceedings now available in the ACM Digital Library: https://dl.acm.org/doi/proceedings/10.1145/3385956

Keynote Speakers

  • Doug Bowman
  • Virginia Polytechnic Institute and State University, USA

Preparing for an Always-On Augmented Reality Future

Future AR glasses will support all-day use, making them applicable to everyday information access and interaction tasks. AR research has helped get us to this point, through technical advances such as inside-out vision-based tracking. But technical achievements alone are insufficient to ensure that always-on AR systems will be productive, usable, useful, and satisfying. We must also design effective methods for interacting with and managing AR content, and we must understand the effects of always-on AR on both individuals and societies. In this talk, I will present a vision of future everyday AR uses and discuss recent user experience research aimed at enabling this vision. Our research objectives include the design of interaction techniques that make information access through AR more convenient and usable than today's smartphones and smartwatches, while at the same time not distracting users from what's going on in the real world around them.
  • Mary Whitton
  • University of North Carolina at Chapel Hill, USA

Seeing and Touching: Hands in VR

Since the earliest days of virtual reality, two of the biggest challenges have been enabling the user to interact naturally with elements in the virtual world and enabling the user to feel and touch objects in that world. In the real world, most people use their hands to initiate and control interactions and depend on the physical properties of objects to trigger the touching sensation. In an immersive virtual environment, we have only sensor data and possibly passive haptics with which to simulate seeing our hands and touching solid objects. In this talk I’ll look back at research involving hands, haptics, and visual dominance as it led up to the question of whether what you see is what you feel—redirected touching.

Program