
XRDC Q&A: Owlchemy Labs' Andrew Eiche and Devin Reimer explore VR interactions

At XRDC 2018, Owlchemy Labs Chief Executive Owl Devin Reimer and Chief Technology Owl Andrew Eiche will be taking the stage to share the freshest ideas in interaction design for VR. Since Owlchemy Labs has been at the forefront of room-scale VR experiences, we wanted to pick their brains on your behalf about what VR developers should be thinking about when creating Holodeck-like experiences.

Thankfully, Reimer and Eiche were kind enough to answer our questions, which you can read below!

Attend XRDC 2018 to learn about AR/VR/MR for games, entertainment, healthcare, enterprise training, education, automotive, and innovative use cases across industries.

Tell us about yourself and your work in VR/AR/MR.

Devin: I’m the Chief Executive Owl at Owlchemy Labs, a VR game studio based in Austin, Texas. We’re the developers behind multi-platform titles ‘Job Simulator’, ‘Rick and Morty: Virtual Rick-ality’, and the upcoming ‘Vacation Simulator’. With a background in Flash and Unity, I’ve worked on numerous 2D and 3D game projects, both before Owlchemy and while the studio was still making non-VR games. Now, I’m all in on VR. When we started Job Simulator, we decided to make games exclusively for VR, and we haven’t looked back. VR is the future!

Andrew: I’m the Chief Technology Owl and Cable Wrangler at Owlchemy Labs. When I joined, I was the production lead on ‘Rick and Morty: Virtual Rick-ality’, a VR title we made with Adult Swim Games (recently nominated for an Emmy Award!). Throughout my career, I’ve developed games for indies, board game companies, commercial clients, and government agencies. Fun fact: I made an app for the IRS. I also used to wear a suit and tie to work. Things are better now. It’s pretty wild to help pave the way for the future of VR.

Without spoiling it too much, tell us what you’ll be talking about at XRDC.

Devin + Andrew: We’re going to be discussing our philosophy on various VR design concepts and the best practices we’ve identified since the launch of consumer VR. Before the launch of consumer head- and hand-tracked VR (i.e. Vive, Oculus Touch, PSVR), we gave talks about the fundamentals of what makes for good interaction design in VR: concepts like “tomato presence”, tracking occlusion strategies, objects breaking out of your hand, et cetera.

It’s been two years since we’ve discussed these base-level interactions, and a lot has changed. We figure now is a good time to reopen the topic and share what we’ve learned about interaction in VR, what has changed since the dawn of consumer VR, and how we’ve built on previously established paradigms and charted new territory to take interactions to the next level.

What excites you most about AR/VR/MR?

Devin: The idea of exploring uncharted territory has always interested me in my work. At Owlchemy, we’re really committed to constantly pushing the envelope and exploring what’s possible with this new medium. Nothing is set in stone, and the time is now to try new things. There’s a pretty potent awareness among the VR community that we’re in a special moment in time— a time to experiment and pioneer things for the future.

Andrew: There is still so much to learn in VR, and we’re discovering new things every day. We’re still building the “design language” of VR. Paradigms are constantly evolving. Developers are learning more about building great experiences, and users are becoming more advanced in the ways they interact with our worlds. It’s electric to be part of a medium that is trying to learn how to walk.

Who would you like to meet at XRDC?

Devin + Andrew: The great thing about XRDC is that it brings together the huge, diverse range of people who are working in VR right now. When we started building Job Simulator, people thought we were crazy to be betting so much on a new technology. Now, its value is clear, whether you’re working in games, narrative content, education, enterprise, or beyond. We’re so early in VR that it’s essential we share our knowledge to elevate the industry as a whole, and there’s a lot we can learn across disciplines. We’re looking forward to chatting with developers from every vertical in VR/AR/XR about their projects.

What kind of interactions or objects have you worked with in VR that didn’t quite make the cut in your games, and how do you think they could be properly integrated?

Andrew: To anyone curious about the depth of our experimentation at Owlchemy, I would recommend checking out the GDC/XRDC postmortems for both ‘Job Simulator’ and ‘Rick and Morty: Virtual Rick-ality’. We’re constantly revisiting old ideas with new knowledge. I’m particularly interested in how we evolve our character reactions to be more realistic— whether a player is throwing objects, swatting, or generally being a nuisance to them.

Currently, character interaction is an affordance nightmare: if a character reacts to one thing, they have to react to everything, in almost all situations. We did great work on this in ‘Virtual Rick-ality’ with characters that react and talk to you. In ‘Vacation Simulator’, we’re also really proud of how intuitive it feels to interact with Bots by waving at them. Even so, there’s a lot of room to grow! There are a lot of interactions we cut, and others we kept that we still think we could improve in the future.

I think that in order to have a comprehensive character interaction system in VR, we will need to see further advances in AI (e.g. machine learning) to allow for procedural reaction generation. Either that, or an incredible number of development hours. We’ll see which comes first. After all, we do have a reputation for picking the deepest technical rabbit holes: 30,000+ first-order combinations in Virtual Rick-ality’s Combinator, for example, or 850+ hours on Job Simulator’s fluid tech.

Devin: The list of things we create during development that don’t work is far greater than the list of things that do. To us, this is a success: we consider failure to be a win state. It means we are truly exploring what’s possible in the space, and only keeping the best. This mindset is why we find it so important to share our knowledge, and specifically the things that didn’t work for us.

Something Andrew mentioned is that we often revisit old ideas with new knowledge. For example, we really wanted to make sandwiches for the Kitchen Job in ‘Job Simulator’. This was no small feat in VR, and we encountered huge technical and design affordance challenges. What we settled on was a spike mounted on the counter that you could stack sandwiches on.

Fast forward to our upcoming game, ‘Vacation Simulator’. We revisited the infamous sandwich problem with the burger stand in our Beach demo. After writing (and re-writing) new systems and tech, you can now freely build a burger in your hands, no special machine needed!

Even if we’ve ‘solved’ a problem before, that doesn’t mean it’s off limits. There’s always room to experiment and try something new.

XRDC is the premier conference for augmented reality, virtual reality, and mixed reality innovation, produced by the organizers of the Game Developers Conference. Subscribe to regular XRDC updates via email, Twitter, and Facebook.
