(I wrote this originally for UploadVR.)
Today’s VR systems are both fantastic and restrictive: they blow you away, but it’s clear how far they have to go. The HTC Vive is arguably the best out there, but having to buy a souped-up laptop just to run it, paying full price for brief games that feel more like demos, and trailing a huge cable off your head and fumbling to mount trackers on your ceiling…it’s not ideal. But it’s still incredible enough to give a taste of where it’s headed.
Here’s my best guess of what the future high-end VR setup looks like. I’m an early-stage VC focused on virtual and augmented reality, so I pieced this together based on the forward-thinking pitches and demos I’ve been lucky enough to see through my work, plus a lifetime of burning through sci-fi and video games. Check out the bottom of this post for a list of VR inspiration.
A side note: AR will be much bigger than VR, in both the diversity of use cases and market size (analysts predict $30B for VR versus $90B for AR by 2020), but I still believe that most homes will have a dedicated VR space for total immersion.
BODY MOVEMENT
Let’s start from the ground up. Forget the room scale debate: the VR setup of the future moves with you. Maybe it uses an omnidirectional treadmill that adjusts speed and incline based on viewer inputs. To be truly immersive, it needs to be around 8 feet by 8 feet, given that the average sprint stride length for men — the longest possible stride variant — is 93 inches. That gives users more than enough space to walk, run, and even sprint while in VR. Or maybe a section of the floor itself serves as the treadmill, raised up as a platform that controls pitch, yaw, roll, and speed.
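To make that concrete, here’s a minimal sketch in Python of the control tick such a treadmill might run; every name and number here is hypothetical, not any shipping product’s logic. The belt cancels the user’s displacement to keep them centered, and the platform’s incline follows the virtual terrain within safe limits.

    def treadmill_command(user_vx, user_vy, terrain_slope_deg, max_incline_deg=15.0):
        """One control tick for a hypothetical omnidirectional treadmill.

        user_vx, user_vy:  the user's tracked horizontal velocity (m/s)
        terrain_slope_deg: slope of the virtual terrain underfoot
        """
        # Drive the belt opposite the user's motion to keep them centered.
        belt_vx, belt_vy = -user_vx, -user_vy
        # Match the virtual slope, clamped to what the hardware (and ankles) allow.
        incline = max(-max_incline_deg, min(max_incline_deg, terrain_slope_deg))
        return belt_vx, belt_vy, incline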
Of course, not everyone wants to — or can — be on their feet for long periods, and plenty of immersive entertainment, like watching movies, is sedentary. VR experiences will support a seated and reclining mode when appropriate and shouldn’t be more complicated than pulling up a standard chair. Movement in these modes will likely employ similar mechanics to those we’re beginning to see today, like teleportation via gesture or gaze.
TACTILE FEEDBACK
Next up is the bodysuit. To mimic the tactile feedback that you experience in real life, you’ll need sensors and haptics all over your body or at least in significant areas, like the face, hands, and feet. Focused, acute pulses simulate sharp points; broader, more distributed ones can simulate sensations like dipping into water. For those who want to push immersion further, optional climate controls mirror environmental conditions (within a safe temperature range).
The first hardware generation attempting to solve the body feedback problem will likely use full bodysuits with haptic responses aligned to the VR experience. The suit’s gloves will simulate gripping objects by restricting finger movement: wrap your hands around a hard plastic cup in VR, and your gloves will freeze at the point where you can’t squeeze any further. Squishier objects will have more give. It’s possible that putting on a full suit will be too much effort for most people, and they’ll find that hand and facial coverage is enough to give them the immersion level they want. We have more nerve receptors in our fingers than anywhere else in the body (besides our feet and lips), so simulating tactile input in the hands may be enough to make the mind suspend its disbelief while in VR.
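Here’s one way that grip logic might be expressed, a hedged sketch rather than any real glove’s algorithm: clamp each finger’s closure at the point of contact, with softer objects allowing proportionally more give.

    def clamp_finger_closure(requested, contact_closure, stiffness):
        """Limit how far a glove lets a finger close around a virtual object.

        requested:       how far the user is trying to close (0 = open, 1 = fist)
        contact_closure: closure at which the finger first touches the object
        stiffness:       0 = freely squishable, 1 = perfectly rigid
        """
        if requested <= contact_closure:
            return requested  # not yet touching the object
        penetration = requested - contact_closure
        give = penetration * (1.0 - stiffness)  # rigid objects give nothing back
        return contact_closure + give

    # A hard plastic cup (stiffness 0.95) freezes the fingers near first contact;
    # a foam ball (stiffness 0.3) lets them sink most of the way in.
    print(clamp_finger_closure(0.9, 0.5, 0.95))  # ~0.52
    print(clamp_finger_closure(0.9, 0.5, 0.3))   # ~0.78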
CONTROL OPTIONS
Our future VR setups won’t need controllers. Steve Jobs once said of using the human hand for interaction, “God gave us ten styluses…let’s not invent another.” Imagine the same touch and motion-based actions we’re used to on mobile phones, only happening in the air with our hands while we’re in VR. Need to hold something as part of a VR experience? Your gloves mimic the width and feel of a gun in a first-person shooter, the handle of a scalpel in a surgery simulator, or the stitching on a football…all while you’re empty-handed. Voice UI can supplement gestures with more detailed natural language commands.
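As a toy example, a pinch gesture could be recognized straight from tracked fingertip positions, assuming a hand tracker that reports 3D coordinates in meters (the threshold below is invented):

    import math

    def is_pinch(thumb_tip, index_tip, threshold_m=0.02):
        """Treat thumb and index fingertips within ~2 cm of each other as a pinch."""
        return math.dist(thumb_tip, index_tip) < threshold_m

    print(is_pinch((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # True: fingertips 1 cm apart
    print(is_pinch((0.0, 0.0, 0.0), (0.10, 0.0, 0.0)))  # False: hand is open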
Weight is tougher to simulate in VR. The suit could stiffen and slow a user’s movement corresponding to the relative weight of an object: e.g., stooping to pick up a piece of furniture would force a slow standup, versus an unaffected standup for a feather. Haptic feedback, movement speed in VR, and other techniques could add to the weight effect. For an in-depth discussion of the weight problem in VR today, see “Simulating Weight in VR.”
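One simple mapping for that slowdown, with numbers invented purely for illustration, scales movement speed by how close the lifted mass sits to the user’s strength limit:

    def lift_speed_multiplier(mass_kg, strength_limit_kg=50.0, min_multiplier=0.1):
        """Scale the avatar's stand-up speed by the load being lifted.

        A near-weightless object leaves movement unaffected; a load near the
        strength limit forces a slow, strained stand-up.
        """
        load = min(mass_kg / strength_limit_kg, 1.0)
        return max(min_multiplier, 1.0 - load)

    print(lift_speed_multiplier(0.01))  # feather: ~1.0, no slowdown
    print(lift_speed_multiplier(45.0))  # couch:   0.1, crawlingly slow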
Eventually we’ll have the option to avoid climbing into suits at all. Neural signals can make users see and feel everything as real. Think plugging into the Matrix, but with the awareness and intent to do so. After all, we experience objective reality today filtered through our own senses; no two people see the same thing in exactly the same way, and we interpret our incoming sensory inputs as real. But despite the option to “jack in” in the distant future, plenty of people will still opt to use the treadmill and haptics combo for the exercise benefits.
VISUALS
The visual input is the most important piece of the VR setup. Right now we’re stuck thinking in terms of head-mounted displays (HMDs), like the Vive or the Oculus. But frame rate lag and a narrow field of view make people nauseated, and exercising in a headset is pretty unpleasant: picture a heavy piece of hardware bouncing on your face while you sweat into its lenses.
Future solutions will get rid of clunky wired headsets and move on to glasses that can project a high-definition image onto the eye, a la Magic Leap, and eventually contact lenses that contain tiny screens. We’d wear these contacts all the time, switching between AR mode (high transparency so you can see the world underneath digital overlays) and VR mode (low transparency so you can achieve full immersion). VR mode will likely happen at home or in private spaces, like offices for virtual meetings or the couch for gaming. AR mode will be everywhere else: calling up a heads-up display of Google Maps on the street, stopping to catch a Pokemon in a field, or scanning the person across from you in a coffee meeting to cross-reference their LinkedIn profile. Eye tracking will enable more realistic interactions with both NPCs and human avatars, along with detailed analytics, heatmapping, and privacy concerns.
Let’s move to the top of the head. Electroencephalograms (EEGs) can read user brainwaves, converting thought to action. Users might look at an object in VR and think about how they’d like to interact with it. Those interactions can be mapped to brainwaves and translated into action with almost no latency. Whether it’s an EEG housed in the full-body suit or a free-standing one atop the head, these devices will empower users to be hands-free in VR, especially when brain responses combine with eye tracking. VR developers are likely to support brain-computer interfaces in their experiences to add a wow factor: imagine actually having the force in a game like Mass Effect.
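A real pipeline would decode intent with a model trained on EEG features; the stub below, with thresholds and labels entirely invented, just shows the shape of the loop: pair a decoded mental command with the object the eyes are fixed on.

    def decode_intent(band_power):
        """Toy stand-in for a trained EEG classifier. Real systems learn this
        mapping from data; hard thresholds are for illustration only."""
        if band_power["beta"] > 0.7:
            return "push"
        if band_power["alpha"] > 0.7:
            return "rest"
        return "select"

    def eeg_to_action(band_power, gaze_target):
        """Combine the decoded command with eye tracking to pick a VR action."""
        intent = decode_intent(band_power)
        if intent == "rest" or gaze_target is None:
            return None
        return (intent, gaze_target)

    print(eeg_to_action({"alpha": 0.2, "beta": 0.9}, "crate_17"))  # ('push', 'crate_17')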
SENSORY ADDITIONS
Adding scent is another way to boost VR realism. Devices either on the user’s bodysuit or placed around the room will release smells tied to the VR experience. Companies like International Flavors and Fragrances have already mapped synthetic base smells that, when combined in different ratios, can create almost any scent. Startups will bring platforms to market that let content creators add a scent layer to their work, released by a hardware peripheral at key moments. Running through Hyrule Field in Legend of Zelda VR? You’d smell the grass. Users who want even more realism could add fans to their VR lair that blow air to simulate wind and falling.
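Here’s a sketch of what that scent layer might look like to a content creator, assuming a hypothetical diffuser that emits each synthetic base at a fixed rate:

    def mix_scent(base_ratios, duration_s=2.0):
        """Convert a target scent, expressed as ratios of synthetic base smells,
        into per-base release times for a hypothetical diffuser."""
        total = sum(base_ratios.values())
        return {base: duration_s * ratio / total
                for base, ratio in base_ratios.items()}

    # A "grassy field" cue: mostly cut grass, a little earth and ozone.
    print(mix_scent({"cut_grass": 0.7, "earth": 0.2, "ozone": 0.1}))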
Audio is yet another important sensory input to get right in VR. Whether the sounds come from headphones or speakers around the room, developers will build audio into their VR experiences so that the sound changes with the user’s movement and proximity.
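A minimal version of that proximity effect is distance-based attenuation, here using the inverse-distance rolloff model found in 3D audio engines like OpenAL:

    import math

    def attenuated_gain(listener_pos, source_pos, ref_distance=1.0, rolloff=1.0):
        """Inverse-distance rolloff: full volume at the reference distance,
        fading smoothly as the listener moves away."""
        d = max(math.dist(listener_pos, source_pos), ref_distance)
        return ref_distance / (ref_distance + rolloff * (d - ref_distance))

    # A waterfall 9 m away plays at ~11% volume; walk closer and it swells.
    print(attenuated_gain((0, 0, 0), (9, 0, 0)))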
Devices running VR apps will be more powerful, smaller, and cheaper than our options today, communicating wirelessly with the viewing apparatus. Maybe it’s a dedicated VR computer to start, but more likely it becomes mobile hardware, given Moore’s Law and the current trend toward everything mobile. Being able to pick up your VR hardware and take it with you is a big benefit.
In the VR room, whether mounted in the periphery or on a user’s bodysuit, you’d find super high-resolution sensors to track a user’s body in three dimensions and scanners to detect and render body positions and facial movements. Imagine a board meeting in VR where the people around the table can see each other’s real-life facial reactions and head leans.
Today, we’re seeing the very beginning of what VR technology will look like. I see a lot of companies and founders trying to take experiences we have today and port them over to VR without much change — like putting a standard browser window into a 3D environment, using the same navigation — but the innovative ones are thinking in entirely new ways about what’s possible. Now’s an amazing time to be alive, and I can’t wait until we’re all hanging out in the metaverse.
Here’s a list by technology type that includes both companies working on these problems today and some sci-fi inspiration for them.
This isn’t meant to be an exhaustive list, but if I missed something major, please tell me and I’ll add it. Also, please reach out if you’re working on anything cool in this space at sarah(at)accomplice(dot)co.
Hand and finger tracking, gesture interfaces, and grip simulation:
Leap Motion (we’re an investor): https://www.leapmotion.com
Manus VR: https://manus-vr.com
Neurodigital: https://www.neurodigital.es
Gestigon: http://www.gestigon.com
Handpose: https://www.microsoft.com/en-us/research/project/fully-articulated-hand-tracking/
Dexta Robotics: http://www.dextarobotics.com
Minority Report (film)
AR and VR viewers:
HTC Vive (VR): https://www.htcvive.com/us/
Oculus Rift (VR): https://www.oculus.com/
Google Cardboard and Daydream (VR): https://vr.google.com/
Samsung Gear (VR): http://www.samsung.com/global/galaxy/gear-vr/
PlayStation VR (VR): https://www.playstation.com/en-us/explore/playstation-vr/
Magic Leap (AR): https://www.magicleap.com/#/home
Microsoft HoloLens (AR): https://www.microsoft.com/microsoft-hololens/en-us
Meta (AR): https://www.metavision.com
Snow Crash (novel), by Neal Stephenson
Omnidirectional treadmills:
Virtuix Omni: http://www.virtuix.com
Cyberith Virtualizer: http://cyberith.com/product/
Infinadeck: http://infinadeck.com
Ready Player One (novel), by Ernest Cline
Haptic feedback bodysuits:
Teslasuit: http://teslasuit.io
Nullspace VR (hand, arm, and chest coverage): http://nullspacevr.com/
The Three-Body Problem (novel), by Cixin Liu
Brain-computer interfaces:
Neurable (our program Boston Syndicates is an investor): http://neurable.com
Mindmaze: http://www.mindmaze.ch
Emotiv: https://www.emotiv.com/
Accelerando (novel), by Charles Stross
Neural plugins:
The Matrix (film)
Sword Art Online (TV show)
Neuromancer (novel), by William Gibson
Total Recall (film)
Avatar (film)
3D tracking, capture, and/or rendering:
Paracosm: https://paracosm.io
8i: http://8i.com/
PrioVR: http://www.priovr.com/
Uncorporeal: http://www.uncorporeal.com/
Otoy: https://home.otoy.com/
Matterport: https://matterport.com/
Snapchat (we’re an investor through an acquisition): https://www.snapchat.com
Star Trek (TV show)
Eye tracking:
Fove: http://www.getfove.com
Eyefluence: http://eyefluence.com/
Tobii: https://tobiigaming.com/
SMI: http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/eye-tracking-hmd-upgrade.html
Blade Runner (film), based on Do Androids Dream of Electric Sheep? (novel), by Philip K. Dick
VR audio:
RealSpace: http://realspace3daudio.com
VisiSonics: http://visisonics.com
RedPill VR: http://www.redpillvr.com
Scent creation:
Cyrano/ONotes: http://www.onotes.com/
FEELREAL: http://www.feelreal.com/