The “just tell me when it’s ready” approach won’t work much longer.
Imagine if you had taken that approach to email, social media or the mobile web. Late majority businesses struggle in the crowded and unforgiving land of late majority customers. If you think that you should start building WordPress sites for a living… I’m afraid you’re too late.
As I’m writing this, the first truly standalone VR headset, Oculus Go, was announced to the world less than 10 days ago. This year, 2017, 90 million people around the world are experiencing some form of virtual, augmented or mixed reality. What may be even more relevant to you is that research firm Tractica estimates a 60% CAGR for enterprise spending on VR hardware and content from 2016 to 2021, growing from $592.3 million to $9.2 billion.
This wave that has been called The 4th Transformation is not about replacing reality — it’s about replacing our screens and enhancing the world around us. It’s about imagining a world that is better than the one we have today. One where we’re more connected, healthier and happier.
This is the problem immersive tech truly solves:
This post is for my fellow strategists, in enterprises and agencies alike, who are wondering about the relevance of virtual, augmented and mixed reality to their business, where the audience is today and how soon they should start planning for the future. We’ll also talk about how designing and building products for VR/AR/MR will be different than the 2D software that your business runs on today and what you can do to prepare for a new paradigm.
Please forgive me if I oscillate between virtual reality, augmented reality and mixed reality. While it’s true that mixed reality will ultimately win the day in most Enterprise settings, the public is far more familiar with the terms VR and AR, as you can see here on Google Trends.
For our conversation here, we’ll focus on what I like to think of as “Productive Reality” – where physical and digital worlds converge to create our most productive environments for getting into a state of flow. As much as we heart gaming, entertainment and consumer engagement, we’re mostly focused on virtual work in mixed reality.
Immersive tech is not a game. In fact, games are not a game. The Pokémon Gos and Star Trek: Bridge Crews of the world are not just a form of entertainment; they’re a frontier for innovation.
With all the hype about immersive tech, I shouldn’t have to show you this… but just in case I do, here it is…
Plain as day. It’s not an exact science – markets never are. Even if this is only close to true, we should all be paying attention. I’m not here to convince you to go out and buy a fleet of new headsets, but the evidence leads me to believe we should all start experimenting with enterprise apps that could be ready very soon.
Forget everything you’ve heard about headsets not being adopted by consumers — we now know that the Enterprise is going first. According to Tech Pro Research, 67 percent of businesses that responded to their study are now considering using AR in the future, while 47 percent are considering VR. Tractica projects enterprise spending on VR/AR will be roughly 35 percent greater than consumer spending on VR/AR entertainment by 2020, not including hardware-related revenues.
According to Business Insider, the largest industry use cases for AR and VR in 2017 will be related to product showcasing in the retail segment. The retail industry will invest $422 million in AR and VR. Retailers like Ikea and Lowes are already using mobile AR to show users what products will look like in their homes or on themselves.
Manufacturing, which includes discrete and process manufacturing, is seeing significant growth in 2017. The manufacturing segment is projected to invest $309 million in AR and VR during the year. AR can assist in the maintenance of equipment — for instance, viewing a machine’s status simply by looking at it via an AR display can help service technicians visualize and identify problems ahead of the job.
Remember the early days, before everyone had smartphones and tablets and we saw large enterprise hardware rollouts? You may see those again, but with Augmented Reality and Mixed Reality headsets.
2. Your consumer customers will have adopted headsets by around 2020.
Image: Unimersiv, a platform for immersive educational content
Sometimes I think we forget that 2020 is less than 3 years away. Like the switch from desktop to smartphone, the switch to headset will be gradual. Today mobile AR is introducing users to the concept, but ultimately we’ll want to switch to headsets to enhance the experience. No surprise, your kids are probably already more familiar with AR and VR than you are. In fact, TechCrunch tells us:
“Our new base case is that mobile AR could become the primary driver of a $108 billion VR/AR market by 2021 (underperform $94 billion, outperform $122 billion), with AR taking the lion’s share of $83 billion and VR $25 billion.”
While Oculus Chief Scientist, Michael Abrash, has said that headsets aren’t likely to replace smartphones until 2022, we’ll get used to having them around our homes very quickly.
3. Sorry… you’re going to have to re-think everything… again.
Yes, not long from now you and everyone else will come into work and put on a headset instead of sitting in front of a computer. In fact, you might put on a headset from anywhere in the world and become instantly connected with your job, colleagues and clients. Our work will no longer be confined to a screen; instead it will become part of our physical space. For this world to exist, all of the digital tools and content we consume will need to be revisited for the new medium. Productivity tools, marketing websites and other applications could have a new life and new meaning in our more productive reality.
If you still can’t find it in you to get excited about your new Productive Reality (come on, we’ll make it a thing) then, I’m sorry, but I have to make you watch a TED Talk.
How immersive products are becoming more than entertainment.
As a UX’er, creative and reluctant marketer, I’m wired to start with the humans. We need to understand the tasks it has always been in our human nature to do that are only now possible with mixed reality. Not to be too cliché, but we need to look at the jobs-to-be-done. Holla, #JTBD fans!
There are a few types of core jobs or customer benefits that are emerging through immersive tech, especially mixed reality.
Mixed Reality Job #1: see it without being there.
Yes, that hologram is a real dude, and he can see everything happening around him. S.O.A.B.!
Today we all have limits on where we can be and what we can do. VR/AR/MR can minimize those limits by letting us experience things happening in real time outside of our physical location, like events, meetings and other off-site activities.
Mixed Reality Job #2: demonstrate without being there.
With VR / AR / MR you can see exactly what someone else is seeing. This means you can demonstrate with physical objects and collaborate in real time. Soon there will be no more awkward digital whiteboards. No more wishing that someone could be in the room to look at something with you — you can just show it.
Mixed Reality Job #3: See physical and digital data at once.
With AR / MR, you can visualize data and designs in the real world. If you’re leveraging the data you or your customers have access to, you can apply it in more useful and efficient ways than you can today.
Eventually, your headset will house everything. No more laptops. No more phones. Just one headset — or better yet, a pair of contact lenses. Simply not having to pull a phone out of your pocket will be a major enhancement in usability.
There are plenty of things we can imagine doing with immersive tech in the enterprise, but we’re still a few years off from account managers having holographic client meetings.
Here are some of the use cases for things happening in the virtual space right now — things that Enterprises will likely be adopting over the next 3 years. You’ll notice that all of them are manifestations of the 4 major themes above.
While barely any of these applications are available off-the-shelf, these headsets are developer-ready, so agencies and enterprises can start pushing the limits of innovation like we’ve all been waiting for.
Collaborate from anywhere and all see the same thing
Annotate physical designs
Overlay digital designs on physical objects
Manipulate designs in real time
This completely changes remote working, distributed teams and even workforce shortages. Starting now, you can actually hire anyone from anywhere in the world, and it’s not much different than having them work in your office.
I know I mentioned this video already, but it’s really a fantastic example of real-time collaboration in design.
I get asked a lot about how VR will change the day-to-day of the knowledge worker. If I work in spreadsheets all day, will my life really change much?
Meta is changing the game here. You’ve already seen how our daily work environment can change with spatial awareness and screenless content. There’s less hardware than ever.
There are still many 2D experiences that can translate nicely into 3D. Meta, as well as Facebook’s new Oculus Dash UI, allows you to browse the 2D web in a 3D environment. The headset allows you to become more mobile and also plays nicely with other screens.
And yes, you can build for Meta right now! Today it’s still tethered to a PC… but we all know they’re working on that. MagicLeap seems to be making tremendous advances in our mixed reality workspace as well.
For my part, I want to take my headset to southern Italy and work from there all summer.
For Distributed Teams
Mark Zuckerberg introducing advancements in social VR at Oculus Connect 4, Oct 2017
You’ve already seen several examples of enterprises using mixed reality to collaborate remotely. However, I haven’t mentioned Facebook Spaces, a virtual meeting environment available right now inside of Oculus headsets. Spaces is not only the beginning of social VR, it’s a new way to meet. Is it really better than a video chat or a hologram? The jury is still out. So far, most of what we’ve seen from Facebook is in a completely virtual environment – but their comments allude to much more in mixed and augmented reality as well.
Several companies, including Wal-Mart, General Electric and Boeing, are experimenting with training customer service employees, factory workers and even customers through immersive experiences. You can create applications that deliver hands-on training without the physical hardware, have employees practice tasks and interactions, or encourage empathy by putting them in the customer’s mindset. They can even experience the job before you hire them.
“Whether training soldiers on a combat field or sales reps at the customer location, virtual reality provides the ability to enter the world to train and get better, without ever leaving your office,” he said. “This would let every company be able to train every person, more often than ever.”
What you can do with immersive tech in the next 12 months
Right now the market is very platform-specific. Each platform has a unique approach to interaction design, usability and features, which makes onboarding to the medium a little complex. To be effective product strategists, we need to understand our capabilities and limits — so here are the UX tools on the market and my best attempt at capturing the variations between platforms. SMEs, feel free to chime in and tell me what we’re missing here.
Objects can be fixed while we move
In AR platforms like Facebook’s or Apple’s, you can place images atop the camera’s viewfinder live using technology called Precise Location. If you want your user to be able to drop an image onto a conference room table, you’ll use this capability. They’ll be able to drop the image on the table, then walk around the room while the image stays put. More on using Precise Location.
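The core idea behind this kind of anchoring can be sketched in a few lines: the dropped object is stored once in world coordinates, and only the camera pose changes as the user walks around. This is an illustrative sketch, not any platform's actual API; the types and the flat 2D math are my own simplification.

```typescript
// Hypothetical sketch of world anchoring, simplified to a 2D plane.
type Vec2 = { x: number; y: number };
type CameraPose = { position: Vec2; headingRad: number };

// Transform a world-space point into the camera's local frame.
function worldToCamera(point: Vec2, cam: CameraPose): Vec2 {
  const dx = point.x - cam.position.x;
  const dy = point.y - cam.position.y;
  const cos = Math.cos(-cam.headingRad);
  const sin = Math.sin(-cam.headingRad);
  return { x: dx * cos - dy * sin, y: dx * sin + dy * cos };
}

// The anchor is dropped once, in world space -- e.g. on the table.
const anchor: Vec2 = { x: 2, y: 0 };

// As the user walks, only the camera pose changes; the anchor does not.
const standing: CameraPose = { position: { x: 0, y: 0 }, headingRad: 0 };
const walked: CameraPose = { position: { x: 1, y: 0 }, headingRad: 0 };

const seenStanding = worldToCamera(anchor, standing); // x = 2: two meters ahead
const seenAfterWalk = worldToCamera(anchor, walked);  // x = 1: the anchor appears closer
```

The anchor's world position never changes; what changes frame to frame is only its position relative to the camera, which is why the dropped image appears to stay put on the table.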
We can have avatars
Through motion- and gesture-tracking tech, every headset platform has its own demonstration of avatar rendering. I particularly like this 2016 approach from Oculus — the monochrome just does it for me. Hololens, Meta, Oculus, and MagicLeap (I suspect, although I haven’t seen a demo… pretty much no one has. What the hell are they doing over there?) all have avatar options.
We can layer digital information and objects onto physical things
You can do this through AR headsets as well as AR-enabled smartphones. Overlay things like machine performance data, project statuses, customer reviews, internal Slack messages, directions, notes… go nuts.
With 3D Effects, Facebook’s AR platform can build out limited 3D environments using a 2D photo. Zuckerberg showed us an image of a small room, and how he was able to pan about and drop in bouncing balls and fill the room with Skittles. So, now you can show the latest trade show exhibit design to your team or clients with an immersive photo — that’s pretty cool.
We can extend the reach of existing 2D software
Go beyond the monitor, like in this example with Autodesk and Hololens for industrial design. If there’s benefit in visualizing data from current programs, you should be able to do it with AR and VR headsets — especially now that we can use web browsers in Oculus, Meta and Hololens.
Our devices can recognize physical objects, and react
With Object Recognition, your app can recognize objects, like a wrist, and apply actions to it, like adding a watch. This is a common function of pretty much all AR platforms for mobile and headsets. This is a clear example of an experience we’ll enjoy on a phone, but would enjoy more on a headset.
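A minimal sketch of that recognize-then-react flow, with entirely hypothetical names (real SDKs expose this very differently): the recognizer emits a label, and the app dispatches whatever overlay action is registered for it.

```typescript
// Illustrative dispatcher: map recognized object labels to overlay actions.
type OverlayAction = () => string;

const actions = new Map<string, OverlayAction>([
  ["wrist", () => "render watch model anchored to wrist"],
  ["machine", () => "overlay live status panel above machine"],
]);

// Called by the (hypothetical) recognizer whenever it labels an object.
function onObjectRecognized(label: string): string {
  const action = actions.get(label);
  return action ? action() : `no overlay registered for "${label}"`;
}

onObjectRecognized("wrist"); // -> "render watch model anchored to wrist"
```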
We can interact using gestures
There’s certainly no common gesture language yet — everyone seems to be thinking about it differently.
For example, Meta isn’t looking to transfer our common 2D gestures to 3D – they’re investigating the neuroscience that would make holographic interactions natural to our brain.
Microsoft, on the other hand, has taken a more scripted but still intuitive approach to gestures. You really only need to memorize two gestures to get the basics but they’ve built a pretty thorough library to use for more complex functions. Most of these gestures are a combination of eye and hand tracking, like looking at something – then clicking on it or dragging it.
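That gaze-then-commit pattern can be sketched as a tiny state machine, assuming nothing about Microsoft's actual APIs: gaze continuously updates the focused target, and the air-tap gesture commits an action on whatever is currently focused.

```typescript
// Hypothetical sketch of "gaze then commit" interaction.
type Target = { id: string; onSelect: () => void };

class GazeAndCommit {
  private focused: Target | null = null;

  // Called every frame with whatever the head ray currently hits.
  gazeAt(target: Target | null): void {
    this.focused = target;
  }

  // Called when the air-tap gesture is recognized; returns the selected id.
  airTap(): string | null {
    if (!this.focused) return null; // tapping at nothing is a no-op
    this.focused.onSelect();
    return this.focused.id;
  }
}

let opened = false;
const input = new GazeAndCommit();
input.gazeAt({ id: "status-panel", onSelect: () => { opened = true; } });
input.airTap(); // commits the action on the gazed-at panel
```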
The VR approach to gestures is also evolving. LeapMotion seems to be killing it with their hand tracking hardware and software.
If you’re planning your Enterprise UX in these early stages of the technology, it’s probably wise to choose 2-3 headsets that you think you’ll build for, and test the gestures and interactions for your use case. There’s really no substitute for getting your hands on the tech… no pun intended.
Using Unity, you can build voice commands into your app and, as far as I can tell, they’ll work on any headset. IBM is experimenting with this using their supercomputer, Watson, on the Starship Enterprise in Vive. What’s amazing about Watson is that, instead of having to use the exact words “shields up,” the user can say “we need shields now” and Watson interprets the command correctly.
There are not enough people watching this video… it’s really f*cking cool.
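This is not Watson's actual API, but the underlying idea can be sketched as intent matching: instead of requiring an exact phrase, the interpreter scores free-form utterances against keywords for each command and picks the best match.

```typescript
// Toy intent matcher: keyword-overlap scoring, not a real NLU system.
const intents: Record<string, string[]> = {
  "shields-up": ["shield", "shields", "protect", "defense"],
  "fire-phasers": ["fire", "phaser", "phasers", "shoot"],
};

function interpret(utterance: string): string | null {
  const words = utterance.toLowerCase().split(/\s+/);
  let best: string | null = null;
  let bestScore = 0;
  for (const [intent, keywords] of Object.entries(intents)) {
    const score = words.filter((w) => keywords.includes(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = intent;
    }
  }
  return best; // null when nothing matches
}

interpret("we need shields now"); // -> "shields-up"
```

A real system like Watson does far more (language models, context, confidence thresholds), but the user-facing effect is the same: natural phrasing maps to the intended command.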
Controllers may not be the future of Enterprise VR, but today they are a part of the toolkit — especially for VR headsets like the HTC Vive or the Oculus Rift. They’re fun tools for gaming, and if your Enterprise app includes a gun then they might make sense. Otherwise, I think controllers get awkward, especially when you try to start typing things like in this Oculus example. Pointing at and clicking on keys is not a solution. I’ve seen very few strong use cases for controllers in the Enterprise – but I’m happy to be proven wrong.
This video gives a demo of annotations on Hololens. While I haven’t found demos of this on other AR headsets, with Apple’s ARKit you can record sound in space, like so:
WebVR is an open spec that makes it possible to experience VR in your browser. The goal is to make it easier for everyone to get into VR experiences, no matter what device they have. This is still extremely experimental and pretty difficult to deliver on, but clearly has potential. Google has famously published experiments in Web VR that take you to other planets or create tremendous empathy for people in another part of the world.
Mozilla has also taken the lead in WebVR, launching its A-Frame VR content authoring tool in 2015, along with its MozVR resource.
You can also build WebVR content using common frameworks like React VR, Argon.js, PlayCanvas, JanusVR and Primrose.
Developers are just now starting to build WebVR experiences that really take advantage of motion controllers. For the most part, the apps and games you’ll find are built to run on just head tracking and voice commands.
If you’re building experiences, let’s say in advertising or branded content, Web VR might make sense.
How to think about building virtual apps today.
An approach to product design and UX strategy for VR / AR / MR.
“Each time a new technology comes along, new designers make the same horrible mistakes as their predecessors. Technologists are not noted for learning from the errors of the past. They look forward, not behind, so they repeat the same problems over and over again. The most egregious failures always come from the developers of the most recent technologies.”
Donald A. Norman, The Design of Everyday Things
Let’s talk about making immersive apps.
You wouldn’t be in business if you were a pure optimist about all new things. Straight up — here are the risks you run today.
Your chosen headset might not stick. The platform war is real and the winners are still unclear. If you ever built apps for Windows Phone or worked really hard building an audience on Google+, you know what this risk is like. If you want your app to have staying power, you’ll probably have to build for multiple platforms. There can be advantages to building with a company that is investing in multiple headset technologies, like Oculus, Google or Microsoft, because their software is starting to carry across their entire headset portfolios. One thing is for sure: the headsets we’re using today are caveman tools compared to what we’ll have 5 years from now.
To tether or not to tether. You’ll have to choose between the tradeoffs of the untethered free range of motion that the Hololens or the new Oculus Go provides and higher-fidelity tethered headsets like the Meta or the HTC Vive.
The build will cost more than normal apps. Building for VR is still very tricky and many developers are having to learn new skillsets and test/learn on the job.
The Product Design Process
Outside of the basic “insights > design > build > test > iterate” I’m a firm believer that UX is unique per project and leverages a toolkit for achieving each phase. To be a good product strategist in this new realm, you need to know what’s possible and you need to have tried out a headset. If nothing else, start out with something simple like Google Daydream or the Samsung Gear VR. You can also start experimenting on your phone with apps like these.
Gathering Inputs and Identifying Unmet Needs
This shouldn’t be much different from your normal digital projects — if you’ve found a proven method for gathering insights and building digital products, a version of it should work for immersive. You must identify a product strategy based on real jobs that real humans need done, not novelty experiences. I’m a big fan of the process that Strategyn teaches and a combination of Blue Ocean Strategy, JTBD and Design Thinking can be very effective.
Identifying Your Hardware
Popular Augmented and Mixed Reality Headsets for Enterprise:
Hololens — $3,000 developer edition and $5,000 enterprise edition
Here’s a more comprehensive infographic on the immersive tech ecosystem.
Designing the Product Experience
You may find concepting to be a bit different between 2D and 3D. I’ve already found it’s very difficult to wireframe for VR. You really have to get to code and headset testing as fast as possible. You might think that traditional storyboards would do the trick, but for a truly immersive product, the story will never be completely linear. Creating shorter stories around user tasks will probably be more useful than treating the product like a video.
When it comes to experience design concepts, I’m suggesting that my fellow UX’ers start exploring basic prototyping in a game engine like Unity. The community is starting to build out-of-the-box scenes and building blocks that can be used for low-fidelity design and testing. There will be a lot of things that we digital designers will need to learn that architects and interior designers have known for a long time. I’m sure I’ll be writing more on this soon.
On usability, there’s much to be said. Far too much for this post. Suffice it to say, the immersive experience needs to convince the user’s brain that what they’re seeing is real. We do this by considering some new fundamentals, like:
Providing alternative interaction options for users that are prone to motion sickness
Incorporating the interface into the real world instead of fixed to the user or screen
Incorporating appropriate sounds for the user’s context
Considering the user’s movement in relation to their headset’s capabilities
More life-like interactions instead of mouse clicks and keyboards
Incorporating instructions into the world
Creating modals that are high-fidelity enough to read
Generally fewer words and more visuals
These are just some of the new considerations that software product teams will have to experiment with. This video from Mike Alger is a really nice overview of the future of interaction design for VR.
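To make one of these considerations concrete, here is a rough comfort check in the spirit of Alger's ergonomic guidelines for placing UI in VR. The exact thresholds below are illustrative assumptions on my part, not published numbers: keep primary content beyond roughly half a meter, not impossibly far away, and within a comfortable horizontal angle of straight ahead.

```typescript
// Toy comfort check for VR UI placement; thresholds are assumptions.
type Placement = { distanceM: number; horizontalDeg: number };

function isComfortable(p: Placement): boolean {
  const notTooClose = p.distanceM >= 0.5; // closer strains eye convergence
  const notTooFar = p.distanceM <= 20;    // beyond this, depth cues flatten
  const inView = Math.abs(p.horizontalDeg) <= 30; // minimal head turning
  return notTooClose && notTooFar && inView;
}

isComfortable({ distanceM: 2, horizontalDeg: 10 });  // true
isComfortable({ distanceM: 0.3, horizontalDeg: 0 }); // false: too close
```

A real product would treat these as tunable zones per headset and per task rather than hard cutoffs, but even a crude check like this catches placements that will physically tire users.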
Chances are, for your enterprise use case you’ll be building for an audience that already exists, like a workforce or a current software user base. Most headsets come with some sort of distribution mechanism — usually some version of a store, like the Rift App Store or this Hololens showcase. Others are on their way, like the app store for Meta. In most cases, deploying your own app to the headset from your PC is not much different than deploying to your phone — this may be the leanest way to push and test. Oculus covered new channels of distributing your app at the recent Oculus Connect 4 — start watching around the 1hr mark.
Testing in AR / MR
Do your users have an awareness of the world around them? Are they bumping into things? If they knock something over in this space, what are the consequences? Will it be a spill to clean up? Or a pricey repair bill? Your AR design must take all of this into account, and testing is where you find out the answers to these questions.
Using a Design Thinking approach for immersive projects continues to make plenty of sense. Immersive apps are so dependent on the user’s body that it’s important to have real users and SMEs become a part of the process as early as possible. Immersive experiences can ultimately decrease interaction cost, cognitive load and attention switching, but there are some major differences from traditional testing. For example:
Since AR / MR happens in the real world, you must test in the real world. This means less control over the testing environment.
You’ll need to set up a video stream to see and record what the user sees. This view is imperfect because you’re watching from a monitor instead of viewing in a headset.
The “think-aloud” protocol is useful, but careful observation of the user is even more critical.
As your user base grows, it will make sense to start tracking usage analytics. These tools are definitely in their infancy, but here’s an example of a tool that seems to be gaining traction. I haven’t tested any of these myself yet.
It’s time to experiment. As for me, I’m starting some side projects on my own and hope to continue working in immersive tech in a full-time capacity. I don’t know about you, but I’m getting bored of the 2D web after getting immersed in immersive products. There’s so much to learn here it’s staggering!
Harrison is a product strategist & designer on a mission to create products that change markets. By bringing a history of brand and marketing knowledge to product design, he leads teams to architect elegant, usable and purposeful experiences that customers will love and businesses can sell.