

In 2015 Samsung experienced a severe drop in earnings. They knew they needed to think beyond mobile and hardware to reinvent the future of communication. They looked to entrepreneurial designers to lead the way. Here is what we did...

January 2015

To comply with my non-disclosure agreement, I have omitted and obfuscated confidential information in this case study. The information in this case study is my own and does not necessarily reflect the views of Samsung.

My Role

I led the design of Samsung’s ‘Team 13’ Mixed Reality (VR + AR) products from the outset of the project in January 2014. This was early VR research. The Oculus DK2 was the only HMD available (this was before the Facebook acquisition); we were researching VR UX prior to the announcement of products like the HTC Vive, HoloLens, Gear VR, networked HMDs, or tracked hand controllers. Many of the things we discovered in 2014 became commonly accepted wisdom over the next two years.

Up until 2013, I led the design team at my startup, Dekko, where we faced similar design challenges, but without HoloLens or Google’s Tango as case studies, and without a market understanding of Mixed Reality’s opportunities and possibilities. But more on that later.

Back at Samsung, our team went about discovering, designing and iterating on Mixed Reality products in the following ways.

Customer Insights & Ideation

I partnered with the engineers, two other UI designers, and an Industrial Designer to uncover insights and translate prototypes into a product that would address customer behaviors and motivations.

Experience Strategy & Vision

I created frameworks and ranking criteria to define the MVP vision, design principles, and market value proposition. This helped to evangelize ideas, gain alignment, and drive decision making.

Planning & Scope Definition

I defined the product with my project manager partners, evangelizing customer goals while balancing business goals. I prioritized and negotiated features from concept through prototyping.

Oversight & Coordination

I designed across, and collaborated with, many Samsung product groups to translate product opportunities into possible hardware and software products.

Design Execution & Validation

I designed for all available VR headsets and input devices, as well as Android Wear, using vendors’ firmware and the Unity and Unreal game engines; we also engineered our own interactions. I executed journeys, wireframes, and prototypes, as well as UI design and technical specs.


I drew up and presented work to gain buy‐in from Samsung C-level executives, senior stakeholders, and many other teams throughout the project lifecycle.


Share experiences using Mixed Reality and VR.

Phones brought us closer, the internet globalized business, and video calls brought friends and family together, but nothing has created an emotional connection like presence in Mixed Reality (AR & VR). We believed Mixed Reality was going to be bigger and grow faster than any other communication medium before it.

Mixed Reality is going to be bigger and grow faster than any other communication medium before.

Our vision for Samsung was that communication (voice / video / photo / social / messaging / sharing) is the killer app for mobile and will be equally crucial to the success of the next platform. While everyone was focused on VR entertainment, we saw communication as a much larger market, and we wanted to explore its product opportunities. We had more questions than answers. It was now my job to empower our team, ignite our passions, and apply our vast experience to design a product Samsung could launch in 12 months.

We knew that whatever product we built had to address the entire connected ecosystem (connected devices, screens, APIs, and the cloud) as well as the customer journey and personas.


Prototype bite-sized MVPs

We had 6 months to convince Samsung, potential customers, and ourselves of the right use-case and another 6 months to implement and launch the MVP. The assumption was simple — VR/AR/Mixed Reality/360 will give us the emotional connection necessary to have meaningful experiences like never before. We saw VR as an extension of natural human communication, but not a replacement for it.

Once we had our assumption, we pulled together an amazing team of people with diverse skills from hardware, engineering, UX, game design, and even comic book illustration. We each had a depth of experience in VR or AR, which contributed to valuable opinions and insights. With an openness to learn from each other, I led a Lean UX Design process that would keep us on track but be flexible enough to adjust course as we learned. The process worked and improved as we worked together:

  1. Brainstorm: Throw away assumptions. Rethink everything.
  2. Prioritize ideas: What is better (or only possible) in Mixed Reality?
  3. Implement: Focus on details and polish.
  4. Document: Record what you’ve learnt so it has impact.
  5. Synthesize: There’s no sense making a prototype if its purpose is not understood, shared, and iterated on.

One of the hardest things I found about prototyping quickly with immature technology was being OK, even happy, with incomplete and rough results. If we had been too precious, feedback wouldn’t have been as honest; we would have lost focus and slowed down. We made the right decision to scope each feature up-front, then test, learn, and iterate.



Blue Sky to One Market Vertical: The challenge, ‘Share experiences using Mixed Reality & VR,’ was very broad. Using our ranking criteria, more market research, and expert reviews, we narrowed our interest to a single market vertical: “Life Sharing.”

From many use-cases to four: Using our ranking criteria, industry knowledge, and our gut feelings, we picked four of the hundreds of use-cases we'd brainstormed. We built, prototyped, tested, gained insights, and iterated on each finalist with the aim of ruling it in or out of our final go-to-market product.

  1. Indoor navigation
    Eventually we will be able to find groceries with our Mixed Reality glasses, find a friend in a crowded room, or be taken on a tour around a museum. We built a rough, working indoor-navigation system. We learnt the inside-out position-tracking software was not going to be commercially viable within 12 months. We scrapped it.
  2. Draw on objects
    Support centers will help consumers set up and fix their appliances by drawing and pointing at the physical device while sharing the same view through Mixed Reality glasses. We built rough paper mockups and tested them with users. Users found the experience too different from existing behavior, and we didn’t believe Samsung would support a B2B business model. So we scrapped this idea too.
  3. Share a virtual object
    We built a set of virtual drums, assigned a sound to each one, and made it a multiplayer experience. Users loved sharing this experience with friends. However, the social dynamics became awkward when participants didn’t know one another or someone didn’t take a role; we often saw one person become the ‘teacher’ and the other the ‘student.’ We continued to iterate on this use-case, gaining more amazing insights. Eventually, we put it on hold until better hardware emerged.
  4. Social VR
    The hypothesis for this prototype was, “Social VR is successful when I believe you are present in my world.” More to add...


Thinking through Use-Cases

The biggest challenge we faced throughout the project was prioritizing ideas. We had so many ideas, so many unknowns, and a tight deadline. We needed to learn against our MVP objective. We needed to come up with an MVP we believed in enough to convince Samsung to make this a real product. Managing expectations, user feedback, and team opinions was hard. We needed a framework.

I observed this pattern early enough in the project and invested time in creating documentation to help alleviate the data crunch and better articulate and distribute design rationale. Doing this upfront was quite time consuming, but it saved a lot of back‐and‐forth as the project progressed.

Ranking Hypotheses

Before we even started brainstorming, we established some ranking criteria that allowed us to quickly rule use cases in or out. If any of the following answers were ‘no,’ the idea was quickly discarded. If ‘yes,’ then we had a chance of hitting the sweet spot between consumer, technology, and business needs.

  1. Can it be built today and launched in 12 months?
  2. Do we have the skills to build it?
  3. Will it be a mass-market B2C product? Can it scale massively?
  4. Would Samsung manufacture, scale and market it?
  5. Does it truly provide ‘a better emotional understanding?’
  6. Does this tool help us communicate in a better way?
  7. Will the customer need to change their existing behaviors to use the product?
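As a sketch, this gate is an all-or-nothing filter: one ‘no’ and the use-case is out. The criterion keys, example data, and function below are hypothetical illustrations, not the team’s actual tooling (note that question 7 is phrased so that ‘no behavior change required’ is the passing answer):

```python
# Hypothetical sketch of the all-or-nothing ranking gate described above.
# Criterion keys and example answers are illustrative, not real tooling.

CRITERIA = [
    "buildable_and_launchable_in_12_months",
    "team_has_skills",
    "mass_market_b2c_that_scales",
    "samsung_would_manufacture_and_market",
    "better_emotional_understanding",
    "helps_us_communicate_better",
    "no_behavior_change_required",  # question 7, inverted: True means customers
                                    # can keep their existing behaviors
]

def passes_gate(answers: dict) -> bool:
    """A use-case survives only if every criterion is answered True;
    a single False (or missing answer) discards it immediately."""
    return all(answers.get(criterion, False) for criterion in CRITERIA)

# Example: indoor navigation failed the 12-month viability criterion.
indoor_navigation = dict.fromkeys(CRITERIA, True)
indoor_navigation["buildable_and_launchable_in_12_months"] = False
```

Run against the finalist use-cases, a gate like this makes each discard decision explicit and auditable rather than a matter of meeting-room memory.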

In/Validating Experiments

My aim in using the Lean UX methodology was to give us the freedom to brainstorm and the rigor to rank our riskiest assumptions, while keeping track of experiments and their results, planning our next prototype, and, crucially, getting the whole team on the same page.

Instead of creating many documents, I made one that listed our hypotheses, assumptions, and the experiments that would validate or invalidate them. Working as a diverse team of software designers, hardware engineers, and game designers, our tests explored all types of hardware, software tools, and user interface patterns. Here is a simplified example of this shared document.

Experiment 1/10

HYPOTHESIS: We believe families and friends have limited ways of sharing remote experiences, because photos and videos don’t convey presence.
SOLUTION: By live-streaming 360-video from a 360-camera attached to a phone to a VR headset, we will help friends and family better share remote experiences.
EXPERIMENT 1/5: Validate or invalidate the hypothesis using hand gestures and voice in VR. Test setup: network two VR HMDs to share the same 360-video, integrated with hand gestures, 360-audio, 360-comms, and user-controlled drawing tools.
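A shared document like this can also be modeled as a tiny record structure following the hypothesis → solution → experiment → result flow. Everything below (class names, fields, the Outcome states, the paraphrased example) is a hypothetical illustration, not the document we actually used:

```python
from dataclasses import dataclass, field
from enum import Enum

class Outcome(Enum):
    PENDING = "pending"
    VALIDATED = "validated"
    INVALIDATED = "invalidated"

@dataclass
class Experiment:
    """One row of the shared hypothesis/experiment document (illustrative)."""
    hypothesis: str
    solution: str
    test_setup: str
    outcome: Outcome = Outcome.PENDING
    insights: list = field(default_factory=list)

    def record(self, validated: bool, insight: str) -> None:
        """Mark the experiment validated or invalidated and log what we learnt."""
        self.outcome = Outcome.VALIDATED if validated else Outcome.INVALIDATED
        self.insights.append(insight)

# Experiment 1, paraphrased from the example above.
exp1 = Experiment(
    hypothesis="Photos and videos don't convey presence.",
    solution="Live-stream 360-video from a phone-mounted camera to a VR headset.",
    test_setup="Two networked HMDs sharing 360-video, hand gestures, 360-audio.",
)
exp1.record(validated=True, insight="Sharing an event in VR creates a moment of value.")
```

Keeping hypothesis, setup, and outcome in one record is what made the single shared document work: every prototype traced back to the assumption it was testing.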



Using remotely connected VR HMDs, intuitive UI, and an input device (like your hands), friends will be able to share immersive life experiences that will connect us in meaningful ways.


Scenario 1 : Many to One

I might be at a birthday party recording Miles blowing out the candles (left). You are at home enjoying this real-time experience. We can talk to each other if we want, and I can see a photo or avatar of you.

We made a prototype 360-camera using a rig of six GoPro cameras and filmed different scenarios. In this example, Miles wants to show his dad his playground. So his mom (me) attached the 360-camera, called his dad, and held up the phone. Miles could see an avatar of his dad on her phone. Dad was at home wearing a Gear VR and could see Miles.


Scenario 1 Insights

From testing this video and many others, we learned a great deal; three of those findings were…


Scenario 2 : One to One

Each person wore a HMD with hand-gesture tracking. The HMDs were networked so that users could see and hear the same 360-video remotely.

Like in real life, when sharing an event together and standing next to each other, you point, talk, and look at the same things.

Since both participants were not in the same location, we needed to graphically simulate pointing, looking, and hearing. The first graphic showed where someone was looking, the second signified where someone was pointing, and the third demonstrated when something was out of view. It was interesting that Google later announced a VR-UI that worked in almost the same way. We were on the right path.
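The ‘looking / pointing / out of view’ logic boils down to comparing headings on the shared 360-video sphere. Here is a minimal hypothetical sketch of that classification; the field-of-view value and function names are assumptions, not our production code:

```python
# Hypothetical sketch: decide how to render the remote user's gaze marker
# on a shared 360-video sphere. FOV value and names are assumptions.

FOV_DEG = 96.0  # assumed horizontal field of view of the HMD

def gaze_indicator(local_yaw_deg: float, remote_yaw_deg: float) -> str:
    """Classify the remote user's gaze relative to the local user's view.
    Yaw angles are compass-style headings, in degrees, around the sphere."""
    # Signed shortest angular difference, wrapped into (-180, 180].
    diff = (remote_yaw_deg - local_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= FOV_DEG / 2:
        return "in-view"      # draw the gaze reticle where they're looking
    # Out of view: draw an edge arrow pointing the shorter way around.
    return "edge-right" if diff > 0 else "edge-left"
```

The same comparison works for the pointing graphic: swap the remote gaze yaw for the heading of the tracked hand.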


Scenario 2 Insights

We learnt “Sharing an event in VR creates a moment of value...”


Hardware Prototypes

360 Camera

The hypothesis for this form factor prototype was, “Social VR is successful when I believe you are present in my world.” To conduct this work, we needed to think about the ecosystem (connected devices, screens, APIs, and the cloud) as much as the app itself. I engaged an industrial designer to render my sketches and specs for the smallest possible mono 360-camera. It was engineered to fit in your pocket and work with existing camera workflows.


Mixed Reality Glasses

In our next prototype we tested our hypothesis, “Sharing an event in VR creates a moment of value.” We often hear people say, “I wish we could have experienced the winning hit in the Giants game,” or “I wish you could really feel what it’s like to jump out of a plane with me,” or “I wish we could visit a beach in Australia.” So we created this. Again, we thought about the device ecosystem. With an Industrial Designer, I designed an HMD that could use existing hardware while being something you might not be embarrassed to wear on the bus. It could flip up to be just headphones or flip down to view VR experiences. Ultimately, it needed to be less bulky and easier to carry around, like sunglasses.



Key Customer Insights

We built many functional prototypes using different types of hardware and conducted user tests to drive our prototyping phase. These are the key insights that defined the MVP and business model for launch.

Emotion is the only ROI

The 10x factor came back to the emotional experience; no other metric offered a unique selling proposition.

Feedback is king and queen

Designing for a new market, a new product, and new behaviors required 100% attention to user-test results and every other form of feedback.

Direct interaction is best

Increasing direct interaction with digital objects led to better usability and increased presence.

Accurate controls

Without incredibly accurate controllers, positioning and tracking failed in every use-case.

  

Increase immersion through human senses  

Increasing the number of senses involved led to strong feelings of ‘immersion’ and made the experience feel more ‘natural.’

Use controllers for longer experiences

Devices such as game controllers work better for longer experiences.

Even simple avatars let you feel connected

Surprisingly, we felt connected to an avatar when just seeing the head move — even if the face and body didn’t change or move.

Add real world elements

As in real life, nothing is ever still, quiet, or flat. Adding shadows, sounds, and texture increases the feeling that you’re really ‘there.’

Social norms in VR

In VR, people forget how others see them, and users were often goofy and unaware of their actions. In AR, the wearer wants to look cool and anonymous, while observers hold expectations of social etiquette.


But it’s only the beginning

After presenting our MVP, user findings, and business model, Samsung agreed to fund us! We got down to negotiating next steps, but that’s for another post. For now, here is what we achieved...




Quotes from our users...

“It feels natural because it’s like life, not directed.”

“You removed the burden on the conversation by showing us the video experience.”

“I would put on a VR Headset to really be there with my family.”

“Seeing your surroundings gives context to your conversation.”

“I was part of what was happening, because they were addressing the cameras directly.”

“Our conversation didn’t need words to describe how we both felt.”


What I learned

It’s still early days

‘Presence’ in VR removes the space between people, connecting them like no other medium.
Users want REAL TIME interactions over passive viewing in VR.
VR needs to be as easy as making a voice call.

One thing I wished I had spent more time on in the beginning was creating our design principles. These would have helped in the decision-making process and galvanized the team to share in the vision.
Here are some principles we might have written...


Design ideas only possible in VR.


Build for existing human behaviors.


Success comes when we experience 10x more meaning.


Design a better (and fun) new reality.


Don’t assume. Build, test, and iterate everything.


Make it life-like. Use shadows, textures, audio and touch.