

XD Immersive Interview: David Birnbaum, Immersion

Paul: Can you tell us a little bit about yourself? Your current job role, the company, maybe a bio that kind of led you to this point?

David: My name is David Birnbaum, and I direct the UX design team at Immersion Corporation. You probably haven’t heard of Immersion, but you’ve probably felt our technology. We’re a small company, but we work with some of the biggest companies to implement haptics. If you’ve ever felt rumble in a game controller, it was likely our technology. We also developed some of the first force feedback joysticks and steering wheels for gaming.

We then transitioned into mobile and developed haptic feedback systems for mobile phones. When a phone uses its motor to signal information to the user, that capability probably has its roots in our research. We license our technology and IP to companies around the world, including many marquee names, to implement haptic feedback for mobile.

The other exciting thing about Immersion is that we have a pedigree in VR. Way back in the day, during the “VR 1.0” era, we manufactured exoskeletons for arms and hands that would let you feel virtual objects. They were extremely expensive, and we would sell only a few per year to large industrial R&D facilities. So, we have that behind us, and as VR 2.0 debuts, and AR develops, we can draw upon our expertise and institutional knowledge.

AR and VR are my current focus areas, although I work on mobile technologies as well. I still find mobile user interfaces really interesting because there are lots of small “microinteractions” that can use haptics to make a product more user-friendly. As we move into a world where virtual objects are mixed with reality in a seamless way, we’re going to end up in a situation where roughly half the things around you are physical objects you can touch, and the other half are ghosts that slip through your fingers. That’s a problem, right? So, we’re currently trying to solve that problem with haptic interfaces for mixed reality.

Paul: Cool. Well, what about yourself? Did you study user experience? How did you get into it in the first place?

David: I did not study user experience. I have two music degrees. After school I worked for a short time in the record industry. When file sharing sank the long-standing music industry model, I went back and studied musical instrument design. I became fascinated by what it is about a musical instrument that enables expert, even virtuosic, performance: what is it about an instrument’s interface that lets you practice every day and keep getting better at playing it, as opposed to a toy that you figure out how to use and then put down?

For example, the mouse is similar to a toy in the sense that you know how to use it, but you don’t keep getting better at using it. You learned how to use it once, and that’s it. And there are specific reasons the mouse has that property. The first one that comes to mind is that its positioning system is relative, not absolute. The second is that it has very limited tactile feedback and gesture sensitivity. The point is, looking at musical instruments taught me that you can break physical interfaces down into components that tell you how deep or flexible the interface is for expression and control.

That research led me to haptics, and I knew as soon as I felt my first haptic interface that the field could sustain my attention for a long time. I remember very clearly: I was in a summer program, and I learned to build a force feedback device out of an old hard drive and a microcontroller. It was an “a-ha!” moment for me. First of all, this was an untapped design area, something that people weren’t really thinking about. The field of haptics was driven by engineering labs; there were not a lot of tools, and not a lot of effort behind the creative side. There was some, but it was neglected compared to visual and audible media. So, I thought, “I could get involved in this, and I could make a difference.”
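As a rough sketch of the relative-versus-absolute distinction above, the difference can be expressed in a few lines of Kotlin; the names here are invented for illustration and come from no real input API:

```kotlin
// Illustrative sketch: relative (mouse-style) vs. absolute (tablet-style)
// positioning. All names are invented for this example.

data class Point(val x: Float, val y: Float)

class Cursor(var position: Point = Point(0f, 0f)) {

    // Mouse-style: each event is a delta. The final position depends on the
    // whole history of movements, so no fixed mapping between hand position
    // and screen position ever forms.
    fun applyRelative(dx: Float, dy: Float) {
        position = Point(position.x + dx, position.y + dy)
    }

    // Instrument-style: each sensor coordinate always maps to the same screen
    // location, the kind of stable mapping that rewards daily practice.
    fun applyAbsolute(
        sensorX: Float, sensorY: Float,
        sensorW: Float, sensorH: Float,
        screenW: Float, screenH: Float
    ) {
        position = Point(sensorX / sensorW * screenW, sensorY / sensorH * screenH)
    }
}
```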

And, by the way, this will one day be a multi-billion-dollar industry, right? Because all of the technology we’ve had so far, all the problems that we solve in UX, they’re for your eyes and a little bit for your ears. And yet the sense that haptics engages, your embodied sense of touch, is the most important sense for telling you who you are, where you are in the world in relation to other objects, what you’re holding, and what or who you’re interacting with… and it was almost completely overlooked until recently.

So, I decided I wanted to devote my efforts to developing tactile, or haptic, design. I joined an R&D team at Immersion, so I was very much on the technical side at first, developing hardware and software prototypes. I was a UX person at heart, but I didn’t know what that was called at the time, and I don’t think anybody else did either. I would just draw these diagrams of what turned out to be storyboards and interaction design specifications, but I didn’t have that background. So, I not only had to teach myself how to do that, go back and take the courses, get involved in the community, and learn from other people, but I also had to sell the idea of UX internally as something that Immersion needed. That was also a formative experience, because it forced me to articulate and communicate the value of UX to a diverse group of technologists and business people.

That’s why I love UX STRAT: I went through a lot of the same problems that people talk about at that conference. At the first UX STRAT conference, we were all asking the same questions: “How do we get a seat at the table with the executive team? How do we present these ideas? How do we make sure we’re aligned with product?” All these things. It was amazing timing for me, because it made a huge difference in the way that I approached my job.

Paul: So, David, could you tell us about your upcoming workshop?

David: Well, let me just start by giving a little bit more detail into augmented reality and how we’re trying to solve that problem from a haptic perspective, and then I’ll get into kind of what we’re showing at XD IMMERSIVE.

In augmented reality, you have a major problem for haptics. This is different from VR, where haptics is, in a way, a solved problem. Since the 1990s, lots of research papers have been written on the benefits of haptics to VR for productivity, the feeling of immersion, and the illusion of presence. If you add well-designed haptics to a VR experience, you are almost certainly going to make it a better one. We don’t need to convince anyone of that; we just need to continue developing the technology and the tools, which we have been doing as we develop prototypes and products for VR controllers. VR controllers are friendly to haptics: a controller sits on a charging station, it has triggers and buttons, and there’s a lot of room to integrate haptic features. And because you’re already strapping a big, inconvenient headset on, wearing a glove or holding a controller is not that much more to ask of the user. So, haptics for VR is an easier problem to solve than haptics for AR.

With augmented reality, you need to allow the people using the interface to go about their daily lives. They have to be able to hold a steering wheel, hold the handle of a shopping cart, and interact with their phones and their pens. They have to turn doorknobs, and they have to shake other people’s hands. None of that can be interrupted or degraded by the need for a peripheral held in the hand. So, this is the problem we set out to solve, or at least make progress on, over the past few months. We’ve been doing generative research, interviewing product designers and artists, drone pilots, live streaming influencers in China, and other interesting, diverse people who have some touchpoint with AR.

This led us to the creation of what we call the Ring, a thin component that you wear around one finger. It has wires coming off of it today, but one day we hope those will go away. The Ring allows you to touch objects in mid-air. We decided that it should not simply vibrate; it needed other, unique characteristics. We’ve been doing vibration for many years, and we know how to do that; we need to move beyond it. We actually think that a vibration ring could enhance the AR experience on its own, but we wanted to show a visionary demo of next-generation haptics. We thought: AR is coming in five years, so what’s going to be happening in haptics in five years? We need to show that. So, we needed something more organic and more nuanced, with a wider palette of design possibility than vibration alone.

So, we used a kind of exotic material to fabricate the actuator, which allows the Ring to squeeze and flutter your finger in ways that feel distinct from vibration. When we looked back through the generative research, the question we asked ourselves was, “Which use cases would best leverage this organic feel?” Where we landed was a mid-air painting app we call ARt: Augmented Reality Touch, which, of course, spells “art.” It lets you paint in mid-air, and you can feel the liquid of the paint. We also have a palette that you hold in your other hand, made out of a large touchpad. There’s no screen on it, but when you look at it through your AR smart glasses, the palette is augmented: you can see graphics floating around, and you can place virtual graphics on it. That’s where you mix color. You can suck up paint from a paint blob and mix together a new color, painting and feeling the medium as you do it.

All of this stemmed from an ambition to show the future of haptic AR. But having said that, it’s not our intention to say, “Painting is the future of haptic AR.” That’s not it at all. We’re just using that as a context to show off this technology.

Paul: So which parts of that do you plan to teach people? What can they learn in the workshop?

David: Today, people generally understand what the word “haptics” means. With the Taptic Engine and with VR coming to fruition, people now understand that word, so we can skip most of that. However, I’ll still introduce a short section about haptic design. How is haptics useful to designers? What are the tropes, tools, and approaches that we think about all the time? For example, we think about things like texture, or about transmitting levels of urgency, which works really well with haptics. You can imagine knocking on a door in the usual way: the knocking pattern and intensity can feel friendly or urgent. We can do the same with haptics. There are certain design tools in our tool kit that we know we can go back to, and we have a design system.
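As a concrete illustration of encoding urgency in a vibration pattern, here is a minimal sketch using Android’s standard vibration API; the specific timings and amplitudes are illustrative guesses, not values from Immersion’s design system:

```kotlin
// A friendly knock vs. an urgent knock, sketched with Android's
// VibrationEffect API (API 26+). Timings and amplitudes are invented.
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Each timings[i] lasts that many ms at amplitudes[i] (0 = off, 255 = max).
val friendlyKnock: VibrationEffect = VibrationEffect.createWaveform(
    longArrayOf(0, 40, 150, 40),        // two soft, well-spaced taps
    intArrayOf(0, 90, 0, 90),
    -1                                  // -1 = play once, do not repeat
)

val urgentKnock: VibrationEffect = VibrationEffect.createWaveform(
    longArrayOf(0, 60, 50, 60, 50, 60), // three rapid, strong raps
    intArrayOf(0, 255, 0, 255, 0, 255),
    -1
)

fun knock(context: Context, urgent: Boolean) {
    val vibrator = context.getSystemService(Vibrator::class.java)
    // Amplitude control depends on hardware support; devices without it
    // play the pattern at a fixed strength.
    vibrator.vibrate(if (urgent) urgentKnock else friendlyKnock)
}
```

The same motor produces both patterns; only the rhythm and intensity differ, which is exactly the point of the knocking analogy.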

I’ll present how you, as a designer, can think “haptically.” And that does matter to you, because even if you’re not building products with next-generation haptic technologies, you are probably designing apps, and these apps are experienced on phones, which have an interface to the actuator. The design of apps can often be stronger when designers think multimodally about how visual, audio, and touch are working together to present information to people. It’s just an untapped value that people don’t know they can utilize.

So, that’s the first thing, and I’ll show some concrete examples. We have demonstrations on mobile phones of various haptic use cases, and we can go through some of those. We can also go through the design story of the LG V30. This was a phone that was released just a few months ago, and it won several design awards. It’s the first time I’ve seen technology reviewers and journalists saying on the internet, “This thing has a great feel, and you need to demand this level of haptics in your next Android phone.”

Then, beyond mobile devices, I plan to talk more about the challenges around AR and how, as we move into this new world of mixed reality, we can start to think multimodally. We should be able to think about which gesture you’re using, what haptic feedback that gesture should generate, and the problems associated with virtual object scaling and how haptics can reflect them.
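One hypothetical way to picture that kind of mapping is sketched below; the gestures, types, and scaling rule are invented for illustration and are not Immersion’s API:

```kotlin
// A hypothetical gesture-to-haptics mapping. Everything here is invented
// for illustration; it is not a real mixed reality API.

enum class Gesture { PINCH, GRAB, SWIPE, RELEASE }

data class HapticCue(val intensity: Float, val durationMs: Long) // intensity 0..1

// One way to reflect virtual object scaling: let the object's scale modulate
// the cue's intensity, clamped to the actuator's usable range.
fun cueFor(gesture: Gesture, objectScale: Float): HapticCue {
    val scale = objectScale.coerceIn(0.1f, 1.0f)
    return when (gesture) {
        Gesture.PINCH   -> HapticCue(0.3f * scale, 30)
        Gesture.GRAB    -> HapticCue(0.8f * scale, 60)
        Gesture.SWIPE   -> HapticCue(0.2f * scale, 20)
        Gesture.RELEASE -> HapticCue(0.5f * scale, 40)
    }
}
```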

After that, I plan to show mobile demos. I can show the ARt experience, as well as a demonstration of a 3D user interface with haptic navigation. If we want to go even further, I can also talk about haptic media. Over the past few years, I worked extensively on designing custom haptic tracks that synchronize to video content. This technique was deployed to the market and used for movie trailers and ads. What’s interesting is that we also did a neurological study on the impact of haptics on people’s brains while they experienced video. We gained insights about how haptics affects people’s tendency to engage with an ad, to recall images from it, and things like that.
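To make the idea of a haptic track concrete, here is an invented keyframe format for synchronizing touch to video; it is only a sketch of the concept, not the format that was actually deployed:

```kotlin
// An invented keyframe format for a haptic track synchronized to video.
data class HapticKeyframe(val timeMs: Long, val amplitude: Int) // amplitude 0..255

val trailerTrack = listOf(
    HapticKeyframe(0, 0),
    HapticKeyframe(1_200, 180),  // e.g., a door slam in the trailer
    HapticKeyframe(1_350, 0),
    HapticKeyframe(4_800, 255),  // e.g., an explosion
    HapticKeyframe(5_100, 0)
)

// During playback, drive the actuator with the most recent keyframe at or
// before the current video timestamp, keeping touch in sync with the picture.
fun amplitudeAt(track: List<HapticKeyframe>, videoTimeMs: Long): Int =
    track.lastOrNull { it.timeMs <= videoTimeMs }?.amplitude ?: 0
```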

Paul: Okay, cool. As you look into the near-term future, the next three to five years, what’s your crystal ball say about haptics? What do you see as the next gen of haptics that’s coming up?

David: I believe that if and when VR becomes something that is in everyone’s living room, we’re going to see an explosion of innovation making that experience more immersive. Console gamers have had haptics for a long time; rumble has been around for 20 years. Game controllers vibrate, and they almost always have, so gamers expect that experience. I see VR as an evolution of the console experience, and I would expect increasingly advanced haptics as VR improves.

There’s going to be a variety of things you can do with haptic technology in the future. Haptics is an umbrella under which a lot rests. You can define haptics as technology for touch, but it’s more complicated than audio and video. Instead of hearing in stereo with two ears, or seeing with the rods and cones in two eyes, with touch you feel mechanical deformation of the skin through four different channels. Your ability to perceive external objects results from the integration of those four channels of information, which takes place in your brain in a way that we don’t fully understand. Then there’s your sense of hot and cold, which is significant because these are two different channels, and you can use the conflict between those two channels in interesting ways. So, there’s just a lot of possibility. Given that the various games and content pieces produced for VR will need to differentiate themselves from each other, we fully expect haptics to be a part of that differentiation, and that will drive innovation. It’s an exciting time to get involved with haptics, and I’m proud to be at the forefront of this trend.