Elizabeth Baron has driven innovation at Ford Motor Company since the ’90s, advancing XR technologies in the automotive industry. Now Elizabeth joins our host Alan to discuss her new venture, Immersionary Enterprises, as well as her pioneering work at Ford.
Alan: Today’s guest is Elizabeth Baron. Elizabeth has been a true pioneer of virtual and augmented reality as the global lead for immersive realities at Ford Motor Company, bringing together multiple disciplines and developing immersive experiences using VR, AR, and MR to provide information in context to the design studio, multiple engineering teams, UX developers, computer-aided engineering analysts, and many more. Elizabeth has seen dramatic changes, from huge room-sized, multi-million-dollar CAVE systems, to haptic seats, to car cockpits made out of wood. From the promise of virtual reality to it becoming real, Elizabeth has been an industry leader, always pushing the limits of technology.
She has just started a new venture called Immersionary Enterprises, to provide probability spaces where an enterprise can study any potential reality, the art of the possible or the impossible, with a host of relevant data. These realities can be shared across a globally connected work team for more collaborative decision making. Immersionary Enterprises aims to establish holistic immersive reviews as a near-perfect communication and collaboration paradigm throughout industrial design and engineering.
It is with great honor that I welcome VR pioneer Elizabeth Baron to the show. Welcome, Elizabeth.
Elizabeth: Oh thank you for having me, Alan. That’s quite an introduction. I really appreciate it.
Alan: Well, you certainly deserve it. You have been in this industry since the very beginning; you have seen some incredible changes, and maybe you can speak to what you’ve seen in the last 30 years of being involved in virtual and augmented reality, from where you started to where you are today.
Elizabeth: Yeah sure. It’s actually quite a transformation I’ve witnessed. It really blows my mind in some regards. So, way back in the day when I started my career at Ford Motor Company, virtual reality was out there, but it wasn’t really a thing in enterprise, per se. And I really became interested in it and started working with it, I would say, before its time. So around the late 90s, I started working in that space and putting together, like, a life-sized human model that could scale to different proportions. We tracked the human through magnetic motion tracking. And since cars are made out of metal, that poses a little bit of a problem, so we created a wood — like, oak and mahogany — adjustable vehicle that could be a small car or a big truck, and then put you in it, and then changed you to be either like a super tall man or a very small woman, and let you do ergonomic assessments. So at that time, we were limited to 60 thousand polys in our entire scene.
Elizabeth: I know! So we were culling data and massaging things and we’d say, you know, two weeks and we’ll have something for you. And we were really working hard because you have to try to represent a vehicle and a person and an environment in 60 thousand polys. So you can imagine what it looked like; it wasn’t very pretty. But we actually were able to get some value out of it. So we progressed from those days to working more with better tools and optical motion tracking became a thing. So that was a big advancement. So we can now work within prototypes of vehicles and it really opened up another whole set of possibilities for us. And so we worked in that regard, and at that time, I really realized the benefit of doing passive haptics. So, putting some of the physical world together with a lot of the virtual world is a really wise idea. And I still think that holds true today.
So we worked on that, and then really the tech evolved, got better visual quality. And then the next big advancement was really the amount of data that we could put in the model and study. So instead of these 60 thousand polygons that were so limiting in what we could do, we were able to get engineering data — real vehicle data — that represented the CAD geometry that was being produced and had a lot of engineering integrity behind it, and then start to get other aspects of engineering, like some computer-aided engineering, some analytics, in the models, and kind of take it from there. Then from there, went to real-time ray tracing, looking at an environment that had real-time reflections, and it measured materials and all of that goodness. So in the end, by the time I left Ford Motor Company, it was quite a nice suite of tools that had a very deep, foundational use in product development, and for manufacturing assessments, too.
Alan: I wonder if it would even be possible to recreate the timeline of VR through the lens of what you guys were working on, from the 60 thousand polys to real-time ray tracing. And for those of you listening who don’t know, ray tracing is a technique for simulating how light reflects off surfaces from different angles, and that’s really important when you’re looking at a vehicle, especially. You want to see, what does it look like in the daylight? In the evening? When the moon is shining off of it?
And you guys, you showed on one of your presentations, probably the most photorealistic-looking vehicle. I mean, if you looked at it, you would think it was just a photograph of a car. But it’s in VR. And so you’re looking at this car and it looks exactly like a car standing in front of you. And, I mean, that is a far cry from where you guys started off, and it would be interesting to see in a virtual reality timeline that sort of progression.
Elizabeth: Indeed. Right. And so the interesting thing about that is, the realism actually comes from physics, and I love that. I think that’s how it should be. So, the way the light is being propagated in the scene is based on calculations of the way light behaves physically, and the materials in the scene are defined according to how they appear physically. And I think that is a differentiator in how you do virtual reality, and in the visual part of VR for enterprise, because the direction that you’re heading is foundationally correct. It’s not, “Oh, this looks really cool, it looks really real.” It actually has science behind it, and I think that’s a really important distinction.
Alan: So we’ve kind of come a long way since the beginning days, but let’s talk more about the actual ROI use case of what you guys were doing at Ford, and what you’re kind of propagating with your new company, Immersionary Enterprises. I think the biggest thing that you mentioned is the collaborative tools; being able to collaborate. And I know there was a case study recently from Bell Helicopter. It normally takes five to six years to design a helicopter, and using virtual reality, they did it in six months. Were you guys seeing similar or better times to market, and what does that look like?
Elizabeth: Absolutely. Better, faster time to market; more things that could be studied in a shorter period of time. So in other words, getting answers sooner, being directionally correct sooner. And also, the way in which the teams worked changed so that, not only did the answers happen better and you get, you know, faster results for those answers, but you’re able to propagate the information sooner, as well. So, more teams can benefit. So there’s a ripple effect in the way product development is done and the way that the information is obtained and disseminated.
Alan: And it is real time, too, so real-time collaboration really is key here. I think one of the things that you are really pushing towards is real-time collaboration and communication. One of the first experiences I ever had in social VR was in Altspace, and I just remember somebody talking to me, and I turned around, and I was like, “what?” They said, “oh, you’re new here, obviously, because you don’t know what’s going on.” Just that ability to see somebody else and talk to them. But I mean, you guys took it to the next level. I think in one of the talks, you mentioned that your executives at Ford all go into VR now, before a car is even made.
Elizabeth: Absolutely, and I thought long and hard about what VR would be good for. What AR would be good for. What unique advantages it may or may not have. And one of the things that holds true that I think you’re getting at, is immersion is social. It allows us to amplify meaningful communications, and we can create these infinitely-scalable, connected spaces where we can all relate to the thing we’re producing. Or whatever your enterprise is doing. Everybody knows that, and they’re masters in their own discipline in that thing that you’re making. And then, when you come together as a globally-connected team, you can really create this experience where everyone has a voice. Everyone’s function gets to be properly represented in context, and it allows these really complex stories to be told amongst these multidisciplinary teams in a way that’s, I believe, like no other form of communication.
Alan: It’s pretty incredible. My interview earlier today was with the president of HTC VIVE, Alvin Wang Graylin, and one of the things that they announced last week at the VIVE conference is that they’re now doing eye tracking, and hand tracking, and lip tracking, and I think these are new — fairly new — technologies. I mean, I’m sure you guys were doing some sort of hand tracking with either controllers or gloves, but to be able to have native hand tracking — just put on a headset, you can see what other people are doing, you can engage with them — that’s a game changer, and being able to look somebody in the eyes and understand their body language. I think in a design standpoint when you’re having conversations, I think that’s very important. Can you maybe speak to how these types of new technologies are going to really enhance the experience for users?
Elizabeth: Yeah, absolutely. So a couple points stand out to me on that. One is that immersion is a holistic paradigm. So, you can represent; you can be represented; you can see others who are represented, and then you can look at a world that really doesn’t exist. And then, the more we move toward experiences that have zero physical interface, where we have to put on something or do something different in order to get in that world, the better off we are. And even though, with a VIVE, of course, you’re wearing a device on your face, but it’s doing so much.
I mean, that is just really exciting technology because now you can represent some form of communication regarding how you’re feeling about things, and it kind of breaks down some of those barriers that I think are there when someone — especially an executive, a C-Suite person — puts a headset on and then they can’t see anyone anymore, and they’re wondering what other people are reacting to. I think it’s important to have the dynamic of the person in that environment also represented. So the more we add into the environment that is like the physical world, and the easier it is, so we don’t feel like cyborgs when we get into that world, the better this technology will take off and be adopted.
Alan: I agree, and I think there’s some interesting kind of overlap between, you know, virtual reality and mixed reality, or being able to see these types of design communications in augmented reality, and I know Hololens is really leading the way with enterprise augmented reality/mixed reality. Is that a technology that you guys used before, or that you’re using now with clients?
Elizabeth: Oh yeah absolutely. Mixed reality is extremely beneficial. There is a lot of goodness in the physical world as the main part of your experience, and then augmenting that with virtual data that represents the art of the possible. What I love about the immersive paradigm is, you can go full virtual or really full physical because you can be immersed in something physical, and really get benefit out of learning about what’s new.
Alan: Absolutely. So let me ask you more on a personal note; what is one of the best VR experiences you’ve ever had?
Elizabeth: Oh wow, that’s such a great question. Let me think. I don’t know if I could say just one. I will tell you… how about if I tell you one of the most meaningful VR experiences I’ve ever had? How does that sound?
Alan: Sounds great. Sure. And you can list as many as you want. I mean, we’re here to learn.
Elizabeth: Okay. So, all right. So, I think the most meaningful experience I had was around putting together the physical world and the virtual world about the time when I would say, I had my new eyes, and I could see in stereo. That is the most meaningful experience I’ve ever had, because I…
Alan: Alright, alright — we’ve got to just go back a little bit, because I know this story and I think it’s important. So, you’ve spent your entire career working in virtual worlds. And up until fairly recently, you couldn’t see in three dimensions, in stereoscopic view. Is that correct?
Elizabeth: That is true.
Alan: So here you are, leveraging the power of a technology that just screams “three dimensions,” and you couldn’t experience it.
Elizabeth: That’s right.
Alan: So you got a surgery. They fix that. And what was it like to go in the first time after your surgery, and experience full three dimensions in virtual space?
Elizabeth: It was… it was amazing. So after I had my surgeries, they basically… I describe it in automotive terms: I had bad camber, caster, and toe. So, my eyes were misaligned in three directions, and they basically realigned them, and it allowed me to triangulate and form a stereoscopic image, and see in 3D. After a period of adjustment — because you can imagine, my world was all-new, and everything was really great and really horrible at the same time — the first time I got immersed and looked at the data, I stood behind a vehicle; it was like a hatchback. And they opened up the hatch so I could see in, and the feeling of the vanishing point, the seats in front of the seats, and just… I was moving my head, and I watched the data move dimensionally with me. I had never seen that before.
And so, what it taught me — and this is the reason why I’m bringing it up — is I think the immersive paradigm has a cognitive/emotional aspect to it that you can’t get by looking at a flat screen. So the meaning behind it, and what I got out of it was, that the emotional connection you get when you are in your world, or you are in a virtual world, and you are learning and sharing and discovering with other people, that is really profound. And when you share something with somebody, and you’re together and excited as a team about this product that you’re putting together? I never really fully understood the power of the connection of immersion before I had my surgeries. I just thought it was useful because it was virtual data, and you could say Option A and Option B and kind of look at things. That’s really where I was at. And then when I had my surgeries, I was just blown away by all of the information that I could get out of this environment, and how I could talk to somebody else about it and they related to it.
Alan: It’s interesting that you talk about the amount of data, and you’re a very analytical person. I’m sure over the course of your career, you’ve managed to collect millions of data points from each of these things. What is the main driver? Ford’s a very large company, and they can afford to have kind of skunkworks departments and things like this. But your department wasn’t the skunkworks. This isn’t some VR lab in the corner that is used once a week. This is something that is used by designers and C-levels right across the enterprise, all the time now, as a tool. This is like having Adobe, or having computers on your desk; this is not a kitschy toy. This is a real tool. When did it go from being a skunkworks project to being a real, validated tool that drives ROI?
Elizabeth: Yeah, so, that’s funny, because while you’re talking, I’m thinking about all the times I was, like, in a garage with a space heater over my head. It was really interesting because I was always grateful that Ford let me try and let me experiment until the time was right to deploy. I really always appreciated that I had that capability, to be like a startup in a multinational company. But I didn’t necessarily have a lot of resources to get it done for a long time, because the timing just wasn’t right for the tech. But the answer to your question is, the collaborative paradigm was the one single thing that sold the tech. So, I think it was in 2012, there was a need to do a series of collaborative reviews with Australia, and there was a large contingent of people — including an executive team — that wanted to go to Australia, but had a hard time with their schedules. And I asked if they could try doing a global collaborative immersive review instead. So I literally had, I don’t know, maybe 10 or 15 C-level people in my lab, which was really a hoist area. It was a garage at the time. And one guy is holding a dowel rod with mo-cap markers on it, and a headset, and they’re looking at a 46-inch TV, but they’re seeing what somebody from Australia is seeing, and then somebody from Dearborn, Michigan was looking at what, conversely, somebody in Australia was doing.
And that collaboration really sealed it. They could immediately start to discover things about the product that were good, things that needed changing, and they canceled the trip to Australia and just did a series of immersive reviews. They could see the power in it, and at that point, some investment came and the technology grew from the garage-band-type approach to a very well-structured enterprise deployment, where battle prep was handled easily. So, we worked out pipelines and platform issues, and worked to make it global so that we had countries from around the world participating in these immersive reviews. And so, when I left, I looked back at the number of attendees going to reviews in a year, and there were over 10,000 people who, somehow or other, witnessed an immersive review in a year, globally. That’s just phenomenal.
Alan: That’s incredible. Wow. It’s incredible. So you literally kicked off collaborative immersion tools for one of the world’s largest companies.
Elizabeth: Yeah, I know. Wow, cool, when you say it that way! One of the things I probably should point out is… well, a couple of things. One is, nobody does this alone. I worked with a lot of really smart people to make things happen. As they say when you’re raising children, you can only take part of the credit and part of the blame for whatever happens. And another thing that I think is really important, and another part of the reason why it took off (besides the obvious benefits of collaboration), it was the simplicity of how we work with immersive technology. I think it’s really, really important to provide an incredibly simple, very useful method to get immersed in a world. So I came up with these things called, like, “The Tenets of Immersion,” and they were–
Elizabeth: I know, sounds so formal, but it’s basically what we would do, and we would try to never violate these tenets, so that people could come in and understand their data with little or no training. And so, by the end, I think it’s basically no training. Like, literally put this thing on your head and start moving around in the virtual world and it’s going to make sense. We got it down to about 30 seconds by the time I left. But I think that’s important to know; know your audience for enterprise, and know that they’re extremely busy people, and they really don’t want to or shouldn’t have to take the time to learn a whole new way of interacting with data.
Alan: What are the Tenets of Immersion that you’ve come up with? Can you outline those, or some of them?
Elizabeth: Yeah. So, the Tenets of Immersion are really about how quick and easy it is to get immersed. The other thing I called them at one time was “the prime directive.” Basically: it will be quick and easy to get immersed. We can simulate any potential reality. (I’m going from memory here.) The hardware that we use will be simple. It will be unobtrusive, and it will allow natural navigation and interaction with the virtual world. And then, regarding what you see and how you experience it, it will be realistic; especially for enterprise, where it has to do with engineering, realism is key. Sometimes with art, it’s not. If you’re trying to do something more artsy, you really don’t want realism. But anyway — realistic; real-time, as far as keeping up frame rates and making sure the experience is a smooth and steady one; collaborative, so that you can share between the people on the team; and for automotive, I think we also had that it should be full-scale, for when we were looking at vehicles. Looking at a model and not knowing how you relate to that data could be death for understanding how to assess the goodness or badness of the features. So if you’re looking at tolerance between body panels, and you have no idea what your frame of reference is with the vehicle that you’re looking at, you can’t really assess accurately whether that’s a good or a bad margin.
Alan: It’s a really good point. So to recap, the Tenets of Immersion are: how quickly and easily you can get right into it and get immersed; the fact that you can simulate any potential environment or application; the hardware has to be simple and unobtrusive and work in a natural way; it has to be realistic, both in graphics and in user interface; it has to be real-time — the frame rates must be quick and fast; it must be collaborative; and also, it must be able to be in full scale, so you understand what it really does. Is there anything else that we’ve missed?
Elizabeth: No. That’s it.
Alan: Wonderful. That is a great framework. I’m going to put that in the show notes for people because I think that’s really important. One of the interviews I did earlier today was with Alvin from HTC VIVE, and what they’ve just introduced last week is a multi-user experience, up to 40 users at once using only four trackers, and they can do up to 900,000 square foot space.
Elizabeth: That’s awesome.
Alan: Right? And these are tetherless, so these are the VIVE Focus, which are the standalone headsets. Now, knowing that, how is that going to change how enterprises use this? You don’t need a powerful computer anymore; you don’t need a backpack. You just put on this headset, you’ve got hand controllers, six degrees of freedom, and you can walk around in a 900,000 square foot collaborative space. How would that have changed the work you were doing there, and how do you think that’s going to impact the work that people are doing around the world?
Elizabeth: That is amazing, and I think it will have a game-changing effect. For what I was doing at Ford, and what I would recommend for a lot of enterprise, I think it should be used with caution, and be used judiciously. So, what I could see this being really good for is… there are times when a product gets to a certain phase, where you build one. So let’s say you build a prototype of an airplane, and you have this whole plane in its entirety. Maybe you’re looking at inside the cabin, and you’re looking at the issues associated with the passengers in-cabin.
I can see, for aerospace, doing interactions with a group of passengers and doing roleplaying and, like, storytelling with multiple people, and really going through a whole scenario about the usability and the ergo and UX concerns for in-cabin experiences. I can see it for different people that have different functions, being able to all say things, be together in the environment and then point out their concern: kind of sharing the ball so-to-speak, and working through issues, just like you would if you were standing at a physical model, or working together in the physical world. I think the cautionary note is that, just like meetings when everybody’s talking, there will be some rule — Robert’s Rules of Order — for immersion. I think, now, that these possibilities exist for us.
Alan: I think that’s a really good point. I think some of the social VR things, like Altspace and Facebook Spaces, have actually built in certain protocols so that you can basically silence everybody, or silence people that are kind of outside of your purview. The other thing they’ve done — and this is something that nobody really would have thought of — is creating a personal space bubble. So when you’re in VR, you can walk right up to somebody and actually walk through them, because they’re not real. I think the personal space issue is real, and it’s only when you’re in a virtual space and somebody walks up really close to you that you get this freaky, “hey! I’m actually here! What are you doing?”
Elizabeth: Right. Exactly. Exactly! And we’ll need to… I think there will be many awkward immersions while we figure out how these things should be deployed, and how we work together and share. And it’s a net gain. We just need to do it properly so that VR doesn’t get a bad name again, for another whole set of reasons.
Alan: Absolutely. And I think that leads me to a question, and this is interesting for the listeners, I think, to understand where we are in kind of… let’s say, zero being 20 years ago where it didn’t really work — even five years ago, when we didn’t really have VR — to where it’s like, completely ubiquitous and everybody uses it, it’s used in every company for everything. Where do you think we are on that zero-to-10 timeline, let’s say?
Elizabeth: For VR use in deployment and enterprise?
Alan: VR and AR – all of the XR technologies.
Elizabeth: XR tech in enterprise? I think we’re probably at a 6 for VR, and 2 for AR. That’s what I would say. We have a lot of capabilities yet to conquer; a lot of data that still can be embedded. So I still think we’re more at the beginning of this journey than at solid implementation. Years ago — not very many, maybe two years ago — I brought a gentleman into Ford, and he was a supplier. He said, “you know, Elizabeth, usually when I come to a big enterprise like this, I meet the VR person and then they take me into the basement, to a room with a bunch of cables and headsets, and that’s where they hold these evaluations.” And that really struck me, because I think he was nicely complimenting the VIVE technology that we had at Ford. But the thing about it is, I still think, in some regards, we’re there. And we need to get to the point where the immersive paradigms — all XR — are part of the platform by which companies do product development, and there is a pipeline so that you can get data easily, see it when it’s relevant, and look at it in context.
Alan: It’s interesting. I think Microsoft is really especially pioneering this, with their mixed reality platforms. They’re really pushing the limits of this technology, and they realize the same thing that you just said: if the systems, the new XR systems — you know, Hololens and these — if they don’t work with traditional systems, if you can’t import your CAD models or your BIM plans, or if you can’t instantly use the tools and toolsets that you’re already working with, this technology is not really going to take off. So I think you nailed it on that one. One of the questions I have for everybody is, what is the most impressive business use case of this technology that you have seen?
Elizabeth: Oh. I would say the most impressive use case is looking at a vehicle in the context of all, or at least a large number, of the possibilities of how it could be built, and then being able to bring that data in — representing any potential build condition — prior to the meeting. So in other words, literally walking in and saying, “I just emailed you a spreadsheet, a big CSV file that has all of the tolerance conditions that happen on this car,” applying it to the car, and then looking at it. To me, that was the first time–
Alan: That’s awesome.
Elizabeth: –we did that. I was just, like, blown away, because first of all, there was no pre-work. That gets back to my earlier comment about platform and pipeline. It’s built on how the company works. And second of all, you can study all of those things, decide which things are relevant, and then look at the vehicle in that regard. We kind of do detective work going back, so that you can apply these conditions to that runtime model. That’s just amazing to me.
Alan: It’s fantastic. I mean, every time I do one of these interviews there are completely new ways of using this technology. It blows my mind every single time. I tried VR five years ago — I put it on my head and it was a concert, and I was standing on stage. That was kind of my “aha” moment, when I really realized that this is the future of human communications. And for the last five years, I’ve been studying this industry, looking at all the business use cases. Medical: there’s a huge push in medical to use VR for training, for pre-visualizations of surgeries, for prepping patients to know what’s going to happen, and for showing doctors visualizations of surgeries and of how pharmaceuticals work. And that’s just medical.
Then you’ve got engineering, design, you’ve got HR training, now. It’s being used for empathy training, it’s being used for retail… there’s literally no end to this. And I mean, if you look at what you guys were doing in the design side of Ford, you also were doing it on the sales and promotion side, at car shows and showing potential customers what the new cars that aren’t even built yet are going to look like.
Elizabeth: Exactly. I know! And then, you think: the same data that you’re using for development, you can use for marketing, as time marches on and the data matures and gets ready for prime time, so to speak. That’s just a whole progression, and a certainty in the way that you’re working, that is really beneficial and highly productive.
Alan: I love the idea of being able to use the data for engineering and design, but also then sending that over to your developers. This is something that I’ve been trying to articulate with the workshops that we’re doing as well: it costs a lot of money to build a 3D object. Let’s say you want to build a new car: you’ve got to build the seats, and it’s quite costly. But once you have that 3D model, you can use it for design. You can use it for sales and marketing. You can export it as a different file format and use it in AR — say, for Snapchat or Facebook — or on the web as a 3D visualizer. There are a million ways to use these 3D assets. One of the things that I see as one of the biggest potential possibilities of this technology is, once the world moves to spatial computing, every single object — whether it be a car or a bottle of water — will need to be converted to 3D. And so, there’s a huge push now for 3D artists and graphic renderers to create this digital twinning of the world. And I think it’s a really awesome space to be in, and I’m really super excited for it.
So, my question is, what do you see for the future of XR technologies as it relates and pertains to business?
Elizabeth: For that answer, and related to what you just said, I see a really great integration with AI. A deep-learning approach where, as we’re seeing things, we’re actually getting data imparted to us, and imparting data to others. So, just think about the power: how much are we throwing away right now in enterprise by looking at problems and then not realizing, through the pipelines, what those problems are? If somebody — a human — inherently sees that there’s a problem, they can flag it as an issue somehow. That’s one way. But just think if you look at data, and either you flag it as a problem, or the system realizes that maybe the tolerances are off, or the material is wrong, or whatever the issue is, and then provides you the relevant information about it so that you can start to solve the problem. Just think how cool that would be.
Alan: It’s incredible. You know, I study futurism a little bit, and I dabble in what’s to come — what’s going to happen when AI and robotics replace a lot of the jobs. While they’ll replace jobs, I really think they’re going to create so many more opportunities than we can even imagine. And you nailed it when you said bringing AI into the mix is really part of it. Machine learning, computer vision — these are deep, deep neural networks. These are all part and parcel of it; you can’t really have virtual and augmented reality — true performance — without 5G, without IoT sensors. Being able to walk through a factory floor, and even though the robots are all kind of working away, being able to put on a headset and easily look at the machines and see a green, red, or yellow light above them — and as you walk closer, that red light opens up and tells you the full information about that system. That’s already happening now. But we’re only scratching the surface of what’s possible.
Elizabeth: Exactly. And that’s why, on that continuum of where are we with XR technologies, I really think we’re just starting, because there’s this whole component. I mean, just think about training a model as you’re immersed.
Alan: That’s going to be crazy. It’s crazy. I mean, here’s the other thing: I just read a paper on collecting personal data — but not from your surfing history; it’s literally your eye tracking, because all the new headsets are going to have eye tracking. They also have positional tracking, so they know your height. They know your gait. They know how you walk and how you move, and they’re collecting body language data in a way that we’ve never, in all of human history, been able to collect. You can do body pose estimations and such, but nothing as accurate as hand tracking and facial tracking. HTC VIVE announced last week that they’re even doing lip tracking. So you’re able to capture the true essence of somebody’s intent without them even knowing it. It opens up some crazy privacy issues, but for enterprise, this unlocks a new level of data set. And of course, it’s unlocking crazy amounts of data that we’re going to need AI to analyze, but it unlocks so much potential in the enterprise.
Elizabeth: It’s incredible. It really is an exciting time to be in this space. And I agree with you: although some jobs might become obsolete, others will come. I think it will provide even more opportunities, and more exciting ways to make a living, I suppose, in the future.
Alan: Absolutely. I think… I actually need to go back to my notes here for a second, because of something I read this morning… it’s going to give us better ways to work. Most people sit at a cubicle. But with a headset on, instead of looking at a 20-inch screen, I can now have twenty 20-inch screens around me, or one 100-inch screen. And instead of sitting in a cubicle, I’m now sitting on a beach. Just being able to give people a better environment in which to work, I think, is really going to decrease stress.
As things move faster and faster, these tools are going to give us the ability to do work faster, which is more efficient. I think everybody is kind of running on a treadmill, trying to catch up. But these tools will really give us the opportunity to catch up and get ahead, because — as you know — we’re entering the exponential age of humanity, and we don’t really know what’s going to come out of it.
Elizabeth: Exactly. It’s so true.
Alan: 10 years ago, app developer wasn’t a job. Now it’s everybody’s job. And in 10 years from now, is coder going to be a job? Or is code going to code itself? We don’t know.
Elizabeth: Right. Right! Exactly! It’s fun times. But you’re right about better ways to work and better environments. If you look at the progression of how work has changed, for a lot of people, their office is at home. And I think the immersive paradigm will allow people to have their office be their home, but also be connected virtually with somebody else whose office may also be their home — and maybe now they’re in a collaborative session, or looking at the product they’re analyzing in some regard. I mean, it’s so much better.
Alan: I can’t wait. So, I’m going to ask you one more question, and then we’ll wrap up. For the listeners that are just learning about VR and AR and MR for the first time, what is your practical advice for somebody who’s looking at this for the very first time, to get started in this technology? What is your recommendation for them to just start working in this technology?
Elizabeth: So I would say, tackle a problem that you know needs a solution. Find something in your organization that is a persistent issue — something that is really tough to solve with the standard practices currently used to tackle it. Then apply some form of XR to that problem, showcase the benefits you’re getting out of it, and show how the XR platform can be used to deeply understand a problem and communicate effectively between the different disciplines.
And then tackle things on a case-by-case basis for a period of time, and build up a library of related cases, so that people in your enterprise can start to see that these aren’t just disconnected things. They actually relate to how we’re making our product, and we have these valuable ways of gaining insight that we did not have in the past.
Alan: Well, that is some sage advice from VR pioneer Elizabeth Baron. Thank you so much. I’m just so honored to have you on the show, and I hope our listeners learned a lot on this. We learned about the Tenets of Immersion; the rules of order. It’s been a fantastic conversation. Thank you very much.
Elizabeth: You’re welcome Alan. Thank you very much for having me. It was a pleasure to have this conversation.
Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.