Brought to you by MetaVRse

Making AR Focals Functional and Fashionable, with North’s Stefan Alexander

Making any sort of head-mounted AR display has been a challenge, both on the technology front and from an adoption standpoint. But Stefan Alexander from North challenged himself even further – by making them look chic, to boot.

Alan: Hey, everyone. Alan Smithson here, with the XR for Business podcast. Today, we’re speaking with Stefan Alexander, vice president of Advanced R&D for North, the company that created Focals, the world’s first consumer AR glasses. And of course, they’re also a Canadian company. And we’re really excited to talk about their new product, North Focals 2. All that and more, coming up next on the XR for Business podcast. Stefan, welcome to the show.

Stefan: Thanks, it’s great to be here.

Alan: It’s my absolute pleasure. As you know, I have had a pair of North glasses for almost– actually over a year now. I was one of the first 100 people to be lucky enough to get these. I went in for my fitting in Toronto, got these wonderful glasses, I got my little ring. And I proceeded to try all sorts of different things. And super excited to have you on the call and really learn more about what’s coming up next for North.

Stefan: Yeah, great.

Alan: Maybe you can just describe the North glasses to the listeners, and how they came about.

Stefan: So, I can give you a kind of brief history of how this whole thing started. Originally, when North was founded, it was actually called Thalmic Labs, and the product was a gesture control armband. It went on your upper forearm; you could make motions with your hand, it would detect your muscle movements, and you could control computers, music, do presentation control. But one of the things they were really passionate about was controlling heads-up displays like Google Glass, which had just come out at the time. I was actually the first person hired to not work on the Myo — that was the armband. About a year before it came out, they hired me and they said, “Stefan, we think that the control of smart glasses and head-mounted displays is really important. But we’re not sure if anybody’s going to make exactly what we have in mind, what we think is gonna be so big, which is glasses that look exactly like regular glasses. We don’t know the tech to do this, and I don’t think it exists yet. But can we work on a way to do these types of smart glasses?” So I had a display background — I was working on OLED displays — and I started this research program that turned into the first generation of Focals. Eventually it got so good that it really just took over the company: we stopped doing the gesture control, went all in on smart glasses, and changed our name to North. And that’s how we ended up where we are.

Alan: That’s fantastic. I wonder– you started off life as a gesture armband. And it’s funny, because I remember this. I was part of the Ryerson Digital Media Zone at the time. And I went to Communitech, which is where North was founded, or I guess Thalmic Labs at the time.

Stefan: Yeah, yeah.

Alan: And I remember going into this tiny little lab — I think there were probably 10 people at the time — and they said, “Hey, try this thing on your forearm.” And it was this kind of stretchy, almost like a bracelet, with a bunch of black sensors on it. After that, I went on to create The Emulator, which was the see-through touchscreen DJ controller. And we ended up working with Armin van Buuren, who was also working with the Myo wristband. The way they were using it — which was really cool — is he had one on each arm, and he was able to control the visuals onstage by simply reaching up and extending his hand; that motion triggered the fireworks in the visuals.

Stefan: Right.

Alan: It was kind of that moment where I went, wow, that’s amazing. And then I got into VR and AR. And then, of course, this Canadian company comes along with a pair of glasses that look literally like just a normal pair of glasses — the arms, everything just looks like a normal pair of glasses. And when I first got my pair, my daughter — she’s 15 now, she was 14 then — put them on. She said, “These glasses are amazing. They’re really lightweight. They look like normal glasses. I’m going to give them a 9 out of 10 for the fashionable part of it.” But functionally, they left a lot to be desired. There was a very small viewing angle. Still, I think for a first pass at a product it was pretty damned impressive, being able to see something at, let’s say, a one or two meter distance and have a full heads-up display in front of you. So for those of you who don’t know, these glasses are monocular — it’s only one eye — with a little projector inside the arm, projecting on a little — I guess — projection film? Would it be– is it a film, or is it a–?

Stefan: It’s a special film called a holographic film. And the light kind of bounces out of the projector. And it doesn’t just reflect towards your eye. The hologram actually kind of focuses it towards your eye, too.

Alan: Yeah, it’s super cool. I mean, the amount of tech that’s gone into this is just mind-boggling. So, walk me through some of the use cases, because I know when I got them, you couldn’t do very much, actually. You could see your messages, you could see your calendar, and that was about it, at the very least–

Stefan: There were some maps and navigation, and you could call an Uber.

Alan: Yeah. Oh, yeah. Oh, the Uber thing was amazing. Oh, my goodness. Imagine you’re calling an Uber and you see the heads-up display — rather than checking your phone every five seconds to see if your Uber’s there, it would just alert you with a little pop-up: your Uber’s here. That was really cool — I actually used that feature. And then the one thing that I tried to use on stage last year — but it just wasn’t quite ready — was the ability to see your presentation notes. I actually did a presentation last year with the Focals on, trying to make this work. It was a talk I did at the Miami VR Expo. And I gave the whole talk with my show notes in the glasses, but I just couldn’t synchronize it. But that’s now a feature, right?

Stefan: Yeah, it’s totally working. I actually did it just on stage about six weeks ago at Photonics West. So I gave a talk there at a kind of optical conference, and I was able to control the presentation and look at my notes using Focals, which was pretty– it was pretty cool.

Alan: That is very cool. And the great thing about it is, nobody knows you’re wearing AR glasses, because they just look like normal glasses. I even got the clip-on sunglass adapter and everything. The one thing that I was just amazed by was the little ring — the ring that controls your interface. How did that come about?

Stefan: So we had a pretty high bar, starting off as an interaction company. If we were going to do smart glasses, the control had to be amazing. And the whole philosophy going into this — and this evolved a lot, but this core tenet never changed — was that it has to look exactly like regular glasses. And the bar for what that meant kept getting higher and higher in our minds. It’s not just enough to look like regular glasses. There are so many aspects that are so important with this, which we ended up learning and getting mostly right for generation one, and which I feel we got absolutely, completely right for generation 2. But one of those was: if you’re going to interact with it, it can’t look strange, it can’t be unnatural. There’s no point in having something on your face that looks just like regular glasses, where you’re around a bunch of other people and nobody knows you’re wearing smart glasses, and then you have to reach out and touch your face and touch some buttons, or you have to wave your hands in front of your face, or you have to talk to your glasses and give them voice commands. All those things are fine if you’re on your own. But then if you’re on your own, it doesn’t really need to look like regular glasses anyway, does it? Like, you don’t care what it looks like. It’s when you’re around other people and kind of wearing it all day. So the interaction had to be completely natural. And the ring — the thing I love about it is it just seems obvious. You see it and you’re like, “What, I have to wear a ring, really?” And then you put it on and you control it, and you think, “Oh, this is actually really cool. This is a really great experience.” And really great interactions should be like that. We have a human-computer interaction group, and we tested so many different approaches to control. We tested wristbands. We tested scrolling on smartwatches. Obviously, we tested moving around with head movements. We tested things with eye tracking, for a variety of reasons. All of those didn’t work all that well. There’s actually something that tested surprisingly well — we didn’t ship it — where you moved your head around as the cursor and then you clicked your teeth to activate something.

Alan: No way!

Stefan: Out of all of the things that I listed, it actually ranked pretty high. It was a surprisingly good experience — because it actually worked, and it was pretty subtle. I mean, at the end of the day, the ring was better than the teeth-clicking interface. We did so much cool stuff. But the ring really stood out over everything else, because it was just so intuitive and natural.
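Concretely, the interaction model is just a joystick and a click, as Alan describes next. Here is a minimal sketch of that navigation loop; the event names and the carousel UI are hypothetical illustrations, not North’s actual software.

```python
# Hypothetical sketch of joystick-ring navigation: nudge to scroll, click to select.
from enum import Enum, auto

class RingEvent(Enum):
    LEFT = auto()
    RIGHT = auto()
    CLICK = auto()

class Carousel:
    """A flat list of apps; no instruction needed beyond 'click it and move it'."""
    def __init__(self, items):
        self.items = items
        self.index = 0

    def handle(self, event: RingEvent) -> str:
        if event is RingEvent.LEFT:
            self.index = (self.index - 1) % len(self.items)
        elif event is RingEvent.RIGHT:
            self.index = (self.index + 1) % len(self.items)
        elif event is RingEvent.CLICK:
            return f"open {self.items[self.index]}"
        return f"highlight {self.items[self.index]}"

ui = Carousel(["messages", "calendar", "maps", "uber"])
print(ui.handle(RingEvent.RIGHT))  # highlight calendar
print(ui.handle(RingEvent.CLICK))  # open calendar
```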

Alan: It is very intuitive. I have done literally thousands of XR demos, from HoloLens to VR, you name it. The one thing I did find with the Focals was, first, nobody could see the visual spot — they had to kind of aim for it. The fact that the field of view is small, that was one challenge. But once they got it, then with the ring, you’d say, “OK, press the ring,” and nothing else. You don’t have to say go left, go right. It’s a joystick ring. Once they realize they can click it and move it, you don’t have to say anything else. The navigation was super simple and real-time. It felt right. I’m just going down the North site — for those of you who want to learn more, it’s bynorth.com. And one thing I totally forgot about is you guys have microphones built in as well, so you can actually use Amazon Alexa to help you with stuff.

Stefan: I’d say the integration is pretty cool. And–

Alan: It really is.

Stefan: There’s a whole trend in smart speakers now — which I think is great — to actually put a screen on them. You would think, “Why do you need a screen if you have a smart speaker?” But there are actually a lot of times when the result you want back is best displayed as something visual instead of something audio. So we have that integration in the glasses. If you ask Alexa something — you ask for the weather, for example — you’ll get a visual notification of the weather on the screen, which is awesome. Sometimes, like for a five-day forecast, that’s better than having Alexa read it out to you.

Alan: Absolutely. And one thing that stood out, again, for a V1 glass with a tiny micro-projector, is the color. Look at things like the Intel Vaunt, for example — which I believe North ended up buying the patent portfolio of.

Stefan: Yeah.

Alan: But it was monochromatic. So it was monocular — meaning in one eye — but also monochromatic; it was just red. But with the North glasses, you guys created a full color spectrum display. How did you make a projector tiny enough to give a full color display bright enough to be used outside? It’s just mind-boggling, the sheer science that has gone into these glasses.

Stefan: So this is where I have to give a lot of credit to the CEO and the founders of the company, and kind of everybody else around us then. Because we really, really wanted glasses that looked like a regular pair of glasses — that was the most important thing — and there was just no display technology that did it. This whole laser projector and hologram thing, we had to build that ourselves. It didn’t exist as something we could just buy and stick in the glasses, like with normal micro-displays. So when we were going through it and trying to figure out how it would work, there were a lot of things that just weren’t working. The first thing I said was, “It’s going to have to be lower resolution.” It’s text. It’s not super HD resolution — it’s not going to be 1080p, it’s not going to be 720p. It’s gonna be lower resolution. But I think we can make the display look amazing. I think it’s much better to have it really bright and high quality, even though we can’t have a lot of pixels. So that eventually was accepted. The other thing I said was, “Do you know how much easier it is to just do red?” Red lasers are awesome. They’re the best types of lasers, they’re the cheapest. We could save so much space and cost and complexity in the whole system. I didn’t even know how to make the hologram work with multiple colors — that’s really hard. And from working in displays: a monochrome display is hard. A color display with three different colors — red, green, and blue — you would think is three times harder, but it’s not. It’s usually 10 times harder. Because of the patterning, because you have to overlap these colors, everything gets so much harder.

Alan: Well, you’re also starting to see things like the HoloLens 2, with their color banding issue. You’ve got Microsoft — who’s invested billions of dollars in HoloLens 2 — and they’re even struggling with color banding issues.

Stefan: Color is just– it’s so hard. Just the perceptual science behind it, and the calibration — the display calibration is going to be ten times more complicated, and it’s already hard. But that was something they just didn’t give up on. Fine, let’s prototype some monochrome ones — but it needs to be color. It’s a consumer product. People need to love these. They need to feel like the display is responsive and alive. With something monochrome, technically you can list all the use cases, and you can’t show me a use case that actually *needs* color — but the feeling people get when they use it, color is so important to that. So I lost that argument, but at the same time we kept working on it, and we came up with a way to actually do color in there too. And I think it was after I lost the argument, actually. So I’m like, “Okay, I still don’t know how to do color, but I’m going to keep trying.”

Alan: “We’ll figure it out.”

Stefan: Yeah. And we’ll figure it out. And we *did*. And I love the way that it looks, the color.

Alan: It really does look amazing.

Stefan: And it did make the temple arm slightly bigger. However, for the second generation, we figured out a way to shrink it back down, and all of the space we allocated to color is actually now even smaller. So I’m so glad we put the color in there.

Alan: That’s incredible. I want to talk more about the user feedback. I’ve had my pair since the very beginning and, I have to be honest, I don’t wear them all the time, mainly because of how I got fitted for them. And I want to talk about the fitting system, which has also changed dramatically for you guys. When you introduced these, they needed to be fitted to your face, and you created quite an elaborate 3D laser scanning system: you’d go into a room, this laser scanner would go around your head, create a 3D model of your head, and then fit the glasses to that. Since then, the iPhone 11 came out with IR tracking on the front-facing camera, which allowed you to scan people’s faces with just an iPhone.

Stefan: Yep.

Alan: So is that something you’re moving forward with, so that you can now expand these fitting locations, because you were tied to this elaborate fitting station before?

Stefan: The way we started off with this is: they need to look like a regular pair of glasses, and with the adjustability of a one-size-fits-all product, you make a lot of sacrifices in form factor — it just drives a lot of space in there. And also the eye box. This is the property of the glasses that says how much your eye can move and still see the display. And the trick with an eye box is that — especially since this isn’t a very wide display — the amount your pupil actually moves as you’re scanning around the display is actually not that high. The amount you have to allocate to, let’s say, misalignment or slipping of the glasses — that’s where the majority of the eye box comes from. Which is why you can see it reasonably well when it’s fitted to you, but if you put it on somebody else, they can’t see it at all. So we said, well, the eye box drives a lot of space. What if we had a small eye box, and the glasses weren’t adjustable? Instead of packing this eye box and adjustability technology into the glasses, let’s put it in a measurement machine. Let’s put it in our process flow. Let’s build it into the rest of the business. If you want to make the glasses as small as possible, we just need to become a glasses *and* a measurement company. And that way, people also get a beautifully fit pair of glasses, perfectly adjusted for their head. So that was the original philosophy on that tradeoff, and how we ended up with the sizing in Gen 1. And you know, it worked — it works well. But there were two big areas where we still didn’t like it. One is you have to go into a store. It was awesome if you could go into a store, because it was a really cool experience.
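To make that eye-box tradeoff concrete, here is a minimal sketch. The additive model and every number in it are hypothetical — North hasn’t published its optical budget — but it shows why moving fit tolerance out of the optics and into a measurement process shrinks the hardware.

```python
# Hypothetical eye-box budget: pupil travel is small; fit tolerance dominates.

def eye_box_width_mm(pupil_travel_mm: float, fit_tolerance_mm: float) -> float:
    """The eye box must cover the pupil's movement while scanning the
    display, plus a margin on each side for misalignment and slippage."""
    return pupil_travel_mm + 2 * fit_tolerance_mm

pupil_travel = 1.5  # mm; scanning a narrow display moves the pupil very little

# One-size-fits-all frames need a big margin for how they sit on any head...
print(eye_box_width_mm(pupil_travel, fit_tolerance_mm=4.0))  # 9.5 mm
# ...while a custom-fitted frame can get away with a tiny one.
print(eye_box_width_mm(pupil_travel, fit_tolerance_mm=0.5))  # 2.5 mm
```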

Alan: And the stores were nice. Come on, they were really beautiful.

Stefan: Yeah, I loved the stores, the whole retail experience. It was so good. But if you weren’t near one, and the mobile sizing truck we brought around wasn’t coming to you, then you were stuck — you couldn’t buy the glasses. So that wasn’t great. The sizing app certainly helped with that; it totally resolved it. And if you didn’t have an iPhone X yourself, you could probably borrow one from somebody else or something like that. So that increased the availability. That wasn’t that bad. But the real thing that actually held us back is: what if you wanted to give them to somebody else to try on?

Alan: [laughs] This is my exact problem.

Stefan: Yeah, you’re so excited about them. We hear this from so many people.

Alan: Oh, my God. I can’t even– I’m going to explain it to you.

Stefan: Yeah!

Alan: I’ve got to explain it as a user and somebody in the XR space. I get new technology. We’ve got a Magic Leap, a HoloLens — we have all the tech. And we bring it to events and we show people, “Hey, this is the future.” And I loved my North glasses, until I brought them to an event and let everybody try them. And then they didn’t work anymore. They didn’t fit me; I had to go get refitted again. So the ability to show other people was hindered in that way. And that really was a stumbling point for me as a user, because you want to share this technology. You know, we’re all a bunch of nerds.

Stefan: Yeah. And it’s hard when you can’t do that, because you have this cool experience and then nobody else can — they have to go into a store and get a special custom demo, just for you to show them something cool. So that’s what we wanted for Gen 2. And also, if people do wear them all day, we said, “OK, they have to be really light.” The glasses were 67 grams — I think that was almost the lightest out there — and still, after maybe six hours or so, some people were fine and some people felt they were a little bit heavy. So we had this goal going into generation 2 where we said, OK, how about this as a goal: if you can fit the glasses on your face physically — because they still have to be a properly fitting pair of glasses; not every pair of frames should go on every head, because heads are different sizes and glasses have to look good — but let’s just say, if these glasses can fit on your head, and they look okay and they’re not squeezing or pinching you or falling off, then you can see the display. So if you should be wearing the frames, if they look half decent on you, then you have to be able to see the display. That was one of the things. The other one is the temple arms have to get smaller, and it has to get lighter — it has to get to 50 grams. And the only way to do that is to shave off space, shave off actual components, cut the battery, cut the power consumption. So we had to make it smaller and we had to cut power. And then we also said, wouldn’t it be nice if we could show a little more text on there? Sometimes you don’t want a lot of text — with a message or something like that, you don’t want to be distracted. But say you’re looking at your presentation notes: you might want six or seven lines of text. It’s nice to be able to see all of your speaker’s notes, so you don’t have to scroll while you’re trying to talk at the same time. So what if we could fit a bit more on the display, too? And as an engineer, you do this work and there’s no new fundamental physics in it. It’s technology — it’s combining things, understanding tradeoffs, understanding what you can do. But it doesn’t matter what you can do; it only matters what you can mass produce. If you come up with some incredible new RGB laser and you can embed it in the glasses itself, but nobody can build it, it doesn’t matter. So you’re pretty constrained. Everything is managing tradeoffs. And going into Gen 2, we said: OK, the whole reason we did the sizing thing was to keep a small eye box, to make the glasses small. So now you want to make the eye box so much bigger — like 20 times bigger — and you want 10 times the resolution, *and* you want it smaller at the same time. And it should cost the same price, and we should cut power consumption. I’m like, what are you giving up? You can’t–

Alan: I was going to say, what the hell, man?

Stefan: Yeah, technology gets a little bit better, but things are incremental. You can use a slightly faster processor and save 20 percent power, but you don’t just get all of that from one generation to the next. But–

Alan: *But…*

Stefan: The team really came together. But we actually managed to–

Alan: Patent portfolio.

Stefan: Yeah.

Alan: And the company has raised in excess of $100-million.

Stefan: Yeah, yeah. In terms of US dollars, it’s over $150-million at this point.

Alan: Holy moly.

Stefan: So it cost a lot.

Alan: That’s like 100 million Canadian dollars.

Stefan: [laughs]

Alan: [laughs]

Stefan: Yeah, yeah. It’s getting– it’s not quite, but close to like 200 million Canadian dollars.

Alan: Yeah, sorry, two hundred million Canadian. Yeah, that’s a lot.

Stefan: It just costs a lot to do this development. So we managed to build an entirely new architecture — a lot of the stuff we were working on for Gen 1 that wasn’t quite ready yet, plus some other breakthroughs. And we managed to make a generation 2 that is everything: smaller, higher resolution, much lighter. And the greatest thing is, if you can put the glasses on your face, you can see the display. So you can have one pair of glasses and 20 people can see it — that’s the part about being able to show people in a demo. For the people who saw Gen 1 and wanted to see Gen 2, the most surprising thing to them, the first thing, is, “Oh, I can see the display. This is so cool. It’s just here.” There’s never an issue there.

Alan: That’s gonna be a game changer, because doing demos and having people go “I can’t see it” is just like, “Eh.” [chuckles] So, yeah. This is fantastic.

Stefan: Definitely credit to the team. Even once Gen 1 was in the engineering phase, I had a whole team — 15 or 20 people, way before Gen 1 launched — working on Gen 2 technology. The beginnings of Gen 2 were almost three years ago, so we’ve been working on it for a long time. Even when Gen 1 came out, we said tech takes a long time, so let’s keep going with this technology pipeline now that we really, really know what we’re doing. And the team really came together and understands this very deeply. Another awesome thing is the suppliers. They believed so much in what we’re doing and in this product line — they really loved Gen 1 and participating in it — and so all of them make a lot of custom components for us. We do a lot of custom co-development with our suppliers, because they believe, like us, that the space is going to be huge. We couldn’t have done it without that close collaboration, either. Because, again, you can come up with something cool, but if nobody can manufacture it, it doesn’t matter.

Alan: Yeah. And if you look at just the engineering side of the hardware, that’s spectacular on its own. But really, when it comes down to it, once people put it on their face, once they buy into it — the engineering side, the hardware side, the ring, the glasses, the eye box — none of that shit matters at all. What people want to know is: what can I do with it, and how can this help my life, my day to day? And one of the things that I found incredible as a user experience — and you guys thought this through — was, as I was walking down the street one day, I’m talking to somebody and my alerts kept coming up. When you’re looking at the alerts, you’re looking straight through the glasses. And if you’re looking at somebody on the other side, it’s really strange to be reading a message while looking at them. It’s kind of like you’re staring at them, but you’re off in space — you’re not really there. And so one of the interactions I realized you guys put in afterwards is, if you’re having a conversation with somebody, it doesn’t show you alerts. I don’t know if it’s through the voice, through the microphone, or whatever. But that one simple user interaction — or taking it away, really — was amazing, because if I’m in a face-to-face conversation with somebody, the last thing I want is my messages popping up. And you know how annoying it is when people look at their Apple Watch in the middle of a conversation — you’re talking, and then all of a sudden they’re looking at their Apple Watch, and they’ve just derailed your whole conversation.

Stefan: Yeah.

Alan: That’s what you don’t want with these glasses. And you guys have solved for that.

Stefan: Yeah. And this comes down to that core tenet, too, of looking like regular glasses. You can have these glasses that look just like regular glasses, where the display is invisible and you can’t tell that somebody is interacting with it, because their arm is down by their side and they’re controlling the Loop. But if they have this stare, this look where they’re clearly looking at you but not paying attention to you, then it just ruins the whole thing.

Alan: It is such a weird phenomenon, I’m not going to lie. I’ve done it on purpose to people: just standing there looking at them, and they’re looking right at you, but you’re not really looking at them — you’re looking right through them, at the display that’s kind of above their head.

Stefan: Yeah. What you’re talking about is definitely one of those magical moments that our users really liked and wanted more of. And the other thing that people love — so, in terms of how people are using this: about half of all uses are when the glasses prompt you and say, hey, you have a notification — a message from another person, a message from an app, a notification that you have an appointment coming up, something like that. So half of all the times users use the glasses, they don’t even initiate it. That means we have to be so good at knowing when they want to use it and when they don’t. When you get that right, it’s amazing, because they get information they weren’t even thinking they might need, at just the right time. But if you do too much, then they’re going to shut the glasses off, they’re not going to use them, and that’s going to be frustrating. So getting that right — what you’re talking about, we call “context”: what is the user doing, what’s their environment, who are they with? We have done a huge amount of extra work on context, both in sensors and hardware and in artificial intelligence algorithms, to detect more and more of your environment. And the other thing we use context for: when people do click the ring and bring up the display — which is about half of all interactions — half the time they click the ring, they see exactly what they need on the home screen and never go anywhere else, because the home screen is contextual, too. The home screen knows you have a calendar appointment coming up. Or you’re playing Spotify music, so you might want your music controls and the song. Or — this is kind of cool, I don’t know if you’ve used them recently — if you have a flight booked and you’re at an airport, it shows you: here’s your gate, here’s where you need to go.

Alan: Oh, man. That’s like just that one feature alone.

Stefan: Yeah.

Alan: “Where the hell am I going in this airport?”

Stefan: It has a whole contextual airport experience when you’re there, too. So, being able to get it right: when people click the glasses and bring up the home screen, they don’t even have to go anywhere else — we’ve shown them exactly what they wanted with that alone, and then they close it down when they click the glasses again. The other half of the time, they scroll through and choose the right application, which is still pretty quick. But the bar we want to hit is that magical experience, where the glasses either tell you what you want to know or, when you ask them something, know exactly what you want. Being able to do that takes a whole new variety of sensors, and a lot more intelligence — guessing what people want and giving it to them. It’s totally doable from a technology perspective, and I think everyone’s going to love it with Gen 2; that’s one of the things we’ve worked on most with Gen 2 in terms of the user experience. So I’m glad you loved that conversation detection, because that’s basically the direction we want to take the product.
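Here is a minimal sketch of the two context behaviors Stefan describes: suppressing alerts during a face-to-face conversation, and picking a contextual home-screen card. The sensor signals, names, and rules are hypothetical; North hasn’t published how Focals’ context detection actually works.

```python
# Hypothetical context-gating logic: deliver only at the right moment,
# and make the first screen the one the wearer most likely wants.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    in_conversation: bool          # e.g. inferred from the microphones
    at_airport: bool               # e.g. location plus a booked flight
    playing_music: bool
    minutes_to_next_event: Optional[int]

def should_deliver(priority: str, ctx: Context) -> bool:
    """Hold non-urgent alerts while the wearer is talking to someone."""
    return not (ctx.in_conversation and priority != "urgent")

def home_screen(ctx: Context) -> str:
    """Half the time, one click should show exactly the right card."""
    if ctx.at_airport:
        return "flight card: gate and boarding time"
    if ctx.minutes_to_next_event is not None and ctx.minutes_to_next_event < 30:
        return "upcoming calendar appointment"
    if ctx.playing_music:
        return "music controls and current song"
    return "default home screen"

ctx = Context(in_conversation=True, at_airport=False,
              playing_music=True, minutes_to_next_event=None)
print(should_deliver("normal", ctx))  # False: don't interrupt the chat
print(home_screen(ctx))               # music controls and current song
```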

Alan: I think this is more than a hardware product. This is more than a pair of glasses that you can put some data on — it’s a new way to interact with the data around you. And I love the simplicity of it. It doesn’t have front-facing cameras — I don’t know, maybe Gen 2 does — but it’s not trying to world-map your whole view. It’s like having an Apple Watch in a more convenient place: all the data you have in your watch, without having to look at your watch or pull out your phone. It’s really wonderful. I have to ask, on behalf of everybody listening: when can we expect V2 to drop?

Stefan: It’s still on track for sometime in 2020. There’s definitely going to be a gradual reveal here — it’s not like we’re going to be quiet and you won’t see anything, and then they’ll just appear one day. There will be opportunities for people to see more, maybe some previews. We have a lot of units that we’re using right now in testing: the engineers are testing units for functionality, the production people are testing mass-producibility and reliability, the product people use them to develop software. Everyone’s wearing them regularly — our employees and some initial beta testers. And more and more will be revealed before they actually go on sale. People will definitely be able to find out a lot more about them, and hopefully even get some demos.

Alan: I would love to be part of that beta test. Hint, hint, nudge, nudge.

Stefan: I’ll pass that along.

Alan: [chuckles] Awesome. Is there anything else you want to share with everybody before we wrap this up? This has been a fantastic interview, and I think there’s so much to unpack. The North Focals 2 are coming this year, in the year 2020, which obviously makes sense. So I think people are going to get excited. And you said a similar price point to the– I think– what were the first ones, $700?

Stefan: Yeah, I think it was $599, the base price, and then $699 with prescription glasses. I’m not sure what the price is going to be for–

Alan: So under a thousand dollars, for sure.

Stefan: Exactly. Definitely not substantially different.

Alan: Incredible success story coming out of Canada, which is cool. And I have to ask you, living about 40 minutes away from the office, can I come for a tour?

Stefan: We do stuff like that once in a while. So I think we should get in touch with the production and marketing people and see if we can arrange something like that. Though obviously not a great time, the next two or three weeks.

Alan: [laughs] Obviously not now.

Stefan: But soon, I actually think. I think that’d be a lot of fun.

Alan: That’d be great. I would love to do it, share the experience with everybody, and show people: what does a company developing the future of digital eyewear look like from the inside? What is the research lab, and what are the people doing there? I had an opportunity to tour the Magic Leap head office, and unfortunately I can’t say anything or show any photos, but it was an amazing experience to see the sheer number of people working on different parts of things you wouldn’t even think of. You mentioned earlier just the voice detection, the AI — there’s so much going into this that’s beyond just a pair of glasses with a display in them. And I think what’s really important is that you guys are driving the future of human interaction in augmented reality. Thank you so much, Stefan, for being an industry leader and a pioneer in the space.

Stefan: Oh, you’re welcome. It’s so much fun.

Alan: With that, I can ask you one last question and then we’ll wrap it up. What problem in the world do you want to see solved using XR technologies?

Stefan: That’s a great question. I have definitely thought about it from a lot of different perspectives, but in terms of a problem that wants to be solved… I think we have this issue right now — and it’s especially true with the current health situation, where people are having to be socially distant from other people — where we always have this conflict with our devices. Do I look at what’s in front of me? Do I pay attention to this person in front of me, and have my awesome experience of the real world? Or do I go into my digital world, because I have a lot of information to manage there, and a lot of people I can only stay in touch with digitally? Right now I have to choose: look at my phone, look at my watch, or look at the real world. And every time I’m doing one, I’m not doing the other. I think there is the potential to not have that conflict anymore, where somebody doesn’t have to feel like they have to put their phone on vibrate or lock it in a safe in order to be present. What if you could be present and connected, and only get interrupted by what’s essential, without getting sucked into that phone vortex where you go to check a notification and end up on Instagram for an hour? What if you could have all the benefits of the connection of glasses, but without being distracted from what’s in front of you right now, and bring the best, simplest parts of your digital world into your reality, in a way that isn’t in conflict? I think people could be more present and more connected with everybody, and their experience of the real world wouldn’t really be affected. It would be so cool if things could move in that direction. And that’s what a lot of people at North are very passionate about: not giving us another tech device, but essentially giving us a way for tech not to be so distracting.

Alan: I love it. And we have to be very careful not to let tech completely envelop us. So that’s wonderful. Thank you, Stefan.

Stefan: It’s great. Well, thanks a lot for having me on.

Alan: Oh, it’s my absolute pleasure. Thank you so much. And thanks, everyone, for listening. Please hit the subscribe button so you don’t miss any future episodes. If you want to learn more about North’s glasses, you can visit bynorth.com. This has been the XR for Business podcast.

Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.
