Paul Travers has been in the XR business long enough to remember the early headsets, which were not exactly elegant in design – he describes one of his early models as a football helmet. But today, Vuzix has managed to shrink a ton of XR potential into sleek, sexy sunglasses that would look good on any goth noir vampire slayer. He chats with Alan about the advantages of svelte headsets, from military applications to making driving safer.

Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Paul Travers. Paul is the founder of Vuzix and has served as its president and chief executive officer since 1997. Prior to the formation of Vuzix, Mr. Travers founded both e-Tek Labs and Forte Technologies Inc. He has been a driving force behind the development of products for the consumer market. With more than 25 years’ experience in the consumer electronics field and 15 years’ experience in the virtual reality and virtual display fields, he is a nationally recognized industry expert. He’s joined by Vuzix’s head of business development, Matt Margolis. If you want to learn more about the Vuzix platform and their headsets, you can visit vuzix.com. Paul and Matt, welcome to the show, guys.

Paul: Hey, Alan. Thanks for having us.

Alan: It’s my absolute honor. You guys are making augmented reality headsets that people actually will want to wear. And I think it’s amazing, your Blade glasses look like a pair of awesome sunglasses. They’re lightweight. They’re wireless. They’re every– they’re all the things. How long has it taken you guys to get there? I mean, you started in 1997. You must have gone through massive iterations along the way.

Paul: Yeah, Alan. I mean, we’ve made all the big stuff, the crazy things. It really started in ’93 or ’94, when we started shipping our very first VR headset, the VFX-1. And if you look it up, you’ll see the VFX-1 is a football-helmet-sized gizmo. Then in ’97, I actually bought out all the outside shareholders and started Vuzix. A little bit of history there: we started in the defense space. We were making thermal weapon sight engines that go into the light/medium/heavy thermal weapon sight programs for DRS and Raytheon. And doing that, we got an opportunity to work with the Special Forces guys. If you think about it, these guys are carrying around 300 pounds of gear. They’ve got their laptop. They’re basically the ultimate mobile wearable tech guy. And at night, they would light up like a Christmas tree, so they put a poncho over their head. They had all this gear on, and they came to Vuzix and said, look, could you guys make a pair of Oakley-style sunglasses? They called it the Oakley Gate. And they said, if we could do that, half the military would buy these things. So even all the way back then — it was ’97 to 2000 — these Special Forces guys wanted cool. They wanted lightweight. They wanted it truly functional. And so over the years, we’ve come out with a lot of different devices, and with each iteration we’ve been pushing on making them smaller and lighter. We were talking a little bit earlier about the top-down versus bottom-up approach. I mean, there’s some really cool technology out there that’s doing all the spatial computing and the like, but it’s big. Vuzix is taking the lightweight, trim, wearable-all-day side of it, but highly functional. When you’re looking at streaming video applications where you’re doing see-what-I-see for maintenance, repair, and overhaul, or you’re in a warehouse all day long taking stuff out of that warehouse, you don’t want a great big, heavy thing.
You want a super lightweight device that you can wear all day long. So at the end of the day, you don’t have headaches from just sporting the stupid thing.

Alan: I can totally relate there. The Hololens, while a wonderful device — man, it’s so front-heavy, and they fixed it a little bit on the 2, but these are things that– it’s not acceptable to wear something that heavy on your face for work. It’s just not acceptable, and it’s gonna cause problems. So you’ve taken the weight off and created a device that, frankly, is sexy. I mean, people want to wear these glasses. They’re awesome.

Paul: Thanks, Alan. I appreciate that. And that’s the key to success, I think. If you put it on and in half an hour you want to take it off, that’s a fail. And most companies will never deploy that. There’s a lot of companies doing experiments, but the ones that are finally getting to deploy are the guys that can literally give it to their employees and they’ll use it all day. And if they do use it, they get an ROI that can be significant. And that doesn’t need full-up spatial computing in many, many cases. You can do a lot.

Alan: Most of them don’t.

Paul: Yeah, most of them don’t. I mean, we’re talking: you’ve got a person, they’re using a tablet or a phone, but they’re mobile workers. They have to use their hands. That’s the spot where Vuzix is at. We are working to help those mobile workers, getting them to a position where they’ve got access to the information, but they don’t have to hold a tablet in their hand to get it. That’s the area where Vuzix is seeing success and starting to see some pretty significant rollouts. And 2020 is gonna be amazing. I think you’ll see, even through the rest of 2019, a bunch of really cool deployments starting to happen around these kinds of lightweight wearable computing devices.

Alan: So you said that 20 years ago, the military was like, if you make an Oakley pair of sunglasses, we’ll buy them all. Fast forward 20 years: is the military one of your biggest customers?

Paul: They haven’t been, actually, because — believe it or not — around seven or eight years ago, we sold our defense division. We haven’t really been in the defense space since. However, with the partners that we sold it to, we renegotiated the relationship a little bit. We had– we gave them an exclusive, and they’re now partnering with us beyond that exclusive. And so in the last eight or nine months, the defense side of the business has actually been coming back to Vuzix in a big way. There’s a couple of things that Vuzix brings to the table. First of all, in the first responder marketplace, our current products are really starting to open up some cool doors. First responders, security markets, and the like. We can share a bit more about that in a bit, but–

Alan: Let’s unpack that for a second, because one of the things you have on the front page of your website is “Vuzix smartglasses get automatic facial recognition designed for law enforcement.”

Paul: Yeah.

Alan: That’s awesome. I want police to be able to look at somebody and detect whether they’re a threat or not. That’s a no-brainer.

Paul: It is. And I know that it can be a controversial topic. But if you look at the cross section of America today and look at some of these large venues where somebody shows up sporting some weapons that are designed really for– I wouldn’t say weapons of mass destruction, but, you know, when you–

Alan: Not nice things.

Paul: Yeah, many, many rounds in a minute. You’d like to think that against those kinds of folks, there are some weapons that you can use. And for security folks in large venues — where maybe there’s 20,000 people showing up for a concert — they give these security guys a book of pictures and say, “Remember these folks. They’re the ones we don’t want getting through the gate.” And it’s like, really? So what they’re doing now is using glasses like Vuzix’s glasses with the cameras built in, and/or — we have company partners like Sword, who have a separate sensor head, effectively, that works with an iPhone and transmits the feed to the glasses. And now you have an AI engine that can help you pick out these people of interest in the crowd. It’s not about recording folks who might be coming in the front gate. It’s about simply helping these guys who are trying to do a good job of preventing the bad guys from getting into these venues. And it’s a great example. We have guys that are doing it with a wearable computing server that goes on your belt, holding upwards of a million faces in the database and running at real-time frame rates. You know, there can be upwards of 15 faces in a frame of video, and within a second it will determine if any of those people are in the database. So it’s pretty good there. And then there’s other folks.
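The pipeline Paul describes — detect faces in each frame, then check each one against a watchlist database within a second — is typically built as a nearest-neighbor lookup over face embeddings. Here is a minimal sketch; the embedding sizes, names, and similarity threshold are all made-up toy values, and a real system would use a dedicated face-recognition model to produce the embeddings:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between one embedding and a matrix of embeddings.
    return (b @ a) / (np.linalg.norm(b, axis=1) * np.linalg.norm(a) + 1e-9)

def match_faces(frame_embeddings, db_embeddings, db_names, threshold=0.6):
    """Return the watchlist identity (or None) for each face in a frame.

    frame_embeddings: (n_faces, d) array, one embedding per detected face
    db_embeddings:    (n_people, d) array, the watchlist database
    """
    hits = []
    for emb in frame_embeddings:
        sims = cosine_sim(emb, db_embeddings)
        best = int(np.argmax(sims))
        hits.append(db_names[best] if sims[best] >= threshold else None)
    return hits

# Toy watchlist of 3 people with 4-dim "embeddings" (real systems use 128+ dims).
rng = np.random.default_rng(0)
db = rng.normal(size=(3, 4))
names = ["person_a", "person_b", "person_c"]

# Two faces in the frame: one is nearly identical to person_b, one is a stranger.
frame = np.stack([db[1] + 0.01 * rng.normal(size=4), rng.normal(size=4)])
print(match_faces(frame, db, names))
```

Scaling this to a million faces at video frame rates is mostly an indexing problem — production systems replace the linear scan with an approximate nearest-neighbor index.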

Alan: And if you unpack that just a little bit further, this kind of eliminates personal biases as well. You’re using AI to identify potential threats. You’re not using AI to say, “OK, you look like you’re from Iran. So I should pull you over.” Like, this is actually a much better tool than just giving some photos to some people and saying, here, pick these ones out of a haystack.

Paul: Yeah.

Alan: It’s crazy.

Paul: You’re absolutely right. And I don’t want this to come across the wrong way, but to some people, some people all look the same.

Alan: It’s true. Listen, you get 20,000 people, they all look the same. They look like a bunch of faces.

Paul: Yeah, they do.

Alan: So, being able to laser target people of interest, I think, is big. Let’s move on from– unless there’s anything else you want to talk about with first responders, because I think there’s also some stuff in the medical. Like the first responders from the medical standpoint.

Paul: Well, that’s where I was going to continue the conversation, actually. So there’s this whole idea of: you’re in an emergency truck, and this particular person has a problem you don’t know how to deal with. They’re starting to use our glasses to stream in real time to a doctor, and the doctor can help with certain treatments before the ambulance even gets there. And time counts in these situations when it comes to saving lives, as you can imagine. We’re also doing that same thing with companies like 1Minuut, where they’re doing remote telemedicine. You’ve got a person who has a basic degree to treat people and to nurse people, but not enough to know whether or not that person should be visiting the hospital. So they’re remotely in the field, and then a doctor will call in, diagnose, and say, look, give him an aspirin and we’ll see him in the morning — or get him in an ambulance, this is actually critical. So there’s remote medicine from that perspective, and then from the training side, you have a doctor who’s doing an operation and there are 15 people seeing through his eyes, because the glasses stream HD video while he’s looking at the operation in real time. So there are many, many applications in the medical space starting to be built around our glasses.

Alan: Medical seems to be that sweet spot, that XR in general — virtual/augmented reality, mixed reality — seems to be a really good use case. And it’s unlocking huge potential to save lives, and that’s really, really important.

Paul: Yeah. It’s wonderful.

Alan: So what are some other use cases for these glasses? Let’s first talk about the different types of glasses that you have, because I know you’ve got the Vuzix Blade, which are these sexy Oakley-looking glasses, and then you’ve got the M400 glasses for more industrial use cases. What are the differentiators between the different glasses, and what are the use cases that you’re seeing in the field?

Paul: So, all the way back to when the Special Forces guys asked if we could make Oakley-style sunglasses, Vuzix has been working on the optics and the display engine technology to get to the point where these things can look like Oakley-style sunglasses. And I have to say, what we’re doing today is pretty darn awesome, for sure. But we’ve got next-generation tech — which we can, again, talk about in a minute here — that is going to take yet another step towards cutting the frame sizes down and making the look and feel even sexier. The Blade has our waveguide tech in it, and it does have this sort of cool look and feel. And the optics are different from the M Series enterprise products that we make, in that they’re optically see-through. So when you put the Blade on, it’s like wearing a regular pair of sunglasses, but floating out in front of you — just like the HUD in a car or a fighter pilot’s cockpit — images just float out in space. So they’re real trim-looking glasses, Android, everything built into them, but they’re optically see-through. Now, on the enterprise side today, with the M Series products, we started with the M100, moved to the M300, which was Intel-based, and just recently announced the M400, which has Qualcomm’s XR1-series silicon inside. And it uses an occluded display. Now, this is like looking through a camcorder. The thing that is really nice about the M400 is, you’re looking through this thing and the blacks are pitch black. The contrast ratio, I think, is 10,000 to 1, because it has an OLED display.

Alan: Wow.

Paul: Yeah, it’s really beautiful. And the camera that looks out the front is a– Matt, is it 12 megapixel camera?

Matt: 13.

Paul: 13-megapixel camera, image-stabilized, auto-focus. It’s just beautiful. And when you put those two things together, working in concert with this XR1 processor, you can do some amazing stuff. Streaming video today on the M300 series — not that I’m throwing it under the bus — it works really hard to do even a wide VGA stream at 20 or 30 frames a second. The M400 can do 720p at 30 frames a second. Snap, snap, snap. It’s just beautiful. And it records video at the same time. So it’s like having a digital camcorder with 4K recording capabilities. I mean, this thing is a racehorse.

Alan: So, OK, let’s just stop there for one second. So the M400s have a 4K camera up front and a 720p display inside. So why would you want that? I actually interviewed the team at PTC today, which goes really hand-in-hand with this, because one of their killer applications is their Vuforia Chalk system, where you can have an expert — like you said, maybe a doctor or a team of doctors — looking over your shoulder– well, not really over your shoulder; they’re literally looking through your eyes, because they’re able to use that 4K camera to project back and give information in real time, as needed, to the person in the field. And I think this is a use case that’s going to unlock a huge amount of value for enterprise clients, because if you’re in a factory and that machine — whatever it is you happen to be working on — goes down, downtime can cost anywhere from a thousand dollars to millions of dollars an hour. Being able to pull up an expert for somebody that’s already there in the field, without having to fly somebody in — this is huge. And your glasses enable that.

Paul: They do. And they do an amazing job. I would suggest they’re probably the most state of the art pair of glasses on the street today that can do this, because the XR1, the processing, the graphics processing capabilities, everything built in there and that beautiful camera that we have. It just– it’s really hard to compete with this one. And there’s probably 10 companies that do remote support software using our glasses. Vuzix has its own sort of modest one called Vuzix Remote Assist.

Alan: OK.

Paul: Then there are guys like PTC that can actually do the rendering on top of the camera image, to give you an augmented image. That does this Chalk thing where you actually circle stuff and the like, and it stays locked there. In fact, with the Vuforia side of the stuff from PTC, you can look at an engine and have the oil filter highlighted, telling you that it’s got to come off first. It can put torque specs on the engine, all locked to it in real time. So you can do the remote assist, but you can also do call avoidance with stuff like that, where you have the glasses on and you’re working through it on your own on this piece of equipment.

Alan: Oh, that’s right. Because once you have somebody do an assist for one person, it’s recording everything.

Paul: Yes.

Alan: So you can use that assist as the general assist for anybody that does that. So before they even call somebody, they can help. Oh, wow.

Paul: Yeah. And there are a lot of companies that have a lot of equipment in the field, right? And they don’t want to have 500 tech support guys all waiting for a phone call. They want people to be able to do it themselves first, so they like the call avoidance side of it. But to your point, think about being on an oil rig, and the equipment goes down. That tech can’t just willy-nilly make a fix, right? Because if he does it wrong, the rig could blow up and you end up with another Gulf of Mexico mess on your hands. So there are all kinds of protocols that go into the fix that gets done. Normally, what would happen was a $50,000 custom helicopter ride out to get the thing fixed in the middle of the Gulf. Then two or three days later, it’s finally back up and running — and it’s millions of dollars a day — versus being able to do this remote assist call and get the instructions on the fly. You can do it in literally hours, by comparison. So remote assist is going to be a very big piece of business. And it’s anywhere from Case equipment — big tractors in the field — to companies that are looking at bundling our glasses with their equipment, so that there’s a way to get tech support without having to put somebody on an airplane.

Alan: It’s interesting you say that. I’m speaking at a printing conference this week, and my original presentation was about bringing print to life with AR and that sort of thing. As I started to think about it, I realized these are people that are making printers — big-format printers and stuff. They’re not really all that concerned about bringing print to life. They want to make sure that their machines can be fixed fast. And printing is one of those things — if anybody’s ever had to unjam a complicated printer, it’s a pain in the ass. This is a tool that can give those manufacturers an upper hand in keeping those machines up and running faster.

Paul: Yes, no doubt about it. That’s the remote support side of this. And you can imagine market after market after market where these kinds of things are. The ROI is measured in one use. It’s paid for itself.

Alan: Yeah. Or ten times over. I mean, the cost of the glasses is– the M400 is $1,500. That’s like a second of downtime in an oil rig.

Paul: Right, right. And I think one of the things you should notice here, through many of these descriptions — again, I’m not trying to throw the competition under the bus here — but full up spatial computing just is not required. That’s why we prefer this, the ground up approach where we’re putting the right technology in to deliver an experience that’s required to solve problems today first. Ultimately, we’re convinced this tech is going to shrink. It’s going to come down. It’s going to end up being like the Kingsman style glasses. [chuckles] But the technology, there’s work that needs to get done before you can do that in a form factor that gives you everything that you want, plus has that sci-fi look and feel.

Alan: I have a really great pair of North glasses.

Paul: That’s a step in the right direction in some way.

Alan: A step in the right direction. But the field of view is so small, it’s actually not that useful.

Paul: And it has a pupil that’s so tiny that if your eyeball gets moved off the glasses in any direction, you lose the image and–

Alan: I had to go and get them refitted the other day, because you get them and you start showing people — and people have got big fat heads and stuff — and all of a sudden I put them on and I can’t see anything anymore. There’s a very, very small sweet spot where your eye has to be perfectly aligned with the image. I mean, that’s not useful for enterprise, at all.

Paul: There’s a mix — a nice balance between field of view, the one-size-fits-all side of it, and the technology that does full-up spatial computing, which is a big, bulky, all-in thing. So there’s a right spot to be, where it’s highly functional but it’s also highly wearable. And that’s where Vuzix is pushing to be.

Alan: Right in the middle. And that’s the sweet spot.

Paul: Yes.

Alan: So let’s talk about the Blade then. Because those things are– what is the difference? So the Blade, you’re kind of able to see right through?

Paul: Yes. Well, so– if you wouldn’t mind, let me take a step back to the M400.

Alan: Sure. Please do.

Paul: We talked about a few applications there, like the remote assist and the remote support. In the whole world of logistics, there are big opportunities coming here also. The world of brick and mortar is– every other time you turn around, another Sears is going out of business. And it’s because of companies like Amazon that are out there, and everybody’s buying online and using FedEx as the logistics partner. But there are a lot of brick and mortars, if you think about it. North America alone has thousands and thousands of stores that are effectively an amazing distribution channel already. And devices like these glasses can enable employees in those stores to become pickers. So some of these big retailers are getting themselves into a position where they can compete with the online guys, because they have distribution already in hand. They just need to turn their stores into picking warehouses.

Alan: Wow. That’s an amazing use case.

Paul: You are going to see a lot of it coming up. See, these companies aren’t all rolling over to Amazon, frankly.

Alan: No, of course not.

Paul: And again, you can use a form factor– in fact, in some cases they want this technology-looking form factor, so that when people come into the stores and see people picking, the company is perceived as an advanced, forward-looking sort of business. So the M Series products have a bunch of enterprise applications that range from warehouse picking, work instructions, and remote support right on through to people even turning around aircraft at the airport. There are many, many applications coming — pretty exciting. And the Blade — a nice roll into that here — has this look and feel that’s starting to be a normal-looking sunglass-style design. And it delivers an experience much like the original videos that Google came out with for Google Glass. It’s got this nice field of view out in front of you. You’re in the library, and it’s telling you where your friend might be in the library. Instructions walking down the street. All of those kinds of things — but not in a little tiny field of view up in the right-hand corner of the glasses. It’s right out in front of you, very comfortable. You turn the glasses off, and they’re absolutely clear to look through. You turn them on, and you get this beautiful imagery out there. And it’s done because Vuzix has waveguide technology that we’ve been working on now for years.

Alan: All right. I’ve read a lot about waveguides, but I still don’t really understand them. Can you walk us through the basics?

Paul: Yes. So this is how it works. You have a lens, and it’s flat, but it looks a lot like the outline of a regular glasses lens. And what we do is put a little hologram on it — really a surface relief grating, which equates to the same thing in some ways. But bottom line, it’s these little 150-nanometer-deep, 300-nanometer-pitch scratches on the surface of the glass, in a little tiny circular dot, maybe two or three millimeters in diameter. And we project light from a display projector — just like the front projector that you use in your living room to watch movies, but tiny, custom built by Vuzix. If you were to point that thing at the wall, you’d see an image up on the wall. Well, instead we inject it into that little two-to-three-millimeter circle. And when the light hits the circle, it bends into the glass itself. So now you’ve got a one-millimeter-thick piece of glass, and the light is bouncing back and forth — away from your eye, towards your eye — while propagating towards the bridge of your nose. So it starts at your temple, bouncing around back and forth. And at some point it hits another set of gratings in front of your eye that allows the light to leak out and project the image out in space. So there’s a really thin piece of glass, you’ve injected this image into it, and instead of projecting onto the wall, you project it out in front of you through this waveguide. And because of the way our input and output pupils work, you can put your eye anywhere in the output set of gratings and see the image.

Unlike North, where there’s a little tiny pupil, this one’s as big as we want to make it. It can be an inch by an inch. It can be the whole glass. And anywhere you look through it, you see this image out in space in front of you. So they’re very, very forgiving. And the field of view is defined by the projection engine that injects the light into it — and a bunch of other things, but for simplicity’s sake, let’s just say that’s it. So with that, you can get small displays and thin optics, and put them in form factors that start to look like regular glasses and give you a much, much more forgiving display system. So you want one-size-fits-all: you put them on, and the image is just beautifully out there. We put that in the Blade in our first version. We have another version of the Blade that’s coming, Blade 2, which has an even sexier front end on it. And if you project down the road a little ways, we’re developing display engines that will be a third of the size of our current display engines and a fraction of the power. We’re talking two watts versus two hundred milliwatts for a fully lit-up engine. So, a significant reduction in power, a huge drop in size, and nothing but sexier and more sexy over time here.
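The in-coupling behavior Paul describes follows the standard diffraction-grating equation, n·sin(θ) = m·λ/d: the grating bends the light steeply enough inside the glass that total internal reflection traps it until the out-coupling gratings let it leak toward the eye. Here is a rough numeric sketch using the ~300 nm pitch mentioned above; the 530 nm (green) wavelength and the n = 1.8 high-index glass are illustrative assumptions, not Vuzix's published figures:

```python
import math

def incoupled_angle_deg(wavelength_nm, pitch_nm, n_glass, order=1):
    """First-order diffraction angle inside the glass for light hitting
    the in-coupling grating at normal incidence: n*sin(theta) = m*lambda/d."""
    s = order * wavelength_nm / (pitch_nm * n_glass)
    if s > 1.0:
        return None  # evanescent: this diffraction order isn't coupled in
    return math.degrees(math.asin(s))

def critical_angle_deg(n_glass):
    # Total internal reflection threshold at the glass/air boundary.
    return math.degrees(math.asin(1.0 / n_glass))

theta = incoupled_angle_deg(530, 300, 1.8)   # ~79 degrees inside the glass
theta_c = critical_angle_deg(1.8)            # ~34 degrees
print(f"diffracted angle: {theta:.1f} deg, critical angle: {theta_c:.1f} deg")
print("trapped by TIR:", theta > theta_c)
```

Because the diffracted angle is well beyond the critical angle, the light ricochets down the one-millimeter slab toward the nose bridge exactly as described, rather than escaping out the faces of the lens.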

Now, the Blade itself, because it’s got this really cool form factor, it’s opening up opportunities from enterprise to prosumer that just haven’t been there before because it’s finally a pair of glasses that people would actually wear. And we talked early on about the security marketplace. Security is one of them. I mean, you know, wearing a Hololens as a security officer–

Alan: [laughs] You’d look like an idiot.

Paul: Yeah, that’s right. You won’t be taken seriously with that.

Alan: But the question really comes down to when are you getting Wesley Snipes to be your spokesperson?

Paul: It’s so funny you say that. We were at CES last year, and the guys– I don’t know where they were, but they happened to run into him when he was there, right? When they were out there at the show and they showed him the glasses because they’re the Blade, right?

Matt: [laughs]

Paul: And he just loved them. And we got a couple of pictures with him wearing them.

Alan: Oh, that’s so cool.

Paul: But he won’t let us. He’s like, “Well, you really probably shouldn’t.” because he didn’t really own that trade name, right? So.

Alan: Yeah.

Paul: But yeah, no, he’d be a great spokesman for it. And they’d look good on him at the same time, so.

Alan: That’s awesome. Yeah, I figured you’d be like, that is literally his MO is those glasses and like they’re perfect.

Paul: They are, actually.

[laughs]

They’re almost modeled after him, frankly.

Alan: And I love that the passion of your team to track him down and get him to try them on. That’s awesome.

Paul: Yeah, yeah. They’re– my guys are proud of what we’re doing here. We’ve been at it for a long time. Most of the folks here have been with me through it all. Although I will admit we’ve gone from 20 employees to 80 in the last three or four years.

Alan: Wow.

Paul: But, you know, everybody’s a shareholder here and they’re all very proud of the fact that we’re doing this really cool stuff. And quite frankly, competing with some of the biggest names out there today.

Alan: Agreed. Yeah. I mean, the work you guys are doing is pioneering — not only the technology side, but the adoption side. I keep saying it’s not a technology problem anymore. We’ve got the technology. It’s an adoption problem. We’ve got to get people to buy these things and use them. And I think it starts with enterprise, obviously.

Paul: Yeah, we’re with you. I mean, even the Blade right now, at 1,000 bucks apiece — they’re not really inexpensive. And it’s mostly enterprises that are using them. But I have to admit, I love to fly drones now, and it’s because of my Blade.

Alan: Oh, so cool.

Paul: It is really cool because you can look through the glasses, see the drone flying out in front of you. And at the same time, get the video feed through the glasses.

Alan: Oh my god, that’s so cool.

Paul: Yeah. And it’s with a single connection to the controller. Or you can run it wi-fi, wireless to the glasses. And so it’s really a cool way to fly a drone.

Alan: Oh man. This is so cool. I’ve actually tried the DJI drone pilot goggles, whatever, the VR ones. Oh my god, they’re so nauseating. [chuckles]

Paul: Yes. I think most–

Alan: It takes a special person to get inside that thing.

Paul: A lot of those were designed for the racer guys, you know.

Alan: Yeah. OK. So one of the backbones of any hardware product is software. And without a solid software operating system, you really have nothing. And I know you guys have been working hard on your operating systems. You wanna talk to us about the Vuzix BladeOS?

Paul: Sure. I mean, we have done a lot. First of all, it’s based on standard Android. And the cool thing about the M400 is, it’s the latest version of Android, and I think you’ll see it supported all the way out to Android 10. We built our own launcher and a lot of custom UI-based interfaces. We have a ton of software on our developer site, on our website. We have Tier 1 and Tier 2 support for people who are writing applications for the Blade and/or the M400, quite frankly. And so there are all these applications out there, et cetera. And by the way, on that front, since we’re here: we do have a developer contest with upwards of $110,000 worth of prizes. I think November the 4th is when it ends, and it’s around the Blade. There’s — gosh — a pile of people that have signed up to develop software for it. Anybody who wants to do that: if you put in an application that works with the glasses — there’s a minimum standard there; it can’t just say “Hello world” — you’ll win a free Blade, also. So it’s a great opportunity to get in the game, to learn about it, and to have your expenses covered for the cost of the hardware, if you put the app in.

Alan: Incredible.

Paul: On that software front, if you go to our developer site, you’ll see there are graphical styles — recommended styles, much like an iPhone and an Android have certain ways icons should look, et cetera. All that stuff is out there. Shortly, there are even going to be ONVIF security camera driver support examples out there. There are all kinds of demos and examples of streaming video for security applications and the like, basically to get you started. There’s a ton of stuff out there, and the OS itself has been completely reworked to run in our form factors, with APIs for voice input, and APIs for barcode scanning and QR code scanning also available.

Alan: You can do barcode scanning as well?

Paul: Yeah, we can. We work with a bunch of companies that have written barcode scanning software that they’re selling as tools, and we also work with ZXing (“zebra crossing”) and the like. The drivers are built in, with a common set of API calls. So whether you’re using QR code scanning or barcode scanning for simple log-in kinds of things, right on through to barcode scanning in a warehouse, there are tools available to just make function calls to our glasses to do that — even when you pair the Blade to your phone, let’s say. The Blade’s got a full ecosystem that’s been written for it. There’s a companion app that runs on your phone. When you pair the two, you literally put the companion app on your phone, it practically comes up in pairing mode, and it puts a QR code on the phone itself. You look at it with the glasses turned on, with the camera running, and boom — it does the pairing automatically. So it’s really simple to connect to your phone, and it will run with Android and/or iOS phones; there’s a companion app out for both. And that companion app allows you to easily push notifications from any application on your phone that might receive them to the glasses. So if you’re walking down the street and a text message comes in, the glasses wake up and the text message comes up on the glasses, just like it might on a smartwatch. Or turn-by-turn instructions can come up and do the same thing in the glasses. You can also turn those alerts on and off — as you run the companion app, you can select what you want messaging from, so you’re not swamped. Because some people have messaging from LinkedIn, messaging from Twitter. Yeah, it can be overwhelming. Bling bling, bling bling, bling bling. [laughs]
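The QR-code pairing flow Paul describes — the companion app shows a QR code, the glasses scan it and connect — can be sketched as a simple one-time token exchange. Everything below is hypothetical (the payload format, field names, and port are invented for illustration; Vuzix's actual pairing protocol isn't documented in this conversation, and the QR rendering/decoding steps are omitted):

```python
import base64
import json
import secrets

def make_pairing_payload(phone_name, port=5555):
    """Companion-app side: build a one-time pairing payload to show as a QR code."""
    token = secrets.token_hex(16)  # one-time pairing secret
    payload = {"device": phone_name, "port": port, "token": token}
    return base64.b64encode(json.dumps(payload).encode()).decode(), token

def parse_pairing_payload(qr_text):
    """Glasses side: the camera decodes the QR into a string; parse it to learn
    where to connect and which secret to present."""
    return json.loads(base64.b64decode(qr_text))

qr_text, expected_token = make_pairing_payload("my-phone")
parsed = parse_pairing_payload(qr_text)
assert parsed["token"] == expected_token  # both sides now share the secret
print("paired with", parsed["device"], "on port", parsed["port"])
```

The appeal of the scheme is that the QR code carries the shared secret over an out-of-band channel (the camera), so the user never types anything — scanning the code is the whole handshake.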

Alan: Actually, one of the things that I thought was a really interesting feature that North glasses just pushed out, I don't know, a couple weeks ago: they basically turn off notifications when they detect that you're having a conversation with somebody. And I thought that was really interesting, because the last thing you want is to be distracted when you're actually having a physical one-to-one conversation with somebody. Have you ever been in a meeting where people are checking their phones? But even worse is the watch; people will be checking their smartwatch while talking to you. You're talking to someone, and all of a sudden you ask them a question and they're not there anymore; they've kind of drifted off to check their messages on their watch. And glasses are going to make that even worse, so I think having that functionality of knowing when you're having a conversation with somebody, so you focus on the people in front of you, is great.

Paul: I rather like that, too. And you can tell, people wearing our glasses, they get into the glasses. Although I will say, compared to walking in New York City and the like with your face buried in your phone, this can be a better experience. For instance, with Yelp, trying to find a restaurant: instead of having your head down, with the glasses on you just look around, and it tells you where the restaurant is as you're looking. There's one on the other side of the building? It will tell you that it's over there, and you can get there just by looking and walking in the direction that you're looking. You get information that's related to the world around you. So in those cases, it kind of makes the real world work better. That's the whole idea behind AR in the end. And even though it's simple AR, Yelp works really well in our glasses for that kind of an application. I do like the whole "turn off notifications while this person that's close to me is talking." That's an interesting one.

Alan: Yeah. I mean, they don't have a camera on their glasses, so I'm assuming it's just based on the fact that you're talking. But you guys have a camera, so you could literally do facial recognition and say, "OK, somebody is within three feet of me; don't show the display when there's a conversation going on and somebody's in front of me." I think it's a great feature. Your display's a mono-display, right? So it's in one eye?

Paul: Yes, that’s correct.

Alan: The one eye– so you've got the display in one eye and then you've got the camera in the other. And then, this is a little bit off topic: what are your concerns around people driving with these technologies?

Paul: The number of car companies that have every intention of implementing AR and glasses inside the car is surprising to me. But I have to say, a heads-up display can make you much more situationally aware. For instance, we are working with some motorcycle companies, and if you look down at your motorcycle's console to see how fast you're going, or maybe look at your phone mounted up front for directional information, in the time it takes to look down and look back up, you can be in the middle of an accident. Whereas with the glasses on, if the imagery is floating down the road, you don't have to look anywhere except down the road, at the same focal point where your eyes are already safely looking. So you can be much more situationally aware than when looking down. In a car, when you look down at your dashboard on the right to look at the map, that's taking your eyes off the road. Whereas with the HUD in my car, it's all in the HUD and I can just look down the road. I think the same thing is going to be true with glasses, and it'll get better with glasses, because the camera feeds around the outside of almost every car, collision avoidance, all of that, will be able to be portrayed in your glasses. So when you look to the right, you can literally look right through the car as if the quarter panels weren't there. So it's about being situationally aware. Now, I'd be the first guy to say that watching Netflix while driving down the road is not going to happen.

Alan: Yeah, it's just a plain bad idea.

Paul: In this case, there’s going to be driving modes, just like there is in your car now. And your phone will not do certain things when you’re driving down the road. You’ll see the same thing happen in glasses, I believe.

Alan: Yeah, I think so. I mean, when I first got the North glasses, I was walking down the street and I almost walked into some poor woman, because I was paying attention to the little image and not my situational awareness. I only did that once. [laughs] Within the first hour. [laughs]

Paul: You learn that pretty quick. But I will say that I think with binocular systems, this gets way better. With monocular systems, what happens is the display engine puts the image out in space somewhere, horizontally, left to right. And based upon where that is, your convergence system kicks in; your eyes have a tendency, when they're looking at something that's only in one eye, to converge as if that's where the object sits out in space, left and right.

Alan: Yeah.

Paul: Based on focus, there are some disparity issues with focus and with convergence, et cetera. Most of that gets way better if the images are focused at infinity and they're binocular. So I think you'll see that binocular systems, in the long run, will be the better way to do this. But the display engines today are currently way too big to make really sexy binocular glasses just yet. That's going to change so fast your head will spin, though.

Alan: You know, I've taken the long view on all of this. I'm saying, "OK, by 2025 we'll have AR glasses with all the bells and whistles of Magic Leap and HoloLens, but in the form factor of the Vuzix Blade."

Paul: [chuckles] Yeah. And maybe sooner. [laughs]

Alan: Hey, I'm going to go with 2025. If it's sooner, great. That's awesome. Nobody ever really slams you for making predictions too far out. They always kill you if you make a prediction too early.

Paul: You know the story of the frog that was sitting in the water as someone slowly turned the heat up, and it wasn't smart enough to jump out.

Alan: Yep.

Paul: This industry is going to happen like that. All of a sudden, we're going to look back and say, "Holy mackerel, look how far we've come. This is amazing now."

Alan: Okay. Let's just put our "look how far we've come" hat on for a second here. In 2014, I tried VR for the first time, and I ended up with an HTC Vive, the Pre, the first one. And I mean, that thing was like a giant fish tank on your head. Even Pimax, they've got this 8K VR headset, but it's like strapping a twenty-inch monitor to your face. We're going to look back at this and laugh. But if you look at where we've come, in VR specifically, we've gone from these giant supercomputer-driven things to the Quest, which is a standalone headset with a wide field of view, a four-hour battery, all the rest of it, in three years.

Paul: Yeah.

Alan: And AR glasses. I mean, the Vuzix Blade, that is a pair of glasses that you can wear all day, everyday, and that didn’t exist four years ago. I mean, you guys were probably working on it, but it wasn’t something that you could commercially buy. And now it’s available, and it’s just happening faster and faster and faster.

Paul: It's very true. And the optics systems are getting better along the way. And with MicroLED coming, the display engines are going to shrink hugely. And when you only light up the pixels that you want, power consumption is going to go through the floor. I'm telling you, man, Kingsman's–

Alan: And you power all that using cloud computing and 5G.

Paul: Right.

Alan: My last interview was with Sandro Tavares from Nokia, and they build the 5G infrastructure that we'll all rely on. And it's interesting: if you fast-forward to, let's call it 2025, or push it out to 2030, we all wear glasses. The glasses are super lightweight. The compute power is in the cloud, not on our faces. So they're super light, super cheap. And now, our mission, we're launching a new company next year, and the mission of the company is to democratize education globally by 2037. So if you buy into the fact that we'll wear glasses in 10 years, those glasses will be running in the cloud. Add another five years to figure out how to make content at scale, and we should be able to theoretically give away the world's most advanced, efficient, effective training and education to every human on earth. For literally nothing.

Paul: That’s a great vision. And I could agree with that.

Alan: Great. Because then I don’t think I’m so crazy. [laughs]

Paul: I tell you, Alan. Connecting the digital world to the real world is going to change so many things coming up.

Alan: I agree.

Paul: There’s a lot of people that say to me, “I’m never going to give up my phone.” And my mom still has a wired connection to her phone, to the wall. So I can’t discount all of that. But you are going to be able to do things that just can’t be done any other way. And there are going to be so many people that want to do those things. They won’t go back to a phone. They might still have a phone in their pocket for other use cases. But these things that are coming are game changing.

Alan: Agreed. You know what? Listen, we still have TVs. VR is not going to replace TVs, and AR is not going to replace your smartphone. TVs and computers didn't replace books; even with tablets, hardcover books still outsell digital copies. So when we invent new communication mediums, they don't replace the previous ones, other than color TV replacing black-and-white TV. The majority of the time, they don't replace previous communication mediums. They just make a new one.

Paul: Yeah, radio is a case in point.

Alan: Yeah, we still have radios in every car. We still use a printing press. These technologies didn’t go away, they just became part of a complete communications tool box.

Paul: Yep.

Alan: And I think your Vuzix glasses are one tool in an arsenal that is creating enormous value right now for enterprises.

Paul: Well, it’s off to a good start at Vuzix. I mean, it’s been long years in the making, but finally it’s reached that point of critical mass. And I’m looking forward to this fall to start sharing with more folks some of the things that are happening in that regard.

Alan: Well, I can't wait to see all the cool stuff that's coming out, and I can't wait to get ours. I'm pretty excited to start building some cool stuff on it. So thank you so much for taking the time to share the information about Vuzix, and to share your passion for this as well. It really comes through.

Paul: Thanks, Alan. We like to tell the story, so we appreciate guys like you helping us get the word out about Vuzix, too.

Alan: Well, it’s a great story. You guys have been in it from the beginning and grinding it out, because I know what it’s like to build hardware. Hardware is– there’s a reason it’s called hardware, because it’s hard.

Paul: [laughs] Touché!

Alan: You know, I promised my wife, I said we will never make hardware again. And I've stuck true to that promise. But it's one of those things where I tip my hat to you guys, because you've taken on a world-class challenge and you've met it with success. So I wish you all the best in that.

Paul: Yes. Thank you very much, Alan. We appreciate that.

Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.