A lot of XR technologies started as projects in the military sector, including AR. Today’s guest Nick Cherukuri is taking what he’s learned from years working in tech for the defense department and bringing it to enterprise — and eventually, consumers — with his line of AR glasses.
Alan: Hey, everyone, it’s Alan Smithson here today. We’re speaking with Nick Cherukuri, founder of ThirdEye, about their all-in-one AR glasses hardware and software solution for enterprise in logistics, manufacturing, and engineering, and how these tools are revolutionizing how we work. All coming up next on the XR for Business Podcast.
Welcome to the show, Nick.
Nick: Thanks, Alan. Glad to be on.
Alan: I’m really excited. You guys have a really original and awesome looking pair of glasses for the enterprise. Walk us through your X2 glasses and how are people using them? What makes them stand out from the competition? What’s the form factor? Just walk us through your solution.
Nick: Definitely. So just to provide some background on ThirdEye: while we may be a relatively new name in the commercial space, we have over 20 years of experience developing this technology for the United States Department of Defense. So that’s our origin story. And as you may know, a lot of the technologies we use today evolved from the military — for example, the Internet, GPS, even Siri for your iPhone, which originally came from SRI, right down the road from us in Princeton; it was developed there and Apple bought it off them. So the military has been a great incubator for these advanced technologies.
And augmented reality is definitely considered the next major tech platform. So we’ve been developing a lot of AR hardware and software applications for the military. And a couple of years ago, we decided to take some of our technical know-how and our leading engineers — we have state-of-the-art labs here in Princeton, New Jersey — and develop a commercial product. So we spun off into ThirdEye, and earlier this year we released our X2 mixed reality glasses. Here’s a high-level overview of the X2. We wanted to really address the customer concerns, and we felt this was an optimal time to get into the commercial market. We feel it’s too early for the consumer market right now, but the commercial AR market is definitely seeing a lot of traction.
So we wanted to develop a pair of glasses that really hit some of their needs. And one of the needs we heard was that the glasses had to be entirely hands-free. For example, many workers have safety requirements where they cannot have any wired packs. So you can’t have a wired processing pack or a wired battery pack; it all needs to be hands-free, compacted into one pair of glasses. That was perhaps the most critical requirement we heard: you had to develop the glasses in a way that’s entirely hands-free. So we made our X2 glasses entirely hands-free, in about a nine-ounce form factor, so it’s something that can be worn for a lengthy period of time. Another need we listened to was that it has to be attachable to a hardhat. The glasses could be as advanced as you want, but if they can’t attach to a hardhat or a bumpcap, and meet some basic ANSI industrial certifications — ANSI Z87 — then they can’t be used in these industrial settings. So that’s something we definitely incorporated into our glasses: being attachable to a hardhat and to a bumpcap.
Our glasses are Android-based, so they’re really easy to develop for, and we’re upgrading to Android 9 soon, so we can take advantage of features like built-in GPS. We have about a 42-degree field of view. A binocular field of view is something we have seen customers prefer over a monocular one, because it’s less eyestrain. Binocular vision is more natural to the human experience — we have two eyes, not one. So we wanted to develop a binocular pair of glasses, which we did. For brightness, we have about 300 nits in our optic system, so this can be used both indoors and outdoors, which is not the case for some other binocular glasses, which are more indoor products. It’s lightweight — less than 10 ounces — so it’s easy to wear for a lengthy period of time without any ergonomic issues. And we added some sensors, like a built-in flashlight and a 30-megapixel camera. So if you’re running a remote help application for your field workers, you can stream really high-resolution content from your field worker’s point of view to a remote expert hundreds of miles away.
And we also have built-in SLAM — Simultaneous Localization And Mapping. We developed our own proprietary SLAM software that runs on our glasses, customized for our hardware, so we can do inside-out six-degrees-of-freedom tracking and plane detection. So, for example, you could have a hologram of a 3D engine hovering in midair and the user could walk around it. There are only a few glasses right now that are able to do highly accurate SLAM, and ours has about a one-inch drift accuracy and can be used both indoors and outdoors. You can move your head around rapidly and the hologram will remain fixed in place. SLAM is something that lets enterprises do more advanced applications — not just 2D AR screens, but actually interacting with the real world. You can tag mixed reality content onto a giant machine, leave the room and come back, and it’s still tagged on that location very accurately. So, for example, a worker could have step one, step two, step three mapped out onto a factory floor, and be able to do that on a daily basis.
So it’s something that allows the glasses to actually interact with the real-world environment, which is where we see the cutting edge mixed reality software development happening for enterprise. So we’re really targeting the industrial, field services, manufacturing, and healthcare sectors. Those are our four main enterprise sectors and we are involved in a lot of other sectors. We have a lot of gaming and entertainment–
Alan: I’m sorry, Nick. What were the four? Industrial, field service…?
Nick: Industrial, field services, manufacturing, and healthcare. So I’d say those are the four. And we partner with many of the leading AR and mixed reality software companies, who have their applications running on our glasses. We want to create as wide a developer ecosystem as possible. We don’t want a closed system, where we only have a few apps or make it really tough for developers to build applications. We want to have an open-source operating system and provide a lot of documentation. We have a Unity SDK and an Unreal Engine SDK. It’s been fairly easy for developers to port applications or create apps on our glasses.
One last point — which is also important for a lot of our partners — is the price point. We’re at about an $1,800 price point, which is roughly half the price of some of the other binocular mixed reality glasses out there. We want to keep the price point as low as possible and really help these enterprises and software companies deploy at scale. We also offer leasing options, so you can spread the cost over a longer period of time. So those are some of the aspects — on the technical side, the software partner side, and the pricing side — where we’re trying to accelerate some of these enterprise deployments.
And what we’re seeing is, whereas before it might have taken close to 12 months to escape pilot purgatory into a larger deployment, now we’re seeing enterprises — because there have been success stories — deploy within three to six months, from the initial testing phase and getting all the key players on board — from the innovation department to the business department — to going to a larger-scale deployment. I think that’s because there have been a lot of lessons learned from other companies as to what works well for these smartglass deployments and what doesn’t.
We definitely try to incorporate a lot of the feedback and listen to the customer, even on something as simple as built-in device management. That’s absolutely critical for enterprise deployment, because you need a central IT person in the company to be able to control a deployment of, say, 200 glasses. For a long time that wasn’t integrated into the glasses directly, and it’s something we wanted to have work out of the box. So right out of the box, an enterprise can scan a QR code and register the glasses into their central database. There have been a lot of lessons learned, and we definitely wanted to have an open mind and take in as much of the feedback as possible. We think the enterprise market for the next five, six years will be the main push before these glasses are sold in the consumer market a little later on. But the enterprise market is seeing some tremendous ROI at the moment.
Alan: So you’ve talked about that, the glasses themselves, the software. Where would somebody begin? Like, what’s step 1 for a company that says, “OK, you know what? We see that there might be a value here.” How can they learn more?
Nick: Sure. So what we’re seeing in the enterprise space is they typically come at this from two angles. One is they actually have a real issue they want to solve with these glasses. For example, training new workers. They might have an aging workforce, and they need a hands-free system to effectively train new workers. And that’s something where a pair of our glasses, plus a software partner — like Atheer or Ubimax — can really help accelerate the training of the workforce.
Another way they come at this is their innovation department wants to see how AR could be used. They might not have a specific use case, but they have some general idea that augmented reality is an up-and-coming technology and something they want to incorporate in their enterprise. For those, we take a look at what their biggest issues are at the moment, and what software plus which glasses would work best for their use case. And it’s not always binocular glasses; sometimes it could be a monocular pair. But typically, we have found that enterprises prefer a binocular field of view, because it’s less eyestrain on their workers, it feels more natural, and there are more applications that can be run on them.
But I think right now, in nearly every industry — even really narrow, specialized ones like utilities or wastewater — there are AR software companies targeting each of these industries. So every industry right now has some augmented reality software that is really effectively targeting industry-specific use cases. What we try to do is partner with a software company. So depending on what the enterprise is — whether they’re a field service company, or telecommunications, or industrial, or a visually impaired group — we partner with the software company that has the best software for that use case.
So we’re seeing a proliferation of these enterprise AR software companies right now. And what’s really great to see is they’re not just making cool technology — they’re actually targeting a really specific enterprise use case. A lot of these companies have people who have come from those industries and are now starting these companies. So this enterprise space has a lot of software companies that are really targeting specific use cases in the industries they want to be in, and that makes it really valuable for a business to use their software, because it’s really targeted. I think that’s one reason why it’s taking off at the moment.
Alan: If you put your futurist hat on — we’ve been talking about enterprise augmented reality and mixed reality — when do you think mass consumer adoption of this technology will occur, and when do you think Apple is going to come out with their glasses? Because this is going to change everything, when they come out with their glasses.
Nick: Definitely. And that’s a great question. I think long term, what everyone’s predicting is that this is going to replace your phone, and it’s just a matter of what technology has the right features to make consumers want to replace their phone with a pair of glasses. What we see is a couple of core challenges right now that need to be addressed for consumer deployment. Field of view is probably the biggest one. The natural human field of view is roughly around 210 degrees, and right now the widest field-of-view glasses in mass production have between a 40- and 55-degree field of view. I know there are some prototypes with 70-, 80-, 90-degree fields of view, but in actual mass production, that’s the field of view that’s there. And I think consumers would want a really wide field of view, as opposed to enterprise, where a narrower field of view still achieves their ROI use case, so they don’t really need a massive one. But the field of view definitely needs to be increased, and every year it is being increased by some of these leading optics companies. Once that progresses closer to what feels like a natural human experience, I think that will really help propel these smartglasses toward consumer deployment.
Another important factor is the form factor. For consumers, they want it to look like a cool pair of Ray-Bans. And even technical components like the battery — there needs to be a way to reduce that in size. Right now, some of these consumer companies are finding a way around that by having a wire that goes behind your ear to connect to your phone as a processing pack, and that’s the battery source. So that’s one way around it. But we think consumers want it to be entirely hands-free, just like enterprises want their devices to be hands-free. They don’t want to walk around with wired packs or anything; they just want a pair of nice, cool-looking Ray-Bans they can wear.
So I think the field of view, the battery, and getting these sensors and processing down into a really tight, small, all-in-one form factor — without any attached wires or processing packs — is what’s needed for this to eventually replace your phone. Until then, there are going to be workarounds, such as using a wired pack or using your phone as a processing pack, and there may be some consumer uses for that approach. But I think that’s the eventual prediction for what’s needed to replace your phone.
Alan: You skirted around the timeline.
Nick: Sure, I did. [laughs]
Alan: [laughs] Nobody wants to put their name on this. [laughs]
Nick: That’s hard to predict right now. But there are these great market studies — from Goldman Sachs and these massive companies — that predict that within the next 10 years or so, there’ll be roughly 25 million glasses sold and the consumer version will come out. So I think definitely within the next 10 years we’ll see a standalone pair of glasses, because everyone wants high-end mixed reality technology in a pair of Ray-Ban-looking smartglasses, which right now isn’t electronically feasible. But who knows? In 10 years that could definitely happen. So it’s hard to predict for consumers.
Alan: But here’s the thing, and the reason I brought up Apple: they’re quietly building whatever they’re building — nobody really knows. And there was this rumor that came out a couple weeks ago that Apple’s releasing their AR glasses in 2020. Knowing the technology — I’ve been to research labs where they’re pushing out 70, 80 degrees of field of view, I’ve seen the stuff that’s still in the lab — it’s still not even close, even though it’s remarkable. It’s not even close to being what we need for consumers. And I really have my reservations about Apple coming out with anything next year that will serve the needs of the mass consumer market.
Nick: Definitely. And from Apple’s perspective, I’m not sure if they want to connect to the iPhone, to use that as a processing pack to power the glasses. So I’m not sure exactly what approach they’re taking. But definitely, to get the features everyone wants in these glasses, I think we’re still a long way off, especially on the field of view and getting everything all-in-one into a really small form factor. There are still some really critical components — the battery, for example — that are going to take some time to reduce in size.
Alan: What are some of the use cases people are running right now that are generating the most ROI? Like, if I’m a procurement manager at a manufacturing facility, what can I do to wow my executive team by firing up a couple pairs of these glasses and starting right away? What’s the lowest-hanging fruit where I can get immediate ROI? Because that seems to be how these things are being unlocked: you do a pilot or a small thing, you show an amazing ROI, and then it unlocks the budget to roll this out at scale. So what is that?
Nick: There are so many applications, but the way we like to talk about it is this: just like computers have the Office suite — Microsoft Word, Excel, PowerPoint — that nearly every organization uses, for these smartglasses there are some overarching applications that we’re seeing really commonly used across industries. The first is remote assistance, so someone can see what you see; it’s great for training new employees and providing remote help. The second is manufacturing checklists and QR code scanning — so if you’re in a warehouse or manufacturing center, getting step-by-step instructions overlaid for your specific task flow, or being able to scan QR codes with the glasses and get visual instructions. And the third most common application is 3D twins and more advanced mixed reality, where you have a 3D digital rendering of real-world objects. For example, with mixed reality and SLAM, you can scan your environment into a 3D model and tag any information you want onto the real world using that virtual 3D model.
So those are the three applications that we’re seeing really commonly used. And what we envision — based on the feedback we’re seeing — is that those will be the Microsoft Word/Excel/PowerPoint of AR and mixed reality, just really commonly used across a lot of industries. If you look at most of the AR software companies at the moment, most of them fall into one of those three application categories. And I think the reason is there’s such an immediate ROI — it just makes so much sense. You’re hands-free; you don’t need to carry around a 50-page manual or tie up your hands. 80 percent of the global workforce use their hands while they work, so they need an entirely hands-free digital interface while still being able to walk around and do their daily tasks. So those applications are seeing some tremendous ROI — we’re seeing close to a 40 percent savings in task time for some of these companies, a lot of savings in training new workers, and improved worker safety. And you can really customize the task flow for your individual company’s needs. So those are the three most common applications, and there are some really great AR software companies developing for them.
Alan: My last question for you, Nick. Because this has been really amazing, looking at ThirdEye’s technology, looking at your kind of overview of the marketplace, how people are using it. What is one problem in the world that you want to see solved using XR technologies?
Nick: One personal favorite of mine — and that’s the beauty of mixed reality — is definitely the healthcare space. We’re seeing some really tremendous real-world use cases there. There’s a roughly 200-million-person worldwide vision-impaired market — people who can’t see properly or have some type of vision impairment. And with XR, they’re able to really change their lives. With the addition of 5G and a cellular chip built directly into the glasses, they can go about their daily lives using these glasses, and it really helps change their lives on a really personal level. The other use cases are great in terms of worker efficiency and improving KPIs, but in terms of actual impact on someone’s personal life on a daily basis, some of these healthcare applications — and specifically the vision-impaired community using these mixed reality glasses — are really helping change lives in a positive way. So many times you hear of technology having a negative impact on the world, but with this XR technology and these small, streamlined glasses, we’re seeing some really positive and heartwarming use cases. That’s great to see, and the healthcare space is definitely a personal favorite of mine for some of these AR and mixed reality glass deployments.
Alan: Well, thank you, Nick. Thank you for taking the time to join us. Where can people find out more information about ThirdEye?
Nick: Sure. So you can visit our website at thirdeyegen.com, follow us on social media. And if you ever want to reach out, just hit me up on LinkedIn or social media, and I’ll definitely try to respond. So we want to help expand this community and we go to a lot of the major trade shows. So you’ll probably see me there. And looking forward to continuing being in this space and seeing where it goes.
Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.