Brought to you by MetaVRse

If you sell a couch that comes in 1,000 different patterns and colours, what’s cheaper: printing out a swatch for each variation, or creating a configurator that lets you do that digitally and photo-realistically? The obvious answer is the ethos behind ThreeKit’s product customization software, which CTO Ben Houston joins Alan to discuss.

Alan: Welcome to the XR for Business podcast with your host, Alan Smithson. Today, we have a very special guest, Ben Houston. He’s the founder and CTO of Threekit. Threekit’s a platform used by some of the world’s top brands like Crate & Barrel and Steelcase. And what they’re able to do is create amazing visual customer experiences through virtual photography, augmented reality, and 3D imagery, saving companies the enormous amounts of time and money it takes to get these photographs and set up studios. And at a time of COVID, we just can’t do that anyway. So I’m really excited to invite Ben to the podcast today. Ben, thank you so much for joining us.

Ben: Hey, thank you, Alan, for such a great intro.

Alan: Oh, it’s my pleasure, man. You guys have really been working hard in the space. You’ve been in the space since, what, 2005, I believe?

Ben: In the 3D space for quite some time, but in doing 3D for e-commerce, we’ve been doing that since 2015.

Alan: Wow. So five years of experience. Let’s kind of go back to 2015. What did you start doing and what are you doing now? What are the services that Threekit offers, and how has that changed from 2015?

Ben: When we first got into it, we were actually– our background’s Hollywood visual effects. We started making– this company originally was creating software for Hollywood films, and we did that quite successfully on a lot of films. And then what we did is we started moving that 3D content creation to the web. Once we had done that, we did that around 2013. In 2015, people started using our 3D content creation for the web, for e-commerce applications. Specifically, they were doing it for configurable products. So interactive 3D product configurators. This is– Steelcase is a good example of an early adopter of this technology. We started doing that, and that had a lot of success, especially for companies that have massive configuration problems, such as Steelcase’s office furniture. As we evolved down that path, the next thing we started doing was virtual photography or also called synthetic photography. That’s where you will create a number of renderings of products for companies. A good example of that is Crate & Barrel. We’ve created hundreds and hundreds of thousands of renders for them, of their furniture, and it all looks real. And so now they don’t have to build every piece of furniture and every fabric and then take a picture of it. We can just render those off.

Alan: Gotta be some massive cost savings. We’ll get into the numbers later. But wow, that’s like– if you don’t have to take photographs, I mean, I can only imagine a photo shoot’s expensive to begin with.

Ben: And every one of their sofas is a couple of thousand dollars. So it’s just simply not possible. And then what we’ve done more recently is the rise of AR. That has really been embraced by furniture retailers specifically, and so that allows them to see the furniture, how it would fit in their room or office. And so those are the three main offerings that our platform has.

Alan: To recap, you have configurators, so companies that have — maybe it’s a chair — and it comes in 50 different colors, and five different lumbar supports, and people can configure their office furniture, or chair, or any product, really. The second is — which is a term I’ve never heard — synthetic photography, basically being able to create catalog images and website images, without ever using a camera.

Ben: Yes. And then the third is using those same 3D assets that we can use for the interactive 3D configurator and the virtual photographer. We can then also export or create AR experiences using those same assets, the same models, the same materials.

Alan: That’s incredible. So now a retailer has a one-stop-shop to do a lot of the things that maybe were being done by several firms.

Ben: Yes, it is a consolidation, and it does lead to a lot of efficiencies, because you only have to make your materials once, you only have to make your models once. And that configuration data is often run off a lot of business data, from either an ERP or some e-commerce integration. We only need to do that once and then you get all of these outputs.

Alan: Well, that’s interesting that you mention that, because one of the things that I noticed about your system is if you go to threekit.com and hover over “Solutions,” you have all the e-commerce platforms listed: Magento, Salesforce, Shopify, WooCommerce, BigCommerce and SAP. Maybe explain how that kind of would work. If I’m a retailer and I sell something, how would it work through your system?

Ben: OK, well, it’s basically an integration. Our configurators are a set of options that you specify, and then we can show that as interactive 3D, AR, or the virtual photography. We can integrate into the e-commerce platform’s option system and then it just automatically follows whatever they pick in their e-commerce system. Our pictures or interactive 3D would follow, and then you can add that to cart, and then we automatically generate a thumbnail of that configuration, and then you can check out. It can also then do post-order workflows. So often — like we have some companies that create 3D prints — they configure a product for just-in-time manufacturing, so we can help create that STL for you.
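The integration flow Ben describes might look something like this in code — a hypothetical sketch, not Threekit’s actual API. All names and the render endpoint are invented; the point is the data flow: the configurator follows the platform’s option picks, and a thumbnail of the exact configuration is generated at add-to-cart time.

```typescript
// Hypothetical sketch of syncing a configurator with an e-commerce
// platform's option system. Names are invented for illustration.

type OptionSelection = Record<string, string>; // e.g. { fabric: "navy-twill", legs: "oak" }

interface ConfiguredItem {
  productId: string;
  selection: OptionSelection;
  thumbnailUrl: string; // generated when the item is added to the cart
}

// Encode a selection deterministically, so the same configuration always
// maps to the same (cacheable) thumbnail render.
function thumbnailUrlFor(productId: string, selection: OptionSelection): string {
  const query = Object.keys(selection)
    .sort() // stable key order => stable URL => render-cache hits
    .map((k) => `${encodeURIComponent(k)}=${encodeURIComponent(selection[k])}`)
    .join("&");
  return `https://render.example.com/${productId}/thumb?${query}`;
}

function addToCart(productId: string, selection: OptionSelection): ConfiguredItem {
  return { productId, selection, thumbnailUrl: thumbnailUrlFor(productId, selection) };
}

const item = addToCart("sofa-42", { legs: "oak", fabric: "navy-twill" });
console.log(item.thumbnailUrl);
// https://render.example.com/sofa-42/thumb?fabric=navy-twill&legs=oak
```

The deterministic URL is the design choice worth noting: because every combination of options maps to one stable address, renders can be cached and reused across carts, emails, and ads.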

Alan: Oh wow. That’s like next level. So you’re not only visualizing and showing it, but you’re then able to take whatever they’ve designed themselves, throw it over to 3D printing and– oh wow, that’s a great pipeline.

Ben: Yeah, we do that actually as well for BOMs [bills of materials]. We can also do that for like 2D print files, like if you’re making custom t-shirts or some type of custom cell phone case. So there’s a lot of different outputs and we’re adding more all the time, because then not only are you getting that 3D configuration, you’re actually getting a full business transformation, where you can cut out steps of your standard fulfillment process. It just becomes fully automated.

Alan: That’s amazing. And that’s what everybody needs right now, especially in a time where technology seems to be kind of blinding us to our human interactions. What this, I believe, will allow e-commerce companies to do is — when this is all automated, you can spend more time answering customers’ questions and really developing a personal relationship with your customers. I think this is– it’s technology that allows for a more personal touch. It’s great.

Ben: Yeah, and just a lot of cost savings and scalability, too, because when you don’t have someone rejigging all your orders just to get it over to film it, yeah, your business can just scale.

Alan: This is incredible. Is it mainly large companies or is there small, medium sized companies as well? What are you seeing?

Ben: We see various different companies on the smaller end. We work with– I have to be careful. I can only talk about the ones that are public.

Alan: Oh, it’s fine. You don’t have to mention any names, but just maybe examples of how people are using it. Because if you’re a small retailer, a lot of times these solutions are just completely out of reach. And so I don’t think that’s the case in your world.

Ben: Yeah, like we work with– Pukka is a good example, they make custom hats. So these are like baseball caps or beanies — in Canada, they’re called toques. And so you can configure any colors you want, you can do custom logos, as well as fonts on it. You can even upload SVGs and it’ll display them. Before, they’d actually have to do sort of an iterative process. Someone would request what they want. They’d send back a picture of here’s what their designer thought they wanted, and there’d be a lot of interaction. Now they do that fully on the web without any human involved, and they get a live preview, and then they can submit that to order. So that just opens up a whole new market they didn’t have before.

Alan: OK, so let me ask you straight up: numbers. Does that increase sales?

Ben: Yes. Audi — not our customer — saw a 66 percent increase in user engagement once they switched to a 3D car configurator from a 2D solution. And those who view 3D products are 11 times more likely to buy than those who don’t.

Alan: OK, hold on. Hold the phone, hold the phone. So you’re saying Audi used a 3D configurator and saw a 66 percent increase in activity, in time spent on an object, or what was that?

Ben: Yeah, that’s Audi reporting that they saw 66 percent increase, when compared to a traditional 2D solution.

Alan: Wow. And then 11 times more likely to buy.

Ben: If they saw a 3D product. That’s according to [Audi].

Alan: That’s incredible. So how does an organization get started with this? Let’s say, for example, I’m a manufacturer of desks and I want to have my desks in 3D. Do I have to have CAD models? Do I come to you, and do you have a service that builds them? Walk us through the process from “I have a product that could use this configuration, it’d be great in AR, and I would love to do photographs.” What is the step-by-step process of an organization looking to engage with Threekit?

Ben: It all depends on what your product is. If you have CAD models, that’s a really good place to start. If you don’t have CAD models, there are other solutions. We can 3D scan your furniture and then turn that into high quality computer models, or we can even work with photos. It is best, though, if we do have CAD models, but there are other solutions. Like we do purses and suits and many organic-type objects without CAD models. That’s the modeling side of things. There’s also the material side of things. So if you have standard materials such as stainless steel, or a ceramic, or standard plastics, we have material templates for those. If you have more exotic materials, such as very specific car paints or various woods, we can scan those. We have one client, Tailored Brands. They own Moores Clothing in Canada and also Jos. A. Bank in the States. They have 1,300 suit fabrics, of which 600 are dark blue or black.

Alan: [laughs] Wow. How do you differentiate between those?

Ben: So we do high quality scans, and those high quality scans can capture the subtleties in the bumps, the sheen, the different thread patterns, so that each one of those looks unique and accurate. Because when you have that many to choose from, you really want to be accurate — what people see in your virtual photography, they’re going to expect. So, yeah, that’s where we get the models and that’s how we get the materials. The last aspect of that is we have to get your configuration data: what features go with which other features, and what is the total number of features? We can do that in a number of ways. One of the best is to do an ERP integration if it’s constantly changing, or we can do one-time integrations or get it from your e-commerce system, or we can give you more full control over that if you want to have some way of customizing that and controlling that yourself.
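The configuration rules Ben mentions — which features go with which — can be modeled minimally. This sketch invents its own rule format for illustration; a real deployment would pull these rules from an ERP or e-commerce integration rather than hard-code them.

```typescript
// Toy compatibility check for configuration data. The Rule shape here is
// invented for illustration, not any platform's actual schema.

interface Rule {
  when: [attribute: string, value: string];    // if this option is picked...
  forbids: [attribute: string, value: string]; // ...this option is not allowed
}

function isValidSelection(
  selection: Record<string, string>,
  rules: Rule[],
): boolean {
  return rules.every(({ when, forbids }) => {
    const triggered = selection[when[0]] === when[1];
    const violated = selection[forbids[0]] === forbids[1];
    return !(triggered && violated); // a rule fails only if both sides match
  });
}

// e.g. a hypothetical chair whose lounge frame can't take the tall lumbar support
const chairRules: Rule[] = [
  { when: ["frame", "lounge"], forbids: ["lumbar", "tall"] },
];

console.log(isValidSelection({ frame: "lounge", lumbar: "standard" }, chairRules)); // true
console.log(isValidSelection({ frame: "lounge", lumbar: "tall" }, chairRules));     // false
```

Even this toy version shows why Ben stresses getting the data from the system of record: the rules change constantly, so they belong in the ERP, with the configurator just evaluating them.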

Alan: So is there a self-serve platform as well where people are– I’m assuming you have some sort of content management system for customers to get in and make changes to SKUs and things like that?

Ben: Yeah, we have a number of customers who are doing self-service, and that can vary from just managing which configurations you want to be showing. Let’s say you’ve got a new product, and you want to show only some of those configurations and release maybe more materials a couple months from now, or discontinue some options. Our customers can also self-maintain if they know some things about 3D; they can add their own materials and their own models as well. And we have some clients doing that. But that’s too much for other clients. So we can handle many different levels of self-serve, depending on the needs of the client, their sophistication, and size.

Alan: That’s amazing. One of the things that you didn’t touch on, that I think is going to be a fairly large part of this is, we talked about configurators, synthetic photography, AR. But what about using 3D to boost ads? I know you partnered with Google recently, so maybe you can speak to that.

Ben: That’s actually from the same platform. Once you already have your models, your materials and your configurations, we can then output any one of those to the Google ad platform. And therefore you can– some of our clients have trillions of combinations; on average, they probably have tens of thousands, hundreds of thousands of combinations. You can pick any one of those to send as an ad. And then depending on feedback, it’s very easy for you to start making A/B tests between different types of configured models.

Alan: That’s great.

Ben: The other thing we can do is we can also send those virtual photographs to Google ads as well. So you can do both 3D ads, as well as traditional image-based advertising.

Alan: Have you been doing 3D ads long? Is it– do you have enough data to say do the 3D ads convert higher than regular 2D ads, or maybe there’s not enough data yet?

Ben: Google has published some numbers on that. We’ve been doing 3D ads through Google for about a year, I guess. It depends on the product, and also the success of AR also depends on the product. Doing AR on a very big product actually can often confuse users. So the smaller the product — without it being like sort of jewelry — the better AR is for it.

Alan: Really? Huh.

Ben: It varies depending on the type of product. Images, as well. One cool thing with our image integration with Google ads is you can have someone try a configured product — say you’re doing a suit. And we have– I think we support over a trillion combinations for Tailored Brands. We can then create an image of that, a final image, and then Google can use pixel technology to start showing you that custom image that no one else has seen. And it can start showing you that on other websites, to encourage you to come back and complete that transaction.
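A minimal sketch of that retargeting flow: the page records the configuration a shopper composed, and a later ad impression requests a render of that exact configuration. The endpoint and field names below are invented for illustration — this is not Google’s or Threekit’s actual API.

```typescript
// Hypothetical retargeting flow: record a viewed configuration, then build
// an ad-image URL for that same configuration. All names are invented.

interface SavedConfig {
  shopperId: string;
  productId: string;
  selection: Record<string, string>;
}

// What a tracking pixel might report when the shopper views a configuration.
function pixelPayload(config: SavedConfig): string {
  return JSON.stringify({
    event: "view_configuration",
    shopper: config.shopperId,
    product: config.productId,
    selection: config.selection,
  });
}

// What the ad server could later request: a render of that same configuration,
// so the shopper sees the exact suit they built rather than a stock photo.
function adImageUrl(config: SavedConfig): string {
  const encoded = encodeURIComponent(JSON.stringify(config.selection));
  return `https://render.example.com/${config.productId}/ad.jpg?c=${encoded}`;
}

const viewed: SavedConfig = {
  shopperId: "s-123",
  productId: "suit-9",
  selection: { fabric: "navy-herringbone", lining: "team-logo" },
};
console.log(pixelPayload(viewed));
console.log(adImageUrl(viewed));
```

The same mechanism extends to Ben’s tie example: swap one attribute in the saved selection and re-render, and the ad becomes a personalized cross-sell image.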

Alan: Wow.

Ben: Interestingly enough, if we knew that you had bought a suit, we could actually start suggesting ties to you, and we could make images of the suit that you bought along with some suggested ties — if there’s a special in ties — and then show those. So that you can now have imagery that’s custom on demand for you, based on your style preferences or your previous purchases.

Alan: Wow. There’s so much to go through here. We talked about clothing and furniture, but some of the industries that you serve go pretty wide. You’ve got here furniture and home goods, commercial furniture, building materials, clothing and shoes, manufacturing, medical devices, jewelry, watches, kitchen, bath, and luggage. Of those industries, which ones are seeing kind of the greatest impact of this technology, or is it all equal?

Ben: Each one is being impacted quite a bit. It’s a massive success when you have a highly configurable product. I think the challenge is, our solution doesn’t work well if you’re selling one product that doesn’t have configurations, because you only need one photo. If you sell one product and it comes in four different colors, it’s getting harder to justify our solution, because you probably have four copies of the product already, and you could take photos. So I would say that the more configurable your product is by your customers, the more valuable our solution is.

Alan: I’m looking at your clothing page and it’s a suit jacket, and the suit can be blue or gray or beige. But the interior — and this is where men like to get customization — the interior of the fabric could be thousands of different fabrics.

Ben: We did some really interesting work to get the NFL teams into Jos. A. Bank as interior patterns. That’s the suit brand owned by Tailored Brands.

Alan: Oh, man, that’s super cool. Who doesn’t want to have their football team in their blazer jacket, built into the liner? And the possibilities with this are endless. OK, so since we’ve kind of come from 2015 to 2020, what do you think is going to be the trend moving forward to, let’s say, 2025?

Ben: What I recommended or what I was suggesting earlier is where you know people’s previous purchases, and then you can start making contextual suggestions of additional purchases that match that. So let’s say we know that you’ve frequented this furniture retailer and you had made a number of purchases previously. Well, then maybe they could show you pictures of your suggested purchase in context, or we could even learn maybe based on the style of that furniture — because we know more about that furniture than people normally do with just a SKU — we can actually go, “Well, this style of furniture actually looks really nice with this other one.” So if you have brown end tables, you probably want to have a couch that matches that if you’re upgrading your couch. So I guess that’s concierge-ing: suggesting additional products that go with your existing purchases. And we could show that visually. And we can also just make those recommendations based on that deep knowledge of the products.

The other one, which I’m really excited about, is at-home trial. So with this LiDAR that Apple is pushing pretty hard — I’m pretty sure Google is going to follow suit pretty quickly, probably one year behind, and then they’re going to have LiDAR on their top end phones, probably coming to Samsung — try-on is going to be made really possible, try-on of suits. You’ll be able to try on that suit and then see it. So that’s AR, plus really high quality LiDAR scanning, skeleton detection, and then clothing simulation. And then you can try on a baseball cap at home. You can try putting on a tie. It could even start suggesting that this is the right size for you. It could suggest, “Well, we know you have that suit, you don’t even have to put it on. But I can start suggesting ties for you.” Scan your room and it knows which furniture you’ve already bought and then it makes suggestions based on that. So AR becomes easy.
You don’t even put like the new furniture down, you just scan your room and it goes “By the way, here’s where I think the table should go. This is the table I think you should buy.” Then it goes “Well, here’s four other suggestions that I also think are [suitable].”

Alan: That’s crazy.

Ben: It’s going to be pretty cool. I wonder if you can even scan your videos. “By the way, here’s the lights I suggest you change, and this is how your room would look different.”

Alan: Oh, how cool is that.

Ben: Yeah. So that’s where we’re going. And it’s great that Apple is pushing this pretty hard. There may be AR glasses as well. There’s a lot of rumors [that] Apple’s working on that.

Alan: And Facebook announced their research version of them; Project Aria.

Ben: Yeah. So that would be very cool. And all of these things just become a bit more integrated if it’s in your glasses. Though I do think that’s like– I don’t know if it’ll have wide adoption by 2025.

Alan: No, I think the glasses, my guess is– well, who knows, man? Nreal is launching their glasses as consumer glasses with all the telcos. I know they’re working with Deutsche Telekom and T-Mobile and a number of other companies. So these glasses are coming to the market. I just don’t think there’s a massive enough base of content that is viewable on them. So we’ll have to see. But there’s going to be some options.

Ben: Yeah, yeah. LiDAR plus AR glasses is a very interesting combination.

Alan: I totally agree. And the new iPad Pro has a LiDAR sensor built in, so you can now scan your room. And there’s rumors that the iPhone 12 will have LiDAR scanning as well on the rear camera. They already have it facing your head — which the face unlock uses — but having– being able to LiDAR scan your room and understand that, not only the image of the room, but the actual structure of it, where the table is, the floor– the analogy I tell people is, it’s like Pokémon Go if the Pokémon could hide behind things.

Ben: But then also it will recognize your furniture.

Alan: Yeah.

Ben: So you don’t even have to enter it all in. Like, a lot of people want to do home planning. But right now, if you try any of those home planners, they’re like, “OK, enter your walls. OK, now put your existing furniture in.” And if you want to do that, you’re in for hours of work. But in the future, it’ll just scan your room.

Alan: Yeah, it’s going to be so easy. And I know you’ve seen demos like that, because I’ve seen some of them, like, “Wow, the future is already here.” [laughs]

Ben: I think that going into stores will actually be like a downgrade from what you could do in your home. The only thing you’re getting is you’re able to touch it. You’re able to touch that material, able to sit on that sofa. But other than that, you’re not missing out that much.

Alan: So, OK, what are some of the things you’re working on now, that you’re super excited about that is coming in 2021, let’s say, what are some of the things you’re really excited about? Maybe push new technology? What do you want people to know that’s coming in 2021 for you guys?

Ben: We’ve got a big push on configurable AR, and we’re going to be pushing even harder on that. We’ve got wall placement coming. We’re looking at a technology called WebXR. But right now, our configurable AR technology already works both on Apple and on Android phones. And then I would say that’s probably– a lot of it is also increasing the ROI. So ensuring that the platform is low cost to you and that it provides the most benefit.

Alan: I think this is where a lot of early AR and VR startups in general missed the boat. They started creating a technology, and then they had the technology and it was looking for a solution. Being able to create ROI directly and demonstrate that ROI, that’s really what customers at the end of the day want, especially B2B customers. I’m enamoured: I’m looking at a DeWalt drill right now on your demo site. Their demo site — just so you know, if you want to go and try these things out — there’s a whole site. If you go to threekit.com, go to the “Resources” tab and down to the “3D Product Library“. And there are bracelets, backpacks, sound systems, high heels. Is there a product category that you guys *haven’t* touched yet, that you think would make a good one?

Ben: We really have a quite broad set of clients.

A rad bike helmet designed using ThreeKit’s 3D configurator.

Alan: Yeah, it doesn’t look like– it looks like you cover everything: there’s a tent here, a watch. It’s really, really great. And the fidelity on these objects is really beautiful. And it’s all real time and it’s all running on the web. So one question I actually had from a technical standpoint is how hard is it to put these things on the web, compared to something like an app or maybe using Unity for a configurator? What is the difference between workflows on Threekit, versus Unity or even Unreal?

Ben: Well, let’s see. I talked about the evolution. We used to do everything through a previous platform we had called Clara.io. A lot of the current technology evolved out of Clara.io’s online 3D editor that we made. We made a new platform called Threekit, specifically for the 3D e-commerce market that we’re in now. What it did is it made a number of major changes from Clara.io that made it very powerful. It actually has models separated from materials, separated from — say — textures. And then it also has an e-commerce catalogue and option set. And so all of that together makes it very low cost to scale up, both in terms of products and options across your catalog.

In some ways the asset management system is very similar to Unreal Engine. But the advantage that it has over, say, Unreal Engine is everything’s on the web, optimized for the web, and our runtime is very, very small. So Unreal and Unity do have abilities to go on the web. But if you try and use them, you’ll find out that everything is very slow, because it’s loading a full game engine, or at least a partial game engine, on the web. And you have to create everything in that game engine. Whereas with ours, you create it in an online platform that actually is designed around e-commerce and configurations, and then everything is optimized to be small. So that’s sort of the difference.

The other thing that’s different from those existing platforms is that we have a render system that’s very similar to how it works in visual effects. So this is how you create your stages, your sort of backdrops, and then you create your virtual photography using that. That’s also something that isn’t even in Unreal Engine or Unity at all. So I guess we’re sort of a platform that’s reimagined: inspired by Unreal, inspired by 3D content creation systems like Maya, and then web-based systems like Google Docs. It’s sort of taking all those ideas and then making a system that’s perfectly tailored for the needs of e-commerce.

Alan: So maybe you can speak to actually that for a second, because I think this is something missing from all the other systems that I identified a couple of years ago. We were doing a project on WebAR, and we had an agency as a client, and then we had the client, and then we had the brands, and everybody had to approve it. And the back and forth — we couldn’t send the 3D model, so you’d end up taking five or six photographs or snapshots, sending it over, they’d say “Change this.” It was a disaster. So talk about maybe your versioning and how that works. And it’s a feature that nobody talks about, but I think it’s vital to e-commerce.

Ben: I think it’s vital, and it’s very important when you’re actually doing the project. But it’s really hard to communicate when someone’s looking to sort of license or buy our product, because the people that are buying our product — and this is about maximizing ROI — what people are looking at, they don’t usually judge it based on that impression. They only find out about its value once you’re in the project.

Alan: But it is one of those things that they really do need to think about.

Ben: Yeah, our versioning is based sort of on some developer technology — it’s not using this technology directly — but it’s inspired by something called Git. And this means that everything is versioned, and we know every change that was ever made. So we can go back to any point in time from the present. We can also make branches. So you can take the current accepted version, then you can have an artist start, say, making a bunch of fixes to a number of materials, and then they can propose that back, and then people can look at how that looks, and then accept it or not accept it. And so then we have a review system as well, where you can have people sign off on those changes. So, yeah, it’s basically based on reviews and approvals, and then branching and versioning. It sounds complex, but it actually works really well in practice, because you don’t get changes you didn’t want in the main system.
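The Git-inspired workflow Ben describes — append-only versions, branches, and a review gate on the main line — can be sketched as a toy model. This is an illustration of the idea, not Threekit’s actual data model.

```typescript
// Toy model of Git-style versioning for 3D assets: every change is a new
// immutable version, artists work on branches, and "main" only moves when
// a reviewer approves. Illustrative only.

interface Version {
  id: number;
  parent: number | null;
  changes: string; // e.g. "retuned sheen on navy-twill fabric"
}

class AssetHistory {
  private versions: Version[] = [{ id: 0, parent: null, changes: "initial import" }];
  private branches = new Map<string, number>([["main", 0]]);

  // Commit onto a branch: history is append-only, so nothing is ever lost
  // and any past state can be revisited.
  commit(branch: string, changes: string): number {
    const parent = this.branches.get(branch);
    if (parent === undefined) throw new Error(`unknown branch: ${branch}`);
    const id = this.versions.length;
    this.versions.push({ id, parent, changes });
    this.branches.set(branch, id);
    return id;
  }

  branchFrom(source: string, name: string): void {
    const head = this.branches.get(source);
    if (head === undefined) throw new Error(`unknown branch: ${source}`);
    this.branches.set(name, head);
  }

  // Approving a proposed branch fast-forwards main to it; rejecting it
  // leaves main untouched, so unwanted changes never leak in.
  review(branch: string, approved: boolean): void {
    if (approved) this.branches.set("main", this.branches.get(branch)!);
  }

  head(branch: string): number {
    return this.branches.get(branch)!;
  }
}
```

In use, an artist would `branchFrom("main", "fabric-fixes")`, commit material tweaks there, and propose the branch; `main` only advances on an approving `review` call — which is exactly why, as Ben says, the main system never picks up changes nobody signed off on.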

Alan: So I think one of the questions I have now is, what are the limits? Because it’s on the web. Obviously Unreal — I think their latest ad was “We’re pushing a billion polygons on the PS5” — so obviously PS5 or PS4 and these game consoles offer far more rendering power, with dedicated GPUs and everything. How are you then able to take this web-based 3D and provide it to Android and Apple? And of course, everybody’s got different types of phones, and there’s no– the Samsung S20’s great, but what if somebody’s got the Samsung A version, which is a lower-price model? How do you get around that? What are the kind of limits that you recommend or have? Are there limits to the size of the projects, and the detail and fidelity of these projects?

Ben: There is, depending on the options. So if someone has a very complex product and visual fidelity matters a lot — like, say, the suits, and those 500 or 600 fabrics that are all dark blue and black — that one’s really hard to do in real time, such that you can capture the subtleties of the material. So those I wouldn’t recommend you do in real time, but use the virtual photographer product, or that feature capability, because then you can spend more time and make a perfect rendering of that fabric, that captures its subtleties. And then it’s just an image on your website. But if you’re doing something that’s simpler, such as an office chair — while they do have dark blues and blacks, you don’t have to tell the difference between like 600 of them. And therefore maybe you can drop the fidelity a bit and then go for that real-time aspect. We do recommend that your models are generally under five megs in size. Smaller is always better, because it loads faster, and load time matters for e-commerce. So I guess it’s a different trade-off for every customer. But you want it to load within a couple of seconds, if not two seconds. But some of our customers are willing to wait a bit more, because maybe they’re more in the B2B space, and in the B2B space, people are willing to wait more for load time. They’re not quite as fickle as in a B2C sort of situation, where there are many different alternatives for that specific product and someone might get bored if they have to wait like four or five seconds. But if you’re configuring a workbench, you’re probably willing to wait like five seconds and then get a much more detailed model or very complex configuration setup. But it is up to the client to decide where they want their fidelity to be, based on what they’re willing to put up with in load time.
And of course, we use industry best practices — optimization of various types — to try and minimize that, too, so that we can give the best experience with the best load time.
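The trade-off Ben outlines can be caricatured in a few lines: pick real-time 3D when the asset fits the audience’s load-time budget, otherwise fall back to pre-rendered virtual photography. The B2C threshold echoes his rough "under five megs" guidance; the B2B budget is an assumption added for illustration, not an official rule.

```typescript
// Back-of-envelope decision between interactive 3D and pre-rendered images.
// Thresholds are rough numbers from the conversation (B2C) plus an assumed
// B2B budget; real deployments would tune these per client.

type Delivery = "realtime-3d" | "virtual-photography";

function chooseDelivery(assetMegabytes: number, audience: "b2c" | "b2b"): Delivery {
  // B2B shoppers configuring, say, a workbench will tolerate ~5 s loads;
  // B2C shoppers bounce after a couple of seconds, so the budget is tighter.
  const budgetMb = audience === "b2b" ? 12 : 5;
  return assetMegabytes <= budgetMb ? "realtime-3d" : "virtual-photography";
}

console.log(chooseDelivery(3, "b2c")); // realtime-3d
console.log(chooseDelivery(8, "b2c")); // virtual-photography: too heavy for a quick load
console.log(chooseDelivery(8, "b2b")); // realtime-3d: B2B tolerates longer loads
```

The point of the sketch is that the cut-off isn’t fixed: it moves with how fickle the audience is, which is exactly the fidelity-versus-load-time call Ben says each client has to make.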

Alan: So what is the largest project file size that you’ve had to work with? Or has there been a customer who said, no, I really want to push some crazy 10 megabyte file? Is it possible or…?

Ben: You *can*. It would be a horrible user experience. So, no, we don’t do that. We would push them towards image-based configuration, then. The files we use to create the suits are hundreds of megabytes in size. But we turn that into an image, and then it’s less than a meg download, probably like 100 kilobytes. And that’s the best way to do it, right? So that’s a different way of optimization. But nobody would ever choose a 100 megabyte download when you can get half a megabyte. And that half a megabyte actually will probably look better than some janky, slow, interactive WebGL where you can’t tell the difference between those 500 dark blues and blacks. Once you hit sort of the capacity limit of current technology, we then will probably push you to an alternative solution that’s still on our platform. Or you can do both. I do think that when we start doing interactive try-on, those suits will not look as good as the suits we’re rendering right now in images, but you will be able to see how it fits. So maybe it might be a combination of: let’s see how it fits on my body, and it can maybe fit to your specific proportions. But then when you want to pick the fabric, you go to an image-based configurator. It’s sort of like with AR right now. The AR that we do for furniture is not as good as our virtual photography. But if you want to see it fit in your room, you don’t have a choice. So you look at the pictures, you see how beautiful, how subtle the reflections are on that fabric, how that velvet interacts with light. But then you go into AR and you see if it fits in your room. So by using a combination of technologies, you get all the benefits.

Alan: Wow.

Ben: That’s like the best example. Wicker cannot look good in AR right now. It’s just too many details.

Alan: Huh. Who would have thought wicker? I mean, I guess this is just things you come up with as you do thousands of products. You’re like, “Oh, that one wasn’t so good.”

Ben: Well, that’s because it’s a lot of individual threads and gaps that you can see. And so you can mimic it in AR, but it just doesn’t quite look as good as you could make it in a render.

Alan: Amazing. Well, Ben, I could literally interview you all day long. I love this stuff and I love what you guys are doing at Threekit. It’s really amazing. So without blathering on too much longer, what problem or challenge in the world do you want to see solved using XR technologies?

Ben: Well, I think it’s try-on, try-on of clothes. That’s the frontier that we’re waiting for. I think AR glasses are going to be interesting, but the big one is try-on for clothes, facilitated via LiDAR and some sort of body fitting.

Alan: I’m literally, as you were talking, I’m looking on your site at a golf shirt, with a dragon on it and there’s different patterns, it’s super cool.

Ben: Yeah. AR glasses are going to be cool, but I don’t– I think they’re just going to be an accelerator. I don’t know. It actually might be very cool when you can redo everyone’s rooms in context. Advertising in AR glasses might be interesting.

Alan: Yeah, well, Unity– I don’t know if you know, Unity went public this– er, last week, and 62 percent of their revenue comes from the ad division.

Ben: No way. Really?

Alan: Yeah. Yeah. Look at their IPO filing. I had no idea.

Ben: Huh.

Alan: Yeah. It’s going to be a thing, for sure.

Ben: What do *you* think? Where do you think the market’s going? What are the most interesting things you see coming?

Alan: So in the short term, I believe that, just take away AR and all the Rs, it’s just 3D on web. The web is a ubiquitous platform that lets anybody experience it without apps or any of that. It’s just a nice connection between a brand or a company and the consumer, in a way that maybe they haven’t seen before. As more and more consumers grow up with AAA video games, I believe the experiences we deliver have to be at least that kind of photorealistic quality people are used to, at least AAA gaming. We’ve only scratched the surface of what’s possible delivering 3D on web. And of course, you’re offering AR and its unlimited photography system. I think those are the most compelling parts right now. And we’ve almost got to a point where we’re inventing new technology before selling the old technology to people. We have stuff that works now, that is immediate. Everybody, of course, wants the new new. But there’s so much value to be gained from the existing technology, just showing things in 3D on a website. So I think brands should leverage the technology we already use every day, that we maybe take for granted. Consumers want to try new stuff; 3D is a novelty right now, and it’s new. And I think brands need to really leverage it immediately.

Ben: Yes, that makes sense.

Alan: I’m looking at a Lamborghini right now. It’s awesome. Well, Ben, people can reach you at threekit.com. You guys are based in Ottawa, Canada.

Ben: We’re actually headquartered in Chicago. We also have offices in San Francisco, New York, London, and Paris.

Alan: Amazing. So you guys are a global company now, but it was started in Ottawa, wasn’t it?

Ben: Yes, it did start in Ottawa in 2005.

Alan: Canadian companies, whoo! We’re kicking ass and taking over the world! It’s awesome. Well, is there anything else you want to share with the audience before we wrap up?

Ben: No. Thank you, Alan, for the offer.

Alan: Ben, this has been so great. And I want to thank you for being on the show. And I want to thank everybody for listening. This is the XR for Business podcast, with your host, Alan Smithson. And don’t forget to hit the subscribe button, because every week we have a new episode coming out, and you really don’t want to miss amazing interviews like this one with Ben. Thank you so, so much. I wish you all the best success with Threekit in the future. And I know you guys just raised a Series A round, so I wish you all the best success. And hopefully we’ll see Threekit as an IPO story in the next few years.

Ben: Well, thank you.

Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.
