Brought to you by MetaVRse

The ability to bring the sense of touch into the virtual world is the final frontier of true immersion, and some of that technology already exists. Haptics, however, can be prohibitively expensive, even for some enterprises. Gijs den Butter visits the podcast to explain how SenseGlove can bring that power to business for a fraction of the cost.

Alan: Welcome to the XR for Business podcast with your host, Alan Smithson. Today, we have a very special guest, Gijs den Butter. He is the CEO of SenseGlove. Now, if you’re not familiar with haptics, we’re going to get right into this. It’s going to be awesome. But before we get to that, I just want to say, Gijs, it’s really a pleasure to have you on the show. Welcome to the show, my friend.

Gijs: Thank you so much. Real pleasure to be here.

Alan: It’s really, really cool what you guys have built. A little while ago, I had the opportunity to try haptic gloves, and I put them on and I was able to reach out in virtual reality and grab an object and feel that object in my hand. And I can tell you, it was one of the most incredible ways to connect the physical world with the digital world. It was an amazing experience. And I’m really, really excited to have Gijs walk us through SenseGlove and what they’re doing. Not only have you built a haptic glove, but you’ve built a haptic glove that has force feedback. So when you reach out and grab something, your hand stops in the shape of whatever you’re reaching for. Just explain how you got to where you are right now. Where did this come from?

Gijs: Yeah, I think this force feedback component is indeed the crucial part of feeling in VR, because you can have haptic feedback — like vibro-motors and those kinds of things — but the moment when you’re grasping an object and you feel that there is something there that isn’t actually there, that is the key moment in what touch enables in VR. Then you can really interact in VR as you would in a normal situation. So, with this belief, we started off in 2015 from a robotics group at the Delft University of Technology, here in the Netherlands. And we tried to make a wearable that does exactly this — touch in VR — but is also affordable for every professional use case. We started with a rehabilitation use case, but we then found that rehabilitation alone was too limited a scenario. That was mainly because we were at a large business fair called the Hannover Messe, and one of our current clients, Volkswagen, came to us and said, “Well, this training of impaired people — could you also do that with healthy people, so that they can also experience feeling in VR?” That was kind of the start. We pivoted from a research group searching for where its technology could be used in VR to a company called SenseGlove. And that’s where we are today. So in 2018, we launched our first product. It’s really a development kit where researchers or R&D organizations — like the one within Volkswagen — can test, “OK, what does this component of touch add to my virtual experiences?”

Alan: How is Volkswagen using it? I mean, that’s a really, really amazing company. Volkswagen Group owns pretty much everything: Porsche, Audi, and so on.

Gijs: As followers of this podcast may know, Volkswagen is quite a progressive company when it comes to VR. There are two use cases they’re interested in. One of them is the training of assembly personnel inside a virtual environment. You can imagine, if you are about to become an assembly worker at Volkswagen, you need to assemble those cars. The first day on that line is a pretty challenging day.

Alan: I can imagine.

Gijs: Nowadays you have some basic skills, you have seen some videos. But you’re just walking onto that line, and your task needs to be completed within two minutes. So for the very complex tasks, what they did was create dummy lines, or dummy assembly tasks, where you basically do the same tasks that you do on the line, but in a more relaxed environment — there’s no stress at all. But yeah, creating that is quite costly. So they thought, “Well, we could also do these trainings in virtual reality, rather than on this dummy line.” They started with use cases like pick-and-place — or boxing — of boxes, and picking and placing of parts. But as they moved on, they also wanted to add some more complex tasks to this virtual training. But yeah, with these controllers it was quite hard: if you push a button and then something happens, that’s not really training. But when you need to connect two plastic parts together, or you need to drill inside a door panel, for example, that’s where it starts to become complicated, and that’s the task you need to train. So that’s the reason they asked, “OK, we need that next level of immersion. We need to have that component of touch in VR.” And that’s when they explored the option we were offering with our prototype at that business fair. The other side is the design process: you would like to know if, for example, the car part you have been designing can actually be produced by your manufacturing workers, or has the right ergonomics. Nowadays you just build a physical prototype, which is a very costly process, and test your ergonomics on it, for example. If you could do that in VR, with the same behavior a human has inside the virtual environment, that really saves costs for companies like Volkswagen.

Alan: I’m looking at different pictures on your website, and the website’s senseglove.com. Now, these are not fashionable gloves. You’re not going to be walking down the street wearing these things. They’re a big blue frame and you look like a little robot. But in a research or in a training scenario, who cares? What are some of the other use cases that people are thinking of? Obviously, training is an easy one. It’s a no-brainer: “I need to train somebody or check out the ergonomics.” What are some of the other use cases, maybe ones you didn’t anticipate, that people are using this for?

Gijs: A wonderful use case I really liked, one we never would have thought of, is one that Procter & Gamble did together with us, at Procter & Gamble Health. Basically, they had an idea for raising awareness of nerve ending disease, of nerve ending problems. You can compare this to, for example, Parkinson’s. They wanted to build empathy and awareness of this particular disease among general practitioners, but also among the general public. So they approached us and said, “Hey, you can create touch in VR, but can you also create touch that doesn’t exist in VR?” We said, “Yes, of course we can.” So what we did was create an environment for them where healthy people like you and I can experience what it is like to have that disease. First you just go through a scenario without any symptoms, and then you repeat that same scenario with the symptoms of this nerve damage disease. While you feel some tingling in your fingers, the resistance from the glove drops away, which causes most people to drop the object they’re holding in their hand. So you really get a feeling of what it is like to have this disease. For the general public, this was meant to be an a-ha moment: “Hey, this is a really extreme version of what I experienced today, so I might need to visit a general practitioner.” And for the GPs, it was an awareness moment of “This is actually what my patients are feeling. This is what we’re helping them with.”

[SenseGloves in motion]

Alan: I think some of the best use cases of VR and AR technology end up coming from the end users. Not the people developing the technology at all, but the people that say, “Oh, this technology is great for X, can we use it for Y and Z?”

Gijs: Yeah.

Alan: We had a gentleman from Oracle, Sikaar [Keita]. He actually built an IoT sensor connection between our MetaVRse engine and this Bluetooth IoT sensor. And then yesterday, the new AirPods: they have 3DOF controllers built in, so they have accelerometers built into them. So somebody built a link between Unity and the AirPods, so they can turn their head and it turns the 3D model on the screen. There’s a lot of cool things that you don’t think of. So let me ask you a question. This came out of TU Delft, which is a technical university in the Netherlands. Is the IP out of a lab? Is it part of the TU Delft IP library, or is this completely proprietary? Have you raised capital? Maybe talk a little bit about that, and how you’re funding this, and what your plans are for the near- and far-term future.

Gijs: Yeah, yeah. So I started this company as one of the co-founders, together with my former supervisor at TU Delft, so it’s really technology that my co-founder and I developed, although we developed it under the umbrella of TU Delft. We have a wonderful system here where you, as a company, can get the IP and the university becomes a shareholder in your company. That’s what happened with us. And when we spun out the technology, we also got a convertible loan from the university itself, so we had some starting capital to build our first initial product — which is, to me, a really good way of developing technology that comes out of university labs. With that loan, we were able to build our first prototypes and the first stage of the company. Back in 2017, we got some angel funding, and we also got a new co-founder, a more experienced entrepreneur, onto our board. And last year — so last November — we did our Series A round. So we’re a venture-capital-backed tech startup at this point, although the European venture capital landscape is quite different from the American one.

Alan: You mean you didn’t raise a $25-million Series A?

Gijs: No, no. It was not a $25-million Series A. In Europe, a Series A has to be between one and five million. That’s basically the European Series A.

Alan: Awesome. It’s so great to hear that. And you’re delivering these, right? You have these kits, these developer kits, right?

Gijs: Yeah. Indeed, as you mentioned, our current product is really a developer kit. It’s not a finished product. It’s a product that researchers and R&D organizations can already use to test, “What does touch in VR mean for us?” And we deliberately keep it at a price point that’s affordable for every business use case. At the same time, we can learn from our clients. If our clients are developing VR trainings — or developing these wonderful use cases, like Procter & Gamble did — we know what to improve. It works both ways: we get a little bit of money out of it, because we sell these devices; the industry can already test what touch means in VR; and we get information in order to build better products for the industry.

Alan: What you guys have done is you’ve basically built a system where you can touch in VR, you can touch in these virtual experiences. And what was it, $3,000 for the developer kit?

Gijs: Indeed.

Alan: Now, how much– because I’ve also tried the HaptX gloves, which must be considerably more money, because they come with hydraulics and heat and cold and so on. Yours is more the simple aspect of touch and force feedback, and I don’t know that you need all the other bells and whistles, to be honest. But having not tried both side by side, it would be interesting to see a comparison. How would you compare the SenseGlove to the HaptX in terms of performance, feature sets, and then maybe price as well?

Gijs: Where HaptX has a vision of making touch as close to real-world feeling as possible, we take the minimal components of touch needed to let you behave naturally, as you do in real life. That’s the difference: immersion, which is our vision, versus realism, which is HaptX’s vision. For us, basically, there are four different ways of digital feeling. The first way — to us the most important — is how you feel the shape and the density of an object. You do that with force feedback: the resistance on your fingers in a virtual environment. The second way is how you feel textures. That can be done by a simple vibration motor, like the one in your smartphone. With that you can create the idea of buttons, of rough and smooth surfaces. Then there is a third way, which is how you feel edges, small roundnesses, and roughnesses, those kinds of things. There’s a term for that called skin deformation: the delta in the pressure that is applied to your skin. That is what HaptX is really focusing on. HaptX has wonderful technology where they put all kinds of air pressure bubbles — or liquid pressure bubbles; I think currently it’s air pressure — on your skin and deform your skin. So the difference between our technology and HaptX’s technology is that with HaptX’s technology you can feel the edges of an object, and with ours you cannot. That is because we think this can also be compensated for by the visual aspect of the virtual environment. Sometimes people ask us, “OK, if I close my eyes, I don’t feel I’m holding an apple; I only feel I’m holding a rough, round shape.” That’s indeed what it will feel like with the SenseGlove, because we think the combination of the visual — seeing an apple — and feeling a round object will trick your brain: “Hey, I’m holding an apple.” Whereas with, for example, a HaptX setup, you would come closer to the feeling of an actual apple, rather than a rough, round shape. That’s the difference in technology. Obviously, that comes with a price point. The latest estimate I saw is that a HaptX system will cost you €30,000, up to €100,000. I’m not sure where they are at this point in the market. And the SenseGlove is a €3,000 device that you can buy, plug in, and start using the day after you buy it.
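An editor’s aside: the force feedback Gijs describes comes down to limiting each finger’s travel at the point where the virtual hand first meets the object, so the hand “stops” in the object’s shape. The sketch below is a minimal illustration of that idea in plain Python; all class and function names are hypothetical and are not part of the SenseGlove SDK.

```python
# Hedged sketch: per-finger force feedback for a virtual grasp.
# All names here are hypothetical; this is not the SenseGlove SDK.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Finger:
    flexion: float              # 0.0 = fully open, 1.0 = fully closed
    brake_engaged: bool = False

def update_force_feedback(fingers: List[Finger],
                          contact_flexion: List[Optional[float]]) -> None:
    """Engage each finger's brake at the flexion where it first touches
    the virtual object, so the hand 'stops' in the object's shape."""
    for finger, limit in zip(fingers, contact_flexion):
        if limit is not None and finger.flexion >= limit:
            finger.brake_engaged = True   # resist further closing
            finger.flexion = limit        # hold the finger at the surface
        else:
            finger.brake_engaged = False  # no contact: move freely

# Example: index and middle finger touch a virtual cube; the rest miss it.
hand = [Finger(0.20), Finger(0.50), Finger(0.60), Finger(0.30), Finger(0.10)]
contacts = [None, 0.45, 0.45, None, None]
update_force_feedback(hand, contacts)
print([(round(f.flexion, 2), f.brake_engaged) for f in hand])
```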

Alan: I actually had a really great experience with the HaptX gloves, and I’m really excited to try these SenseGloves. At a price that’s at least ten times cheaper, maybe up to a hundred times cheaper, it’s a great way for researchers to start to learn the benefits of touch within a virtual environment. I think it’s really exciting. How can somebody try these? They can go to senseglove.com and order a pair?

Gijs: If you’re indeed a researcher, or if you have the budget, the easiest way is to order it via our web shop, or send me an email. That’s the easiest way to try it out: for €3,000, you’re already getting started. For some other projects, hopefully when we can do business fairs again, we’re at most of the VR business fairs, like AWE. We’ve been to EuroHaptics lately. We were at CES. So also please keep an eye on our website for where we’ll be. And if you really have a very interesting use case and you’re doubting how essential the device is, we also have some devices that we can lend out.

Alan: You talked about Procter & Gamble, we talked about VW. Are there any other companies that have come up with a use case that made you go, “Wow, that is super cool”? What are some of the fringe use cases for this?

Gijs: The Procter & Gamble one is really cool in those terms, because you really experience something that you would hopefully never experience in your real life. We’ve had some other research projects along those lines, mostly on the side of having a disease and being able to empathize with that disease, or overcome it; rehabilitation use cases, for example. But as I’m a very big believer in augmented reality, I also would like to mention one of the use cases that a research institution in Germany, Fraunhofer, did. They combined the SenseGlove with an augmented reality setup. You still have that bulky blue exoskeleton that you see through the augmented glasses, but you can actually grasp the virtual hologram that is in front of you. In this case, it was the design of a headlight. You really feel, “Okay, I have something here.” And you really believe that the augmented headlight casing is there. That really is also cool.

Alan: So you’re in the real world, you reach out, you see a virtual object, and you can grab it and touch it. Ohhh!

Gijs: And then your imagination can go wild. So we could have this podcast and have your hologram next to me. We could high-five. We could shake hands.

Alan: Yeah, the virtual handshake.

Gijs: That’s what this technology can bring us.

Alan: This is super cool. Now, I have to ask this, because my mind is racing on it. Do you guys have plans to miniaturize the technology in any way, shape, or form, so that you’re not staring down at a big blue exoskeleton?

Gijs: Yeah, yeah. That’s definitely one of our big goals. With force feedback, you need to ground the force somewhere on the physical world, because otherwise you don’t feel any resistance. So there will always be some form of plastic in the glove system. But yeah, to be honest, we are working on miniaturizing the SenseGlove, because this is one of the biggest demands from our clients.

Alan: I just read an article about diminished reality, how they’re basically using augmented reality to erase parts of the world. So if you’re looking at something, you can actually take it out of the world in real time. I wonder if you could just put some sort of tracker, maybe a QR code, on the back of the gloves, so that as you’re using them, the system understands: whenever I see this QR code, take the glove out of the scene, so all you see is a pair of hands. Even though the gloves are physically there and everybody else can see them, when you’re wearing them all you see is a pair of digital hands.

Gijs: In VR, obviously, that is–

Alan: Oh, VR is easy, but AR is the hard part.

Gijs: In AR, that’s different. But another downside of an exoskeleton is the weight balance that you have. If you do really, really small tasks, sometimes the exoskeletons collide with each other. So there are limitations to the exoskeleton, besides the visual part of it. So, yeah, we’re really working hard to miniaturize this. I won’t go into too much detail about how we will do it, but expect that from our company in the upcoming years.

Alan: Well, I’m certainly really excited for the future of SenseGlove and haptic technology. The second I put the HaptX gloves on and had that experience, I was a full believer. And I will never forget that experience at CES, where I reached into a fire and it caught me. Those are memories burned into my brain. So I believe that using the combination of VR and haptic gloves, the SenseGloves, for training, you can actually start to give people probably 70 to 80 percent of the real-world training that they require. And the cost savings are just astronomical, when you start to realize what can be done in virtual worlds, especially for things that are dangerous and expensive. Maybe a line is running three shifts a day; you can’t train people on it, because it’s live: it’s running and never stops. So how do you train people on a line that never goes down? They have to learn as they go. This gives them an opportunity for mastery long before they ever set foot on the factory floor.

Gijs: Indeed. And I think with haptic technology, you can indeed bring that real-life realism into VR. What’s key about that is the KPI of mistakes that people make on their final exam, or on their first day of work. I think that’s also a really good way of measuring the effectiveness of your VR training. Compare, for example, the number of mistakes people make after real-life training. Compare that with the number they make after training with controllers, and then compare that with training with haptic gloves. You’ll see that training with the gloves is much more comparable to real-life training than training with controllers. A nice example here is one of our clients. They were training in VR how to replace fuses in some kind of fusebox environment. The action they needed to do in VR to replace the fuse was to pull a trigger button, and then the fuse came out of the box. But in real life, you needed to push the fuse, turn it a little bit, and then pull it out. So the people trained in VR with the controllers knew, “OK, I need to pull a trigger, so I pull the fuse,” and the first time they did it for real, they actually broke the box, because they had been trained wrongly. If you had had a haptic glove, you would really have needed to push first, then turn, and then take out the fuse, and you would have been trained in a life-like situation. So that is where haptics can force you to build the muscle memory of what the real-life situation looks like.
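Another editor’s aside: the comparison Gijs suggests (error rates on the first real task after real-life training, controller-based VR training, and glove-based VR training) can be tallied very simply. The sketch below is purely illustrative; the modality names and counts are hypothetical and are not data from the episode.

```python
# Hedged sketch: comparing training modalities by first-day error rate.
# All group names and counts below are hypothetical, for illustration only.

def error_rate(mistakes: int, attempts: int) -> float:
    """Fraction of attempts on the real task that ended in a mistake."""
    return mistakes / attempts if attempts else 0.0

# trainees' first attempts on the real task, per training modality
results = {
    "real-life training":    {"mistakes": 2,  "attempts": 40},
    "VR with controllers":   {"mistakes": 11, "attempts": 40},
    "VR with haptic gloves": {"mistakes": 3,  "attempts": 40},
}

for modality, r in results.items():
    rate = error_rate(r["mistakes"], r["attempts"])
    print(f"{modality:22s} error rate: {rate:.1%}")
```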

Alan: And that’s really what it comes down to. What’s the point of building a photorealistic VR training if you’re not going to go all the way and train it right down to what you actually need to do? Gijs, is there anything else you want to share with the community, with the podcast listeners? How can people get a hold of you?

Gijs: Yeah, the best way is to reach out via our website, via the contact forms — we have dozens — and it will eventually end up in my or my colleague’s mailbox. I’m happy to answer all your questions. But also really try to see where haptics is at this point. Because at this point in time, I think we’re at just one to ten percent of what can actually be achieved with haptic technology. It’s the new field of research in VR. We know what the visual side of VR can mean for us, and now it’s really time to also see, “OK, what is this next level of immersion?” Things like spatial audio, things like motion capture, but also haptics really is one of the key features for making VR more immersive. And I think we have a product on the market that you can already experience today. So also think together with us about where this new field will go.

Alan: I thought it was really interesting that on your website, you have a section on pricing: for example, €2,999 buys the developer kit, but for €3,999 you also get time with your developers to help build something. That was really cool.

Gijs: Yes, we have, for businesses, a package where we create your first haptic use case together. This is really for businesses that already have a VR training, for example, but are lacking that immersion of touch. We then help you actually get the SenseGlove into your virtual environment, into your Unity environment. And we see that this 40-hour development package is, most of the time, enough to create that first proof of concept for your first haptic training. For the research side, we also have something on the website called a warranty package (maybe that’s the wrong term), but it’s essentially support from our side to help you out. When you have questions about our SDK, the developers who built our SDKs are happy to help you over the phone or via some form of remote support. So, yeah, we really want to work together with our clients to create the future of haptic technology.

Alan: Well, Gijs, thank you so much. We’re so glad to have you on the show, and if anybody is interested in seeing how haptic gloves can really transform their training — and beyond, really, beyond training, it could be anything — I think adding haptics to a virtual experience really does complete the circle of immersion. And when you have that combined with spatial audio and great visuals, it really does create this full package immersion. And I think this is the future of learning, of training, and of probably marketing and sales in the future as well. So thank you, Gijs, for taking the time with us. Everybody, if you’re interested in learning more, you can go to senseglove.com. And yeah, we’ll continue to follow up with Gijs and team, and we’ll see what’s next in the world of haptics. So thank you very much, Gijs.

Gijs: Thank you so much, Alan, for having me. It was a pleasure.

Alan: My pleasure. And that is the end of the XR for Business podcast for today, with your host, Alan Smithson. Don’t forget to hit the subscribe button so you don’t miss any episodes. And if you want the transcript of this episode, we transcribe all the episodes at xrforbusiness.io. Again, thank you so much for joining us, and have a great day.

Alan: Oh, wait a second! Gijs, I have one more question. I almost forgot to ask it. What problem or challenge in the world do you want to see solved using XR technologies?

Gijs: Yes, so the problem that I would like to see solved is getting away from these 2D interactions in computing. Right now we’re typing on keyboards, we’re always face-down, looking into our smartphones. To me, that isn’t really an interaction; that’s just you immersed in a computer screen. With XR, you can finally get up again and interact with the digital environment around you, as you would if you had no idea of the existence of computing. There’s a nice metaphor for this. We once had the grandmother of one of our founders here, an older lady who had never used an iPad or a computer before. We put her in VR, we put the SenseGloves on her, and she knew how to interact with the digital environment. Compare that with expert computer users, who are used to clicking on buttons and those kinds of things: they’re way too fast at, let’s say, grabbing things, and they expect that just putting your hand next to something will trigger the grasping motion. It’s really cool to see that a grandmother can intuitively interact in this virtual environment, while the so-called expert users expect a different type of interaction. So the problem that I would like to solve — or see solved — with XR is that we get back to natural interactions in the digital environment, rather than fake 2D interactions.

Alan: I love it. It’s a really great vision. And I think as we move into spatial computing as the new normal, this will be the way we interact. And you are already seeing it with Facebook’s hand tracking, and it’s all coming. And actually, I wanted to give a quick shout out to XR Boot Camp. They’re our partners, and they just finished a master class on hand tracking and programming for hand tracking. And I think there’s a great synergy here, where they can maybe start to teach how to create interactions using the SenseGlove as well. So I’ll make an introduction.

Gijs: Definitely. Thanks.

Alan: Thank you so much, Gijs.

Gijs: And also, indeed — at the latest Facebook Connect, the new name for Oculus Connect — one of the questions they raised as well was how to interact in this new domain of spatial computing. The example given was the mouse in 2D: what is the next thing in 3D? And I think that is a really nice challenge that we can solve as a community over the upcoming years.

Alan: Absolutely. It turns out we have built in mice. We got these things called “hands.”

Gijs: [chuckles] Indeed.

Alan: Well, thank you, Gijs. I really appreciate it.

Gijs: Thank you.

Looking for more insights on XR and the future of business? Subscribe to our podcast on iTunes, Google Play, or Spotify. You can also follow us on Twitter @XRforBusiness and connect with Alan on LinkedIn.
