VR and AR are emerging technologies, slowly weaving themselves into the cultural conversation.

However, in the ways that brands and people consider the integration of technology into their everyday lives, London-based tech firm M-XR (formerly Mimic-XR) has intently surveyed the arena and believes the industry has yet to fully mature. Employing their forward-thinking prowess, M-XR is using machine learning and 3D capture technology to import the real world into the digital realm, developing inventive experiences to suit the needs of companies and creators alike. Speaking recently at Samsung's Design Unfolded experience, where the studio debuted its immersive project with BYBORRE, HYPEBEAST caught up with founder Elliott Round to discuss how he and his creative partner established the studio to change how people will interact with the 3D world in the near future.

Can you give some background behind what M-XR is?

M-XR is really interested in the 3D space, so VR and AR — also anything 3D from live visual effects, films and games. We’re really focused on how can we make this industry more user-friendly, in terms of the creation process. At the moment, it’s very tedious: It’s a lot of manual work. And as a result of that, the creative tends to get forgotten or the budget’s so high. So, we’re really interested in how can you use tech and specifically AI to build tools that can almost automate a lot [of] this process to then empower the creator.

How did you find your way into working with VR and AR?

I was working in the film industry, [and] we moved into doing 360 and that was quite fun. But, you couldn’t really interact with it. That’s when I started playing around with a program called Unity. We started with some interactive work and you could pick stuff up, you could interact with the characters. It stopped being film and it was more like you [were] transferred into space and environment. And that was really exciting all of a sudden, but I was never [into] 3D games until that point; I thought it was a bit geeky, and I was like, “Okay, you can do some cool stuff with this like live music, events, exhibitions — all that kind of stuff.”

But, the problem was [that] it took so long to do anything, and, even then, it still looked a bit crappy. So I took a step back, and I was like, "If this really wants to pick up, how can we move it forward?" The thing that seemed to be the biggest problem was how you would create the content that goes into these worlds. I started looking into photogrammetry, where you just take a bunch of pictures and make some models. But the problem is, it isn't real because it doesn't capture the textures and materials, so suede, leather or plastic wouldn't react to light differently at all.
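For illustration only (this is not M-XR's actual pipeline), the limitation Round describes can be sketched with a toy physically based material model: each surface carries parameters such as albedo and roughness, and the same light then produces a different response per material. A bare photogrammetry mesh lacks these parameters, so every surface looks the same under light. All parameter values and the `shade` helper below are invented for the example.

```python
import math

# Toy material records: albedo plus parameters controlling how the
# surface reflects light. Values are illustrative, not measured data.
MATERIALS = {
    "suede":   {"albedo": 0.45, "roughness": 0.95, "specular": 0.02},
    "leather": {"albedo": 0.35, "roughness": 0.55, "specular": 0.04},
    "plastic": {"albedo": 0.50, "roughness": 0.15, "specular": 0.05},
}

def shade(material, cos_light, cos_half):
    """Very simplified diffuse + Blinn-Phong specular response."""
    diffuse = material["albedo"] * max(cos_light, 0.0)
    # Rougher surfaces get a broader, dimmer highlight (lower exponent).
    shininess = 2.0 / max(material["roughness"] ** 2, 1e-4) - 2.0
    specular = material["specular"] * max(cos_half, 0.0) ** max(shininess, 0.0)
    return diffuse + specular

# Same light, same geometry: each material responds differently.
for name, mat in MATERIALS.items():
    print(name, round(shade(mat, cos_light=0.8, cos_half=0.98), 4))
```

With identical lighting and viewing angles, the three materials return three distinct brightness values, which is exactly the information a plain photogrammetry scan throws away.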

So that’s when Ryan and I founded M-XR, originally called Mimic. And we started looking into how we could actually acquire these materials, so that when we do a scan, we have all the properties of how it reacts to light. The technology has been progressing and progressing over the past couple of years. And now we can really capture a whole variety of material properties when we scan an object and put it into a digital environment, and if we light it the same way as the real object, they’re indistinguishable. […]
