The Metaverse Doesn’t Exist Yet — So How Do You Develop For It?

Experienced developers know the situation too well: an executive comes by with an idea, and it’s up to us to act on it. If a C-level has recently dropped by your office to say, “we ought to be looking into this metaverse thing,” you’re in for an adventure. Bringing a company up to speed when the goal is to unite all the 3D visualizations that could exist on the web is something your manager will call a “challenge” and the executive will call an “opportunity.” How do you develop something like that?

Start by recognizing that you aren’t alone. That same C-level has probably just come from a design group at your company that’s now busy looking into metaverse design standards and what they need to know to design 3D environments. They’ve probably already read David Truog’s Ten Principles For Designing the Metaverse. Those principles extend to software development.

Fusion: Skip Mixed Reality For Now

You’ve got a choice when developing 3D virtual environments. At one end of the spectrum is traditional virtual reality (VR), where you implement everything. That makes reproducing defects easier, since the world is yours to control. You’ll have to build an entire world, though, with everything that a user might see, hear and interact with.

Another choice is augmented reality (AR), where virtual objects overlay the physical world. Using the physical world as a base means less world-building for you, but makes testing much harder — if your office table is rectangular and made of metal, while a user’s table is square and made of laminate, you may not be able to reproduce a defect. AR is likely to be implemented on a phone rather than a head-mounted display, using a toolchain you’re already at least partly familiar with.

The third choice is mixed reality, a combination of VR and AR where physical and virtual objects interact seamlessly. Don’t go there — at least, not for your first project.

Empiricism: Get Cool Toys, Make New Friends, And Banish Nausea

There’s no way around it: if you’re developing for head-mounted 3D virtual environments, you need a head-mounted display. So do all the other developers on your team. What looks right projected onto a 2D emulator sticks out — literally — in 3D. Now is the time, with C-level endorsement, to get the hardware you need to build an environment that works for everyone.

And that does mean everyone. Your implementation of reality must work with all kinds of people. You don’t want to be known as the company that released a project that worked only for light-skinned people, or only for men. It’s vital to have different kinds of people testing your environment.

People respond differently to extended reality as well; not everyone has the same tolerance for it. Navigating in extended reality can cause motion sickness, especially when hardware latency is high. With a head-mounted device, you’ll have to refresh your display within 20 milliseconds of your user’s head turning, or risk literally nauseating your users. Now is the time to figure out how to avoid that — before your C-level puts the headset on.
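To make that 20-millisecond budget concrete, it helps to do the arithmetic on where each frame's time goes. The sketch below is a back-of-the-envelope budget check, not a real profiler: the per-stage timings are illustrative assumptions, and only the 20 ms ceiling comes from the text above.

```python
# Back-of-the-envelope motion-to-photon latency budget check.
# The 20 ms ceiling is the figure discussed above; the per-stage
# estimates below are hypothetical assumptions, not measurements.

MOTION_TO_PHOTON_BUDGET_MS = 20.0

# Hypothetical per-frame pipeline stages (milliseconds).
pipeline_ms = {
    "head tracking": 2.0,
    "simulation/update": 3.0,
    "render (CPU + GPU)": 9.0,
    "compositor + display scan-out": 4.0,
}

total_ms = sum(pipeline_ms.values())
headroom_ms = MOTION_TO_PHOTON_BUDGET_MS - total_ms

print(f"total latency: {total_ms:.1f} ms, headroom: {headroom_ms:.1f} ms")

if headroom_ms < 0:
    print("over budget: expect motion-sickness complaints")
else:
    # Delivering a fresh frame every 20 ms implies a display rate of
    # at least 1000 / 20 = 50 Hz; shipping headsets target 72-120 Hz.
    min_refresh_hz = 1000.0 / MOTION_TO_PHOTON_BUDGET_MS
    print(f"minimum refresh rate to meet budget: {min_refresh_hz:.0f} Hz")
```

Running a check like this early — with your own measured stage timings substituted in — tells you how much room the renderer actually has before comfort suffers.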

Start Now To Stay Ahead

Know going into this that you’re likely to make architectural choices that won’t survive into the metaverse that eventually emerges. However, you’ll be building the know-how in tools, frameworks and processes that gives you a head start on the companies that are waiting to see what shakes out.

If you’re interested in software development in today’s precursors to the metaverse, read the report Building The Beginnings Of The Metaverse.

If your company has had experience creating or using frameworks to help software developers get a head start in building what will be the metaverse, reach out and let’s talk!
