Tricks of the trade: Making VR projects a reality

6 Mar 2023


BearingPoint’s Carl McDermott speaks about using the SDLC approach to manage metaverse projects and his predictions around the future of VR.


The virtual reality (VR) sector has pushed forward in recent years, with the concept of the metaverse becoming more mainstream.

Meta revealed a new headset last year and will likely announce another in 2023, though the company’s VR push took a toll on its finances last year.

In Ireland, Engage XR launched its enterprise-focused metaverse last year, which lets businesses create virtual worlds to engage directly with clients.

As organisations explore the potential benefits VR can offer, tech consultancy companies such as BearingPoint are developing the technology to help make these ideas a reality.

Carl McDermott is a manager in BearingPoint’s software practice, currently specialising in VR and metaverse enablement.

Speaking to SiliconRepublic.com, McDermott said creating VR experiences on a game engine can be “very different” from the day-to-day software engineering he had been accustomed to. He said software such as the Unreal Engine requires a “lot more time, resources, head scratching and tweaking” than a regular C# app.

However, he said there are also similarities between the two when it comes to processes, and that having an object-oriented programming mindset when working with blueprints or creating functions in C++ "definitely helps in the long run".

“The software development life cycle (SDLC) is one that can be followed while creating experiences too and the team’s strong knowledge of agile methodology helps us create experiences quicker,” McDermott said.

Creating VR experiences

With the many potential applications for VR, two clients can want very different end results from their own VR experiences.

McDermott said the SDLC is followed when helping businesses create their experience. The process begins with "gathering requirements", along with obtaining or creating 3D models and finding out what functionality the client wants, "such as avatars, interaction with objects or products, NFT integration or process workflow training".

“Completing the requirements gathering stage sets us and the client for success and we can continue with the development phase to bring it to life and continuous, functional, user testing with the client to ensure their vision is coming to life,” McDermott said. “Finally we would then perform any maintenance required for changing assets or adding functionality at a later stage.”

A key example of this process being put into action is the VR experience BearingPoint has been developing with Leinster Rugby, to help get the sporting group ready for the metaverse.

McDermott said Leinster Rugby's requirements were "clear to begin with", which in simple terms meant taking a model of the Aviva that the club provided and allowing them to access it in VR so they could host virtual press conferences.

He said the project’s requirements were split into “manageable chunks”, such as refining the 3D model, preparing the environment’s lighting, creating a multiplayer system and allowing speech between participants.

“This approach allowed us to split the experience into different areas that each developer could work on, and then bring them all together in a single map,” McDermott said.

While the process can be made manageable through proper planning, McDermott said various challenges can still arise during development.

He said it can be “daunting and cumbersome” to work with new 3D models, while certain requirements such as networking and multiplayer can be tougher than expected “when experiences are expensive on resources”.

McDermott also said speech and audio can be “extremely resource intensive”, which is an issue that grows “very quickly” as new players are added to a virtual experience.

“Using a central git repository like Azure DevOps can be difficult if you are primarily using Unreal Engine’s declarative blueprints due to its proprietary nature,” McDermott said.
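The difficulty McDermott describes stems from blueprint assets (.uasset files) being opaque binaries that git cannot diff or merge line by line. A common workaround on Unreal projects, sketched below as an assumption rather than a description of BearingPoint's actual setup, is a .gitattributes file that routes these assets through Git LFS so the repository stays lightweight and conflicts are avoided by convention:

```
# Hypothetical .gitattributes for an Unreal Engine project.
# Blueprint (.uasset) and map (.umap) files are binary, so they are
# stored via Git LFS rather than in the main object database, and
# marked so git never attempts a text merge on them.
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap filter=lfs diff=lfs merge=lfs -text
```

Teams often pair a configuration like this with file locking or an agreed ownership split per asset, since two developers' edits to the same blueprint cannot be merged automatically.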

The future of VR

While McDermott doesn’t believe everyone will have “a pair of headsets glued to their faces in the next few years”, he does think workplaces will have “some level of consistent interaction” over the next five years, due to the various developments taking place in the VR sector.

McDermott predicts rapid innovation in the sector, with companies such as Meta releasing new headset models each year. He said these innovations will likely lead to smaller devices "with higher processing and graphical power".

“I think the rise of XR will be more apparent and focused in the next three to five years with VR being a part of that XR experience driving true innovation for people and companies.”

McDermott also expressed excitement around generative AI and the potential applications it presents from both a business and VR perspective.

“3D models are extremely time-consuming to create and having a well-established 3D generative AI would be pretty cool,” McDermott said. “We’ve also started thinking about using a natural language model as an AI character in our VR experiences to help people in a natural way if they have questions while in there.”


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com