Re-Defining Live: Mixed Reality & The Music Industry

January 30, 2023

Have you ever wondered why, for the past 100 years, the standard pop song has sat around the 3-4 minute mark? Or why Eurovision imposes a strict three-minute limit on all song entries?

What might seem like an arbitrary standard is really the result of early technical recording limitations. A 78 rpm phonograph record could fit 3-5 minutes of music per side, with audio fidelity degrading the closer a recording got to that 5-minute mark. Throw a second song, an instrumental version, or a dance mix on the other side and voila, you have the single. When 33⅓ rpm long-play vinyl allowed for longer recordings, the 3-5 minute per-song standard persisted, and an entirely new concept emerged: the album.

When vinyl ruled, so did vinyl sales. Artists have historically relied on two revenue streams: physical sales and touring. As audio formats changed and became cheaper, album sales tapered off until streaming permanently changed the landscape of the music industry. Even today, a mainstream artist like Beyoncé earns around 88% of her revenue from concert sales alone. More recently, video streams have begun replacing physical sales, with artists like Drake earning upwards of 35% of their revenue from advertisements tied to their music video streams.

The takeaway here is that the format in which music is recorded, distributed, and consumed has always had a reciprocal relationship with how it is created and enjoyed. Even today the phenomenon persists, with streaming algorithms and TikTok dictating song structure, and virtual performances replacing live touring.

Where our interest lies, and what this article will explore, is the use of immersive technologies in music: specifically, how they have radically changed (and will continue to change) the way we consume music and interact with artists.

Virtual concerts hit the mainstream

The term ‘live performance’ has been somewhat redefined over the past two decades. While it wasn’t until the pandemic that most music consumers first heard the phrase ‘virtual concert’, the origins of the idea can be traced as far back as 1998, when K-pop behemoth SM Entertainment attempted to create a holographic performance of its band H.O.T.

A little over a decade later, the company succeeded in that goal by broadcasting a holographic performance by supergroup Girls’ Generation to thousands of fans in Seoul’s Gangnam district. The West soon followed with a controversial performance of its own: in 2012, a holographic Tupac Shakur appeared alongside a real-life Dr. Dre at Coachella.

From 2013 onwards, the ‘virtual concert’ was developed further by the Korean music industry, as well as by artists across other genres who took to popular online games like Minecraft, Second Life, and Roblox to perform live DJ sets and shows. Epic Games’ Fortnite (built in Unreal Engine) has hosted some of the most memorable video game concerts, with Travis Scott’s virtual ‘Astronomical’ event reaching over 27.7 million unique players across five performances. Driven by lockdown conditions around the world, the past few years saw the likes of Wave XR collaborate on virtual concerts on social platforms and in gaming worlds with John Legend, Justin Bieber, The Weeknd, and others. However, it wasn’t until ABBA announced their Voyage series of physical avatar concerts that the rest of the world caught up with the hype.

In the past few years, Dimension has worked with artists like Madonna, Childish Gambino, Michael Bublé, Coldplay, and BTS to help usher in the technology that allows these performances to happen. Back in 2016, our team collaborated on the award-winning Childish Gambino AR performance PHAROS. The work involved building a virtual world that was rendered in real time in-engine and displayed as 360-degree CGI projections on the walls of the dome-shaped venue. In short, it was an AR experience designed to elevate the narrative of the live performance.

In recent years, we’ve used our volumetric capture technology and Unreal Engine’s MetaHuman for virtual performances. The Coldplay x BTS video for their chart-topping “My Universe” featured holograms of the artists, seamlessly blending the two bands for an interplanetary experience. The volumetric captures of the artists were later used in a live broadcast of ‘The Voice’, bringing mixed reality to TVs around the world through real-time technologies and garnering over 250 million views. In much the same way, we used volumetric capture to bring Madonna’s career-spanning performance at the 2019 Billboard Music Awards to life, with multiple virtual humans appearing alongside the pop star.

Last year, K-pop’s largest export, BTS, grossed an astounding 90 million USD across three days of virtual concerts. Streamed both online and in movie theatres around the world, the shows reached 2.5 million fans in over 190 countries.

Whether it’s creating virtual humans or using virtual production techniques to build and manipulate worlds for live performance, the ability to use this technology in real time opens an incredible window of opportunity for the music industry. Virtual performances can’t sell out, can be repeated daily with minimal overhead, and give artists and management more control and creative freedom to tailor the live concert experience.

I Want My (Virtual) MTV

What the phonograph record did to music in the first half of the twentieth century, MTV did to the latter half. The advent of music videos radically changed both the consumption and the marketing of musical acts. Previously, the cover art was a record’s only visual accessory. With music videos, artists could define themselves visually as well as musically, and with this new medium of communication came everything from celebrity cameos to product placement and viral dance moves.

Sticking with our theme, what role do immersive technologies play in the creation of music videos? To answer this, we’ll offer up a few recent examples from across the industry.

On May 8th, 2022, Kendrick Lamar surprised everyone by dropping a new single. ‘The Heart Part 5’ was a dramatic return for an artist coming off a four-year hiatus. What stood out to most listeners, however, was the music video. In it, Kendrick stands alone against a red backdrop and, over the course of the five-minute video, his face morphs into those of various recognizable figures, from O.J. Simpson to Kanye West. While synthetic or 'deepfake' technology is common nowadays, seeing it used to elevate a piece of art without overshadowing the art itself was an incredible triumph.

In a slightly more uncanny use of mixed reality technologies in music videos, we turn again to South Korea and the use of emerging technologies by K-pop bands. By utilising a mix of synthetic media, artificial intelligence, and avatar creation methods, groups like Pulse9’s Eternity can sing, dance, and act, their virtual faces digitally plastered onto anonymous, interchangeable bodies.

Finally, we’d be remiss to discuss mixed reality in music videos and not bring up Snoop Dogg’s ‘Crip Ya Enthusiasm’, which features Unreal Engine’s MetaHuman technology.

So what about us? While we’ve already highlighted some of our virtual human work with Coldplay and BTS, there’s a more recent example that truly showcases what’s possible with mixed reality technology in music videos: Sad Night Dynamite’s ‘Black & White’.

Dimension worked with Blink and SND to create virtual humans of both band members. While the capture technique was identical to the one we used to record Coldplay, director Lucas Hrubizna took the assets in a completely different direction and, in tandem with other mixed reality techniques, created a wholly unique and innovative piece of visual art. Fortunately, we’re not the only ones who think so: the music video swept the awards circuit in 2022, winning at the UKMVAs, Ciclope Festival, and The 1.4 Awards.

The Virtual Sky’s the Limit

It’s difficult to predict what creative doors virtual production and virtual humans will open for artists because, unlike other technical innovations, they offer nearly limitless possibilities that extend well beyond virtual concerts and music videos. In 2020, Dimension used volumetric capture of Sam Smith to produce a holographic AR projection of the singer for the first 3D album artwork on Spotify. In 2018, we created a similar AR experience using volumetric capture of Canadian singer Michael Bublé so that fans could take selfies and videos with the virtual star. Both are prime examples of the technology being used for contemporary fan engagement.

Mixed reality is here to stay; that much is clear. What interests us most is how far artists can take the technology, pushing creative boundaries to redefine the musical experience. Dimension has all the resources and expertise needed to usher in this new generation of virtual production and virtual humans in music. Get in touch or stay tuned to find out more.

Check out more of Dimension's projects in virtual worlds, virtual humans, and virtual production.