How the Metaverse is Changing Live Sports Broadcasts

Sports in the metaverse is a vertical that I think holds a lot of overlooked potential in the near term. Sports training in VR has seen multiple interesting consumer applications. But the lower-hanging fruit is live sports broadcasts in the metaverse. Not only is there huge fandom and obsession over live sports, but the sports viewing experience largely hasn’t changed over the decades.

Granted, broadcasting rights are tricky for an outside innovator to work around, which is probably why we haven't seen many new experiences here. Still, we're due for something new, and the viewership figures bear that out.

The NFL has run a few experiments, like its AWS partnership, which shows data visualizations of player speeds and play probabilities during NFL games. Last year's NFL Slimetime simulcast with Nickelodeon animated the on-field action in real time with cartoonish expressions. But nothing has really stuck or made a lasting impact.

The Netaverse that the Brooklyn Nets showcased last NBA season is still the most impressive metaverse application for live sports.

The Netaverse is a VR simulcast of Brooklyn Nets games. But it’s not just a big screen in VR. They’re literally scanning and rendering the actions of all 10 players on the court in real-time. In other words, it’s like a real version of an NBA 2k video game [allowing viewers to walk anywhere on the virtual court during the action and witness the game in an infinite number of ways]. – NFT QT

AR at the World Cup

The Netaverse was the most impressive meeting of live sports and the metaverse I'd seen, until I encountered the following in-person AR tech at the 2022 Qatar World Cup.

The FIFA+ app has an augmented reality feature that allows fans to view the likes of VAR replays and alternate camera angles. FIFA says that only people who are attending matches in person can access the FIFA+ Stadium Experience. According to a video that has gained traction, users can point their phone's camera at the pitch. An overlay will pop up that enables them to tap on a player to see things like their movement speed and individual heatmap. – Engadget

This is more than an AR filter because it changes with contextual data, like who's on the field and how their stats have changed. This gives us a glimpse of what's to come with event-wide AR visualizations. Today, the AR experience delivers player stats and motion heat maps of the action. Tomorrow, it will highlight plays as they unfold, animate great maneuvers in real time, and layer on a variety of other mesmerizing visual overlays.
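To make "contextual" concrete, here's a toy sketch of the lookup behind a tap-to-see-stats overlay. Everything below (the `Player` model, field names, values) is a hypothetical illustration of the idea, not FIFA's actual API or data.

```python
from dataclasses import dataclass

# Hypothetical data model -- FIFA+ does not expose a public API like this.
@dataclass
class Player:
    name: str
    speed_kmh: float    # latest tracked movement speed
    heatmap_zone: str   # dominant zone from positional tracking

# Live match state the overlay is keyed off (illustrative values).
# This is what makes it contextual: the data updates as the game runs.
on_pitch = {
    7: Player("Winger", 31.2, "right flank"),
    9: Player("Striker", 28.5, "central box"),
}

def overlay_for_tap(shirt_number: int) -> str:
    """Return the stat card shown when a fan taps a player in AR."""
    player = on_pitch.get(shirt_number)
    if player is None:
        return "No tracked player here"
    return f"{player.name}: {player.speed_kmh} km/h, mostly {player.heatmap_zone}"

print(overlay_for_tap(7))
```

The point is simply that the overlay is a live query against tracking data, not a static filter baked into the camera feed.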

Niantic’s Visual Positioning System is making centimeter-level location accuracy possible, which puts reactive AR visualization experiences like this for live events (sports, concerts, etc.) within reach.

The trick to delivering this high-bandwidth metaverse experience to thousands of people at once is designing a data system that processes a lot of the real-time information locally. With hundreds or thousands of people pointing their cameras around a stadium or venue, we can create a real-time 3D map of the action from all angles. But this is a lot of data to send back and forth, especially if we’re adding AR overlays too.
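A rough back-of-envelope calculation shows why processing locally matters. Every figure below is an assumption for illustration, not a measurement, but even so: sending compact, locally-extracted player detections is far cheaper than streaming raw video from every phone.

```python
# Back-of-envelope bandwidth comparison (all figures are assumptions).
PHONES = 5_000               # fans pointing cameras at the pitch
RAW_STREAM_MBPS = 8.0        # ~1080p compressed video uplink per phone
DETECTION_BYTES = 64         # one detection: player id, 3D position, timestamp
DETECTIONS_PER_FRAME = 22    # players tracked per frame
FPS = 30

# Option A: every phone streams raw video to the cloud.
raw_total_gbps = PHONES * RAW_STREAM_MBPS / 1_000

# Option B: each phone runs detection locally and sends only the results.
local_mbps_per_phone = DETECTION_BYTES * DETECTIONS_PER_FRAME * FPS * 8 / 1e6
local_total_gbps = PHONES * local_mbps_per_phone / 1_000

print(f"raw video uplink:      {raw_total_gbps:.1f} Gbps")
print(f"local detections only: {local_total_gbps:.2f} Gbps")
```

Under these assumptions, local processing cuts the aggregate uplink from roughly 40 Gbps to under 2 Gbps, more than a 20x reduction, before any AR overlay traffic is added back on top.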

Maybe developers could design a mesh network that shares this data directly between phones, alleviating some of the bandwidth pressure of serving the AR experience. I'm not entirely sure.
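To picture the mesh idea, here is a purely toy simulation of gossip-style sharing, not a real networking stack: each phone pushes its locally-processed detections to a few random peers each round, so the crowd converges on a shared view without every device making a round trip to a central server.

```python
import random

# Toy gossip simulation. Each phone starts knowing only its own
# detection and repeatedly pushes what it knows to a few random
# neighbors. This is an illustration of information spread in a
# mesh, not a real protocol.
random.seed(0)

PHONES = 50
NEIGHBORS = 3  # peers contacted by each phone per round

# phone i starts knowing only detection i
phones = [{i} for i in range(PHONES)]

rounds = 0
while any(len(p) < PHONES for p in phones):
    rounds += 1
    for i in range(PHONES):
        for j in random.sample(range(PHONES), NEIGHBORS):
            phones[j] |= phones[i]  # push local knowledge to a neighbor

print(f"all {PHONES} phones share all detections after {rounds} gossip rounds")
```

The appeal of gossip-style dissemination is that the number of rounds to full coverage grows roughly logarithmically with the number of phones, so a stadium-sized crowd converges in a handful of rounds, though real deployments would have to handle radio range, churn, and trust, which this sketch ignores.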

Producing a metaverse experience for live sports is just as much a technical problem as it is a creative problem. Nonetheless, I view it as a major metaverse onboarding point if done correctly.