RIOT Games

Munich // 2025

Stage Precision’s groundbreaking AR tracking software deployed by XR Studios & Riot Games

OVERVIEW
Stage Precision’s industry-leading software first came to the attention of XR Studios’ Chief Technology Officer, Scott Millar, when he met the team at a trade show in the US in 2019. Although Millar had been working on XR and AR projects of his own since 2018, it wasn’t until the pandemic hit that XR Studios was formed and his work in the area was truly given the time and focus it required to flourish.

In the intense couple of years that followed, Millar contributed to a number of groundbreaking events, including the world’s first live XR eSports event, the HP Omen Challenge; Katy Perry’s trailblazing, XR-centric live performance on American Idol; and Billie Eilish’s WHERE DO WE GO? live-streamed virtual production. In short, millions around the world have been watching what XR Studios does.

In September 2020, XR Studios was tasked with undertaking some AR work for Riot Games and its VALORANT Champions Tour. Although Millar had key experience with other systems, he knew that Stage Precision was working on a feature that could make augmented reality tracking as accessible as he needed it to be. The game, as it were, was about to change.

PROBLEM
XR Studios was required to run all video screens at the event, ideally from a central control point, with the ability to trigger AR effects and pull live API data from the game. The team also needed software that would allow them to trigger Unreal Engine, Notch, and their disguise servers at various points during the show, and to do so with the level of reliability required of such a complex and high-profile broadcast. For XR Studios, ease of use and calibration speed were also vital to ensure the show could be delivered as quickly and confidently as possible.

SOLUTION
XR Studios selected Bright to be its European partner for the Riot Games event and used Stage Precision’s Shield product to run the shows themselves, even paying for custom coding that would help Millar and his team integrate with VALORANT’s Unreal Engine.

The team chose SP primarily for its extensive range of inputs and outputs and its impressive tracking features, though its capacity to act as a central core for the whole production would prove just as valuable.

“We’d looked at other solutions, but I think SP’s Shield allowed us to get closest to pure Unreal with a huge level of control on top,” said Millar. “Based on price, performance, and ease of use, I think it was the perfect choice for such a scalable and evolving workflow. Of course, it was risky because it was a new piece of software, but we got to grips with it very easily. We were impressed from the off.”

The triggering for the VALORANT event became so intuitive that, according to Millar, the system seemed almost ‘sentient’, able to run parts of the show on its own.

He added: “There are a huge number of features in Shield, but we’ve been chatting with SP to see if they can build even more to help with speeding the whole process up, particularly in terms of camera and lens calibration. SP have got such a good grasp of these kinds of problems, and they always know how to solve them.”