Miami University, an Ohio-based liberal arts college and public research institution, recently expanded its facilities. The McVey Data Science Building opened in March 2024, housing the Department of Emerging Technology in Business & Design (ETBD). Among the new building's facilities is a state-of-the-art Immersive and Reactive Lab and XR Stage, home to an impressive 50 by 70-foot stage and a 44 by 16-foot ROE LED wall. The integration of advanced technologies, including Stage Precision software, has helped establish the new lab as a hub for exploring the boundaries of virtual production (VP) and extended reality (XR) experiences.
“We have around 70 students enrolled in courses here for VP and XR,” explains Benjamin Nicholson, assistant teaching professor and Immersive and Reactive Lab and XR Stage Director at Miami University. He continues, “The groups are learning everything from motion design to creating visuals for live music. They utilize immersive and reactive tools such as Notch, TouchDesigner and Unreal Engine to build virtual production stages; this is also where they are learning how to use SP from Stage Precision.”
Nicholson Praises Unified Workflow of Stage Precision in Immersive Lab
Nicholson has been an advocate of Stage Precision workflows for several years, having discovered the software during previous work on live events projects. He says, “In the industry, people are talking about Stage Precision and the things that can be achieved through the unified workflow it provides. In the context of the lab, SP allows us to take the control and management out of several individual pieces of native software and hardware and put them all into a single interface that can be used for calibration and control.”
The environment at the Immersive and Reactive Lab includes a ROE LED wall, Disguise media servers, nDisplay workflows and other media pipelines for two cameras. The main camera is a RED Komodo with six Zeiss prime lenses and a Canon zoom lens. “The most pivotal thing about SP is its lens calibration features. We built lens profiles in SP which take the data input from RedSpy for optical tracking,” explains Nicholson. Camera two offers another good example of how easily different systems integrate into SP.
“Camera two is an interesting use case of Mars tracking, so a completely different tracking system. Thanks to the agnostic nature of Stage Precision, we can integrate cameras one and two, two very different tracking inputs, into our SP workflow.”
Miami University Pioneers Creative Production Training with Stage Precision
When it comes to training the next generation of creative production professionals, the facilities at Miami University are pioneering a new way of working that leverages the freedom and flexibility made possible in SP. “For the learners that can already understand the significance of a tool like Stage Precision, they’re enthusiastic about using it in different ways,” says Nicholson. “It’s an advanced program, but training in this from the start will give students a high level of knowledge and understanding that they can use in the real world.”
Currently, students at the lab are experimenting with a TouchDesigner VP workflow, using SP as the hub for feeding lens and tracking data into the workflow. “At the end of the semester, the students will do a VP demo using this set-up,” says Nicholson. “My Capstone Classes are already using Stage Precision on real-life projects with actual clients.”
The versatility gained through having a single source of truth in SP sets the new facilities at Miami University apart. “With SP we can run several parallel systems at any one time. We can run a Disguise virtual production, set up a TouchDesigner system or anything else, run them at the same time and switch between them,” remarks Nicholson. “We have multiple users who can change the SP interface from up to 15 different computers. What Stage Precision is doing is providing a single point of tracking distribution to all the different media servers at once.”
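The pattern Nicholson describes, one hub fanning tracking data out to several media servers at once, can be illustrated with a generic sketch. This is not Stage Precision's actual API or protocol; the endpoints, packet layout and port numbers below are hypothetical, standing in for whatever hosts (Disguise, nDisplay machines, a TouchDesigner system) a real stage would feed.

```python
import socket
import struct

# Hypothetical media server endpoints; a real rig would list its
# Disguise, nDisplay or TouchDesigner hosts here.
MEDIA_SERVERS = [("127.0.0.1", 9101), ("127.0.0.1", 9102)]

def pack_tracking(x, y, z, pan, tilt, roll):
    """Pack a 6-DoF camera pose into a simple binary layout:
    six little-endian 32-bit floats. (Illustrative format only.)"""
    return struct.pack("<6f", x, y, z, pan, tilt, roll)

def distribute(sample, endpoints=MEDIA_SERVERS):
    """Send one tracking sample to every endpoint over UDP, so all
    media servers receive the same single source of truth."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in endpoints:
            sock.sendto(sample, addr)
    finally:
        sock.close()
```

The point of the sketch is the fan-out: every downstream system reads the same pose from one distribution point, rather than each media server calibrating against its own tracking feed.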
The lab is one of the first VP and XR university facilities to integrate SP into the heart of its curriculum, and Nicholson believes that learning these skills will benefit students after graduation and throughout their careers. “SP has removed the need for technical calibration of a single source and allows students to learn the process outside of a native software package, making it a helpful foundational knowledge base that teaches skills that are transferable across different software and technologies.”
A close relationship with the Stage Precision team has also provided ongoing support in the integration of an SP-based workflow at the lab. During the onboarding process, Axel and Tony from the Stage Precision team visited the lab to help configure the workflow that would most benefit the projects and learners at the university.
“SP allows us to have visibility over things that may or may not be happening in the background network, whether it’s time code, tracking or any sort of calibrations,” concludes Nicholson. “Combined with the ability to build custom interfaces so you have complete control over your space and your show, there’s nothing else on the market that can do that. With SP, our systems feel manageable, stable and controllable.”