XR Studio

Virtual Production at Emerson

Introducing the Emerging Media Lab. Hello everyone. You may be wondering why I’m inside of this post-apocalyptic junkyard. Well, my friends, to answer that question, I’m not. In fact, I’m currently in front of this LED wall within the extended reality studio here at Emerson College. Welcome, everyone, to the Emerging Media Lab. Hello, I’m Eugene Kuznetsov, and I’m the manager of the Emerging Media Lab.

What is the Emerging Media Lab? The Emerging Media Lab is a multimedia creation facility dedicated to supporting the character design, computer animation, game design and development, interactive media, visual scripting, extended reality, and virtual production curricula. The space that we are currently in, as mentioned, is the Extended Reality Studio here in Ansin 309. As of this past summer, this facility contains an LED wall, a camera jib carrying a RED KOMODO-X camera with a Canon servo lens, and a stYpe RedSpy camera tracking system. We have also installed a lighting grid containing six DMX-controllable lighting fixtures, as well as two quad LED panel lights.

How does virtual filmmaking work? Traditionally, if you wanted to place someone in another world, you would film them in front of a green screen or blue screen, then chroma key the footage in post by removing that color and bringing in the environment as a composite once you’re ready to edit. With this cutting-edge technology, however, we are able to do all of that in camera instead. So here on this wall we have this image of a subway, but in reality, it’s not a static image. It’s in fact a real-time rendered environment produced in a game engine such as Unreal Engine. When I’m moving this camera around, this box on the wall, known as the frustum, follows where the camera is looking, and of course, as I’m lifting and lowering this camera, it produces this 3D parallax effect.
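
The parallax effect described above comes from continuously re-rendering the wall from the tracked camera's point of view. As a rough sketch of the underlying geometry (not the studio's actual pipeline — this assumes, for simplicity, a camera looking straight at a wall plane at z = 0, whereas real systems handle arbitrary orientation), the rectangle of wall the camera sees can be computed from its tracked position and lens field of view:

```python
import math

def inner_frustum_on_wall(cam_pos, h_fov_deg, aspect):
    """Project a wall-facing camera's view frustum onto the LED wall plane (z = 0).

    cam_pos is the tracked (x, y, z) position with z = distance to the wall;
    h_fov_deg is the lens's horizontal field of view; aspect is width/height.
    Returns (center_x, center_y, width, height) in wall units.
    """
    cx, cy, cz = cam_pos
    half_w = cz * math.tan(math.radians(h_fov_deg) / 2.0)
    half_h = half_w / aspect
    return cx, cy, 2.0 * half_w, 2.0 * half_h
```

Moving the camera shifts the rectangle's center (the parallax seen on the wall), and zooming in (a narrower field of view) shrinks it, which is exactly the behavior described in the demonstration.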
And that is because this physical camera is being tracked by the stYpe RedSpy camera tracking system. This little can on top of the jib is emitting infrared light that hits these retroreflective markers on the ceiling. That infrared light is reflected back into the central lens of the camera tracking system, and the system looks for patterns, or constellations, in order to identify where the camera is positioned within the studio. But the system not only tracks position; it also keeps track of various attributes of the camera, such as zoom. So as I’m zooming in, this frustum adjusts accordingly, and it represents what the camera sees. And so not only can I affect what’s on the wall, I can also focus in on both the subjects and on virtual elements, such as this hole. Right now I can pull focus on myself, but I can also pull focus on this virtual element and treat it as if it is all one environment. So we’re unifying reality and virtual reality to produce this real-time illusion that we’re in another world. As I’m zooming in on myself, and then rolling the focus in just a tiny smidge, you may notice that the bounds of what’s being rendered on the wall are also increasing and decreasing accordingly, again all driven by the camera tracking system on top of this rig. So this is what’s known as the frustum: in order to save on computational power, we only render what’s really needed up on the wall.

How do students learn cutting-edge workflows at Emerson College? Students learn various cutting-edge workflows in constructing these virtual sets within game engines such as Unreal Engine.
And so the Emerging Media Lab contains both the studio space, where they can film within those virtual environments, and the lab space, where they are provided with a number of Windows workstation computers, all equipped with Wacom drawing tablets. There they’re able to conceptualize, design, create 3D assets, and build environments that they can then bring in here to prototype and film with, across a variety of different projects and use cases. Workflows and technologies such as these have been used in productions such as The Mandalorian and Fallout, and students have hands-on opportunities to learn these technologies by taking classes such as Virtual Production or by seeking on-campus employment as a lab assistant here within the Emerging Media Lab.

Emerson’s XR Studio is a virtual production facility located in the Emerging Media Lab on the third floor of the Ansin Building. The studio features an LED wall, a real-time tracked camera rig, DMX-controllable lighting, and a dedicated server room powered by disguise media and render servers running Unreal Engine. It supports in-camera visual effects (ICVFX), volumetric video capture, virtual reality development, and real-time AI-driven workflows, giving students access to the same technology used on professional film and television sets worldwide.
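
DMX-controllable fixtures like the ones mentioned above are commonly driven over Art-Net, a UDP-based transport for DMX512 data. As an illustrative sketch of the wire format (the studio's actual lighting control stack is not documented here, and this is not a full Art-Net implementation), an ArtDMX packet can be assembled with nothing but the standard library:

```python
import struct

def artdmx_packet(universe, data, sequence=0, physical=0):
    """Build a raw ArtDMX (Art-Net) packet carrying up to 512 DMX channels.

    universe is the 15-bit port address; data is the raw DMX channel data,
    padded here to an even length as the spec requires.
    """
    if len(data) % 2:                       # spec: channel count must be even
        data = bytes(data) + b"\x00"
    pkt = b"Art-Net\x00"                    # 8-byte ID string
    pkt += struct.pack("<H", 0x5000)        # OpDmx opcode, little-endian
    pkt += struct.pack(">H", 14)            # protocol revision, big-endian
    pkt += bytes([sequence & 0xFF, physical & 0xFF])
    pkt += struct.pack("<H", universe)      # SubUni + Net, little-endian
    pkt += struct.pack(">H", len(data))     # channel count, big-endian
    return pkt + bytes(data)
```

Sending such a packet to a lighting node's IP on UDP port 6454 (e.g. `sock.sendto(artdmx_packet(0, bytes(512)), (node_ip, 6454))`) would set all 512 channels of universe 0 to zero.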

Capabilities

  • In-Camera Visual Effects (ICVFX) — Real-time rendered environments displayed on the LED wall replace traditional green screen, providing natural lighting and reflections captured directly in camera.
  • Virtual Production with Unreal Engine — Students build 3D environments in Unreal Engine that are rendered in real time onto the LED wall using disguise media servers, with full camera tracking via stYpe RedSpy.
  • Volumetric Video Capture — Using Azure Kinect depth sensors, the studio captures three-dimensional performances that can be placed into virtual environments.
  • AI-Driven Workflows — Real-time Stable Diffusion integration allows generative AI to transform camera feeds live, enabling experimental filmmaking at the intersection of AI and virtual production.
  • Virtual Reality Development — Dedicated VR suites (Ansin 310 and 311) with HTC Vive and Meta Quest headsets support immersive content creation and room-scale experiences.
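
At its core, the volumetric capture listed above back-projects each depth pixel through the sensor's pinhole intrinsics into a 3D point. A minimal sketch, assuming a row-major depth image in millimetres (as Azure Kinect depth sensors produce) and hypothetical intrinsic values:

```python
def depth_to_points(depth, fx, fy, cx, cy, scale=0.001):
    """Back-project a depth image into a 3D point cloud.

    depth is a row-major list of rows of raw depth values; fx, fy, cx, cy
    are pinhole intrinsics in pixels; scale converts raw units to metres
    (0.001 for millimetre depth). Returns a list of (x, y, z) points,
    skipping invalid (zero) readings.
    """
    points = []
    for v, row in enumerate(depth):
        for u, d in enumerate(row):
            if d == 0:                 # 0 = no reading on most depth sensors
                continue
            z = d * scale
            x = (u - cx) * z / fx      # standard pinhole back-projection
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

A real pipeline would pull intrinsics from the sensor's calibration and fuse multiple sensors into one cloud; this sketch only shows the per-pixel geometry.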

The Emerging Media Lab

  • Ansin 309 (XR Studio) — LED wall, tracked camera rig, lighting grid, volumetric capture. Can operate as a standalone production space or extend the adjacent lab.
  • Ansin 312 (EML Lab) — 16 Windows workstations with Wacom Cintiq Pro 24″ monitors. Software for 3D authoring, game engines, animation, photogrammetry, and VR content creation.
  • Ansin 310 & 311 (VR Suites) — Individual rooms configured for room-scale PCVR with HTC Vive headsets.
  • Ansin 307 (Control Room) — Houses the disguise Director and Actor media servers, RXII render servers, and the EML-XR-NAS.
  • Ansin 809 (Gaming & Immersive Media Lab / GIML) — Additional classroom lab supporting game design, interactive media, and immersive projects.

Articles

Using the XR Studio

The XR Studio is available for curricular use, faculty-approved projects, and approved extracurricular productions. Virtual production access requires completion of a Virtual Production course and a safety orientation. For general EML access, visit the Post Production reservation portal. For production requests, submit an intake form through the EML Zendesk.