Behind the Scenes: Virtual Production Workflow Secrets from Hollywood Studios
To make a film is easy; to make a good film is war. To make a very good film is a miracle.
-Alejandro González Iñárritu
Virtual production stages have grown explosively—from just three in 2019 to over 300 today. This growth shows how visual content creation has fundamentally changed. Productions like “The Mandalorian” have set new standards for modern filmmaking by adopting this technology, revolutionizing the traditional film production workflow.
The numbers tell an impressive story. The global virtual production market is projected to grow at a 17.8% compound annual rate between 2022 and 2030, and demand for real-time 3D artists now outpaces demand for traditional IT skills by 50%. The technology proves valuable across formats, from previs in blockbusters like “Godzilla” and “World War Z” to The Weather Channel’s innovative storm demonstrations.
This piece reveals Hollywood’s virtual production secrets, from LED volume stage setups to real-time asset creation. You’ll find proven workflows that leading studios use to create groundbreaking content, whether you’re a filmmaker, technical director, or industry professional seeking virtual production services.
🎥 Learn Filmmaking from Industry Pros
Get access to free filmmaking courses, expert resources, and top training programs designed to take your skills to the next level.
By signing up, you agree to receive emails from FilmLocal. You may also receive relevant offers from trusted partners. Opt-out anytime. Privacy Policy
Unreal Engine Virtual Production: Pre-Production Secrets
The Unreal Engine virtual production workflow differs from traditional filmmaking methods. Directors and technical teams can control their creative vision better while reducing unexpected issues on set. Game engine technology lets filmmakers adjust camera angles, lighting, and special effects as they work, creating immersive environments that blend seamlessly with physical elements.
Hollywood's approach to virtual scouting
Virtual scouting has revolutionized how Hollywood productions explore filming locations. Directors and cinematographers now use virtual reality headsets to walk through digital sets in Unreal Engine. They can check camera positions, assess lighting conditions, and plan complex shots without visiting actual locations, enhancing the virtual cinematography process.
Filmmakers can access specialized tools through Unreal Engine’s Virtual Production Utilities plugin. The custom VR menu interface offers Navigation, Viewfinder, and Bookmark tools. The system works with many VR headsets including HTC Vive, HTC Vive Pro, Oculus Rift, and Oculus Rift S.
Cinematographers on major productions test lens choices and camera movements before they arrive on set. Katherine Harris Mojica, Virtual Production Supervisor at Magnopus, explains, “It’s where the director or the DP explore their ideas—if that’s the shot that they really wanted or if it’s even possible to do the shot they really wanted”.
Pitchvis techniques that secure funding
Major studios now rely on pitchvis to secure financing. These high-quality preliminary trailers created in Unreal Engine help during development. Real-time 3D artists create digital assets and virtual sets that show the story’s tone and creative vision.
Studios have found great success with this approach. Blockbusters like “Men in Black 3,” “Godzilla” (2014), and “World War Z” used pitchvis to get production funding. The high-quality assets from game engines add value throughout production while cutting costs.
Pitchvis helps investors see a film’s market appeal clearly. Directors can show specific scenes, visual effects, and storytelling techniques that make their projects financially attractive instead of discussing abstract concepts.
Building the virtual art department: Studio structures
The Virtual Art Department (VAD) connects traditional art departments with visualization teams. Work begins when the script is ready and the production designer starts exploring sets and locations. The team creates everything from 3D environments to detailed props in Unreal Engine, replacing the need for extensive green screens in many cases.
The VAD team should start work 12 weeks before the first virtual production shoot day. They work under the production designer’s guidance and incorporate the cinematographer’s lighting input. The VAD collaborates with:
Production designers and set decorators to ensure virtual assets complement physical ones
Directors and cinematographers for camera placement and movement planning
Physical lighting teams to translate virtual lighting into real-world setups
Set designers and builders by providing orthographic imagery for blueprints
ILM’s Virtual Art Department shows this shared approach in action. They work directly with filmmakers to design environments and sets. ILM states, “The ILM VAD Supervisor works closely with department heads involved in the creative decision making process and also manages the deliveries to departments downstream to ensure that Production’s priorities are met”.
This all-encompassing pre-production approach has changed the old “fix it in post” mindset to “solve it in prep.” Filmmakers can now try out shots and see results quickly, enhancing the interactive filmmaking workflow.
On-Set Virtual Production Technology: LED Volume Setup
LED volume stages are the backbone of the modern virtual production workflow. They create immersive digital environments that adapt to camera movement, combining wraparound LED walls with real-time rendering to transform how filmmakers capture in-camera visual effects.
ILM StageCraft vs Sony Innovation Studios: Volume comparisons
The Mandalorian made ILM’s state-of-the-art StageCraft technology famous. The system features a 75-foot-diameter circular LED soundstage that creates a 270-degree immersive environment, where alien landscapes and starship interiors render in real time.
Sony Innovation Studios has developed its proprietary Crystal LED technology. The company built major facilities in Culver City, next to Amazon’s impressive 80-foot-diameter stage with a 26-foot-high volume. Sony showed their dedication to advancing this technology when they acquired Pixomondo and its three LED volumes in 2022.
Each system brings unique advantages. ILM’s StageCraft adapts technology to filmmaker needs instead of forcing directors to change their vision. Sony’s approach enhances their Crystal LED technology to deliver superior image quality and seamless integration with their camera systems.
Real-time rendering optimization techniques
Real-time rendering capabilities power LED virtual production. Project Arena marked a vital advancement: it pairs Chaos Vantage (a real-time version of V-Ray) with version 3.5 of Nvidia’s Deep Learning Super Sampling (DLSS) algorithm. This combination achieves fluid 24fps ray-traced rendering, creating photorealistic visuals in real time.
LED volumes face their biggest problem with realistic parallax effects. Advanced systems now render the entire image in a single pass, unlike traditional methods that separate background and frustum rendering. This approach yields fully resolved imagery regardless of frustum size, without pixel stretching. James Blevins, virtual production producer, explains this technique enables “high-resolution, ray-traced virtual environments within an LED volume” even with complex scenes full of light sources.
Smaller productions can use Unreal Engine on NVIDIA RTX-powered workstations for amazing flexibility. Studios like SOKRISPYMEDIA proved that RTX technology enables:
Real-time rendering that speeds up creative pipelines
Creative exploration without rendering delays
Smooth integration between virtual assets and live-action footage
Camera tracking systems: what top DPs prefer
Virtual production camera tracking technology connects physical camera movements with virtual backgrounds to create convincing parallax effects. Top cinematographers prefer two main approaches for their virtual video production needs.
OptiTrack leads the industry for precision LED volume tracking. It provides sub-millimeter accuracy without drift or electromagnetic interference issues. The CinePuck system emerged from collaboration with cinematographers working on major productions like The Mandalorian. It features an ARRI Anti-Twist mount, 14+ hour battery life, and rugged construction for demanding environments.
Marker-less tracking systems like Sony’s OCELLUS attract more users because of their flexibility and simple setup. This system uses Visual SLAM technology through a multi-eye image sensor to remove the need for infrared markers or extensive calibrations. Richard Thron, a leading cinematographer, values this creative freedom: “Instead of thinking like someone just gathering data for a pipeline, with virtual production, I can look through the lens and see something I didn’t even realize we could grab”.
Perfect results require camera settings synchronized with LED refresh rates. Visual artifacts like ghosting or jitter appear when these systems fall out of alignment. High-end productions genlock the entire system and adjust timing “down to the nanosecond” to achieve perfect synchronization.
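The arithmetic behind that alignment can be sketched in a few lines. This is a simplified illustration, not any vendor's genlock implementation: it only checks the common rule of thumb that the LED refresh rate should be an integer multiple of the camera frame rate, and that an exposure should span whole refresh cycles. The 7680 Hz refresh value is a hypothetical example.

```python
# Illustrative sketch only: real volumes genlock every device to a
# master clock; this shows the arithmetic behind the alignment rules.

def refresh_is_multiple(led_refresh_hz: float, camera_fps: float) -> bool:
    """LED refresh should be an integer multiple of the frame rate."""
    ratio = led_refresh_hz / camera_fps
    return abs(ratio - round(ratio)) < 1e-6

def exposure_time_s(camera_fps: float, shutter_angle_deg: float) -> float:
    """Exposure time derived from frame rate and shutter angle."""
    return (shutter_angle_deg / 360.0) / camera_fps

def refresh_cycles_per_exposure(led_refresh_hz: float, camera_fps: float,
                                shutter_angle_deg: float) -> float:
    """How many LED refresh cycles one exposure captures."""
    return exposure_time_s(camera_fps, shutter_angle_deg) * led_refresh_hz

# 24 fps with a 180-degree shutter against a hypothetical 7680 Hz wall:
print(refresh_is_multiple(7680, 24))               # True: 7680 / 24 = 320
print(refresh_cycles_per_exposure(7680, 24, 180))  # 160.0 whole cycles
```

When the exposure captures a fractional number of refresh cycles, different frames sample the wall at different points in its cycle, which is where the ghosting and jitter described above come from.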
Virtual Production Pipeline: Real-Time Asset Creation
Camera-ready virtual assets are the foundation of any successful virtual production pipeline. These assets need specialized workflows that differ significantly from traditional VFX approaches. The process creates high-fidelity 3D models that render instantly in-camera and eliminate extensive post-production compositing, a significant advancement from traditional CGI methods.
Photogrammetry workflows at major studios
Leading studios have adopted photogrammetry as their main way to capture real-life elements and turn them into digital assets. Clear Angle Studios leads the market in 3D capture for film and high-end TV. They employ multiple specialized rigs for different assets:
Facial capture systems for detailed character creation
Full body photogrammetry with 204-camera arrays for hero cast scanning
Prop and environmental LiDAR scanning for set recreation
Gentle Giant Studios, a 3D Systems company, has changed on-location asset creation with its Juggernaut mobile studios. These units pack 150-200 cameras that produce raw body scans with over 100 million polygons. This breakthrough has reduced scan time by 96% and data processing time by 80% compared to traditional workflows.
Proprietary asset libraries and management systems
Studios use sophisticated Digital Asset Management (DAM) systems to handle their expanding virtual asset libraries. These central platforms support the entire virtual production workflow and manage everything from raw scans to finished assets.
High-end DAM systems come with automatic metadata tagging, version control, and role-based access permissions. Das Element helps VFX teams maintain organized element libraries across studio-wide and project-specific collections.
3D assets bound for Unreal Engine virtual production need physically based materials and support for dynamic lighting. They also need proper categorization through consistent file names, folder structures, and searchable tags. Quick retrieval during time-sensitive shoots depends on this organization.
Quality control processes for camera-ready assets
Rigorous quality control procedures mark the final step before assets reach the virtual set. Automated QC tools like Pulsar run detailed checks on media files throughout content creation. The system supports the latest technologies, including HDR, IMF, and 4K—key standards for modern virtual production.
IBM’s Maximo Visual Inspection takes a different approach and uses computer vision AI to spot defects and ensure asset quality. Technicians can create self-learning models that improve quality control through immediate end-to-end defect detection.
Studios maintain strict standards for asset optimization throughout this process. They ensure efficient rendering while keeping photorealistic quality. Raw photogrammetry scans end up as camera-ready virtual environments that directors and performers can use right away.
Unreal Virtual Production Workflow: On-Set Collaboration
Virtual production workflows thrive on collaboration during active shooting days. Technical expertise and creative vision must go together in real-time. The success or failure of a day’s work depends on this delicate balance in major productions, with real-time interaction being key to the process.
Brain Bar Operations: Command center secrets
The “brain bar” stands at the heart of virtual production. This specialized team of artists and engineers operates equipment that powers the smart stage. People also know it as “volume control” or “mission control.” This technical hub manages content distribution, image manipulation, camera tracking, recording, and creative data visualization. The brain bar keeps the LED volume running smoothly while working with departments of all sizes throughout the production.
Key positions within this command center include:
Virtual Production Supervisor – Acts as liaison between real-time crew, art department, physical production team, and post-production
LED Engineer – Operates and maintains the LED walls in a volume
Technical Director – Bridges artistic vision with technical execution
Director-technical director communication protocols
Virtual production demands specialized communication protocols between directors and technical teams. Directors must use “a different part of their brain” when shooting virtual productions: seamless lighting integration and real-time environment adjustments require constant coordination between creative and technical staff.
Technical directors work as the director’s partners in problem-solving on high-end productions. They position themselves right next to directors and first assistant directors during shoots. This close proximity allows immediate feedback and on-the-spot adjustments to virtual environments.
Real-time problem solving during shoots
Teams must diagnose and fix problems instantly when issues arise on virtual production stages. Color mismatches between virtual and physical elements, moiré patterns on LED walls, and tracking synchronization issues create common challenges.
To name just one example, technicians must offset colors immediately when metameric failure occurs (when content looks correct to the eye but not on camera). Teams use real-time camera tracking data to adjust virtual backgrounds automatically based on camera position when parallax problems occur.
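The geometry behind those parallax corrections is straightforward perspective projection. The sketch below is a minimal one-axis illustration, not any vendor's tracking algorithm: it intersects the ray from the tracked camera to a virtual point with the LED wall plane to find where that point must be drawn.

```python
# Minimal one-axis sketch of wall-plane projection; real systems do
# full 3D frustum projection from calibrated tracking data.

def wall_projection(cam_x: float, point_x: float,
                    point_depth: float, wall_depth: float) -> float:
    """X position on the wall where a virtual point must be drawn.

    cam_x:       tracked lateral camera position (meters)
    point_x:     virtual point's lateral position (meters)
    point_depth: virtual point's distance from the camera plane
    wall_depth:  LED wall's distance from the camera plane
    """
    # Intersect the camera->point ray with the wall plane.
    return cam_x + (point_x - cam_x) * wall_depth / point_depth

# A landmark 100 m "behind" the screen, with the wall 5 m away:
a = wall_projection(0.0, 10.0, 100.0, 5.0)  # 0.5
b = wall_projection(2.0, 10.0, 100.0, 5.0)  # camera dollies 2 m right: 2.4
print(a, b)
```

Note that the drawn position moves 1.9 m when the camera moves 2 m: relative to the camera the distant point shifts only 0.1 m, which is exactly what keeps far scenery looking stationary as the camera dollies.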
Netflix’s virtual production initiative shows how this collaboration expands beyond physical boundaries: “The partnership between Epic and our NLAB unlocked the ability to connect a DP in New York with VFX artists in London, a director in Los Angeles with an art department in Japan”. Modern virtual production workflow makes creative problem-solving possible across continents.
Post-Production Integration in Virtual Film Production
Virtual and traditional post-production elements create unique challenges that need specialized workflows and quality control measures. Virtual production captures many visual effects in-camera, yet post-production plays a vital role in refining, integrating, and delivering the final product.
Seamless handoff between virtual and traditional VFX
Post-production handoff needs careful planning and standard protocols. Historically, visual effects were handled mostly during post-production; virtual production has shifted many VFX tasks to earlier stages. Data from virtual production must be properly formatted for downstream teams after principal photography.
MPC studios make use of open-source Universal Scene Description (USD) to optimize this handoff. Rob Tovell, MPC Global Head of Pipeline explains, “We can save USD data directly from the game engine and send those USD files straight to post where our layout team can use that as reference or build directly on that data to start working on the final shot”. This approach helps productions run smoothly for projects of all sizes.
Color management across the pipeline
Color consistency throughout virtual production creates major technical challenges. LED wall filming needs specific color management solutions to match colors imagined in preproduction. ARRI Color Management for Virtual Production offers one solution that allows “precise calibration of LED walls in ICVFX environments, substantially reducing the work needed on set and in post to recover missing color information”.
Cameras, LED displays, real-time rendering engines, and physical lighting create dynamic interactions that often lead to color consistency problems. The ARRI calibration system now supports full Open Color IO and makes “it easier to configure your stage to optimal settings”. Any application supporting OCIO can now apply this calibration, including most 2D playback systems.
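At its simplest, this kind of calibration boils down to applying a measured color transform to every pixel. The sketch below illustrates the idea with a made-up 3x3 matrix; real stages apply measured matrices and LUTs delivered through OCIO, not hand-typed numbers like these.

```python
# Illustrative only: a 3x3 matrix correcting the camera's view of the
# LED wall. The coefficients below are invented for the example.

CALIBRATION = [
    [1.05, -0.03, -0.02],
    [-0.02, 1.04, -0.02],
    [0.00, -0.05, 1.05],
]

def calibrate(rgb: tuple) -> tuple:
    """Apply the 3x3 calibration matrix to one linear RGB triple."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in CALIBRATION)

# Each row sums to 1.0, so neutral grays pass through unchanged while
# off-axis colors get nudged back toward the intended values.
print(calibrate((0.5, 0.5, 0.5)))  # (0.5, 0.5, 0.5)
```

Keeping neutrals invariant while correcting saturated colors is a common design goal for wall calibration, since skin tones and gray charts are what colorists check first.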
Quality control and final delivery standards
Quality control serves as the final checkpoint in virtual production post-processing. Netflix has created standard QC procedures for virtual production that establish “quality control measures by creating a protocol for flagging, communicating, and solving issues”. Their system groups QC errors into three severity levels:
Grade 1: Issues that should be fixed but aren’t critical to pass
Grade 2: Problems that may affect distribution and are advised to be fixed
Grade 3: Critical issues that impact program quality and will not meet distribution requirements
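That triage logic is easy to picture in code. The error codes, departments, and grades below are invented examples for illustration, not entries from Netflix's actual glossary:

```python
# Hypothetical sketch of three-level QC triage; codes and grades
# are made up, not drawn from any real production glossary.
from dataclasses import dataclass

SEVERITY = {
    1: "should be fixed, not critical to pass",
    2: "may affect distribution, fix advised",
    3: "fails distribution requirements",
}

@dataclass
class QCIssue:
    code: str        # glossary-style error code, e.g. "LED-07"
    department: str  # which team owns the fix
    grade: int       # 1 (minor) .. 3 (critical)

def gate_delivery(issues: list) -> bool:
    """A deliverable passes only if it has no Grade 3 issues."""
    return all(issue.grade < 3 for issue in issues)

report = [QCIssue("LENS-03", "camera", 1),
          QCIssue("LED-07", "volume", 3)]
print(gate_delivery(report))  # False: the Grade 3 issue blocks delivery
```

Grades 1 and 2 are logged and communicated to the relevant departments; only a Grade 3 issue actually blocks the deliverable.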
Netflix’s Production QC Glossary, developed with global dailies experts, provides common language to document and communicate quality issues. The glossary uses uniquely coded error classifications organized by department and severity. This helps teams communicate clearly across all production stages.
Virtual production optimizes traditional post-production tasks. Larry Jordan points out, “Being able to create and change digital settings on the fly can really cut costs”. Quality control remains essential to ensure the final product meets all technical and creative standards before delivery.
Ready to Do Virtual Production?
Modern filmmaking has undergone a substantial change through virtual production. Advanced technology and streamlined processes have replaced traditional filmmaking methods. Today’s major studios create groundbreaking content using sophisticated LED volumes, real-time rendering, and dedicated virtual art departments.
Virtual production combines creative vision with technical excellence. Directors can explore digital environments through virtual scouting before filming starts. LED volume stages let teams visualize complex scenes right away. Teams across the globe work together efficiently with real-time asset creation systems.
The brain bar acts as the command center and orchestrates the complex interplay between technical systems and creative teams. Strict quality control ensures consistent results from start to finish. Virtual production has become an essential filmmaking tool. Directors can achieve their creative vision and retain control over costs and schedules.
Virtual production has become more than a technological advancement. It creates a new creative space where imagination meets instant visualization. This change continues to shape content creation’s future. Shots that seemed impossible before are now achievable, and creators can keep pushing the boundaries of visual storytelling in film and television production.
While you’re at it, you should check out more of FilmLocal! We have plenty of resources, and cast and crew. Not to mention a ton more useful articles. Create your FilmLocal account today and give your career the boost it deserves!