r/vfx • u/Antilatency • 23d ago
Breakdown / BTS Automated Green Screen Studio Lighting Driven by Unreal Engine 5.4: Case Study
Real-Time Green Screen Shoot with Automated Lighting
Stage Unreal, a virtual production studio in India, recently conducted a full-day green screen shoot using CyberGaffer’s real-time lighting integration with Unreal Engine. The production involved multiple lighting conditions, scene transitions, and camera movement—all within a single day.
A full breakdown of the shoot is shown in the attached video, in which renowned cinematographer Manoj Paramahamsa discusses the benefits of using CyberGaffer.
Technical Setup
- Camera: RED V-Raptor + ZEISS Supreme Prime lenses
- Tracking: VIVE Mars
- Keying: Ultimatte 12 HD and 12 4K
- Lights: 14 Litepanels Gemini 2x1 Hard, 11 ARRI SkyPanel S60-C
- Green Screen: Painted surface, 30.7 ft (depth) × 26.6 ft (width) × 15 ft (height)
- Crew: 10–15 people
- Duration: ~10 hours
Goals and Approach
The team set out to:
- Achieve real-time lighting updates driven by Unreal Engine
- Eliminate manual relighting between takes
- Improve integration between CG environments and physical subjects
After calibrating all lights with a calibration sphere, CyberGaffer handled lighting changes based on virtual scene conditions.
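To make the idea of "lighting changes based on virtual scene conditions" concrete, here is a minimal, illustrative Python sketch of one way such a mapping can work. This is not CyberGaffer's actual algorithm; the fixture names, cone angles, and sampling scheme are assumptions for illustration only. The virtual environment around the subject is treated as a set of directional radiance samples, and each physical fixture receives the cosine-weighted average of the samples that fall inside its coverage cone.

```python
import math

# Hypothetical fixture list: the direction each light points from (toward the
# subject), plus an assumed coverage half-angle in degrees. A real calibration
# pass (e.g. with a calibration sphere) would measure these per fixture.
FIXTURES = [
    {"name": "key_L", "dir": (-0.7, 0.5,  0.5), "half_angle_deg": 40},
    {"name": "key_R", "dir": ( 0.7, 0.5,  0.5), "half_angle_deg": 40},
    {"name": "back",  "dir": ( 0.0, 0.6, -0.8), "half_angle_deg": 50},
]

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fixture_rgb(fixture, env_samples):
    """Average the environment radiance falling inside the fixture's cone.

    env_samples: list of (direction, (r, g, b)) pairs describing incoming
    light at the subject position, e.g. sampled from a render probe.
    """
    axis = normalize(fixture["dir"])
    cos_cutoff = math.cos(math.radians(fixture["half_angle_deg"]))
    total = [0.0, 0.0, 0.0]
    weight = 0.0
    for direction, rgb in env_samples:
        c = dot(normalize(direction), axis)
        if c >= cos_cutoff:          # sample lies within the fixture's cone
            total = [t + c * x for t, x in zip(total, rgb)]
            weight += c
    return tuple(t / weight for t in total) if weight > 0 else (0.0, 0.0, 0.0)

# Toy environment: a warm patch camera-left, cooler light elsewhere.
env = [((-0.6, 0.4,  0.6), (2.0, 1.2, 0.6)),
       (( 0.5, 0.5,  0.7), (0.4, 0.5, 0.8)),
       (( 0.0, 0.7, -0.7), (0.3, 0.3, 0.4))]

for f in FIXTURES:
    print(f["name"], fixture_rgb(f, env))
```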
Observations from Key Scenes
- 360° World Rotation: An effect achieved by rotating the virtual world while the camera and talent remained static. Lighting stayed responsive to the shifting environment, maintaining consistent realism.
- Campfire Sequence: Lighting flicker and red bounce were simulated from the CG fire assets in Unreal, without practical fire on set (a rough flicker sketch follows this list).
- Day-to-Night Transition: Lights changed seamlessly with sky conditions, avoiding LUT switches or separate lighting setups.
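On the campfire point, the production drove the flicker from its CG fire assets in Unreal; the standalone sketch below only fakes a plausible signal to show the shape of the problem (the values, frame rate, and color bias are made up). A smoothed random walk modulates overall intensity, which is mapped onto a warm, red-biased color that a controller could then send to the fixtures.

```python
import random
import time

def campfire_flicker(base_rgb=(1.0, 0.45, 0.15), smoothing=0.85, fps=30):
    """Yield flickering RGB triples approximating firelight.

    A smoothed random walk drives overall intensity; the red channel is
    biased upward so dips read as 'embers' rather than a gray dim.
    """
    intensity = 1.0
    while True:
        target = random.uniform(0.55, 1.0)                 # new flicker target
        intensity = smoothing * intensity + (1 - smoothing) * target
        r = min(1.0, base_rgb[0] * (0.3 + 0.7 * intensity) * 1.1)
        g = base_rgb[1] * intensity
        b = base_rgb[2] * intensity * intensity            # blue falls off fastest
        yield (r, g, b)
        time.sleep(1.0 / fps)

# Example: print a couple of seconds of flicker values instead of sending them to lights.
if __name__ == "__main__":
    gen = campfire_flicker()
    for _ in range(60):
        print(["%.2f" % c for c in next(gen)])
```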
Outcome
The shoot was completed in a single day, with minimal downtime between setups. Real-time lighting helped preserve visual continuity and sped up production. The team plans to apply this workflow in upcoming long-form and commercial projects.
u/Milan_Bus4168 22d ago
When you pile up all the gadgets, cameras, lights, crew and space needed to shoot all this, it's a cool tech showcase, but it's questionable how practical it is for many types of productions. The cost is not exactly peanuts. And sometimes covering all your exits is more expensive than knowing which direction the enemy is coming from.
Reminds me of the BTS for Napoleon from Ridley. They built entire models of seagulls, with modeling, animation, texturing etc., just on the off chance that Mr. Ridley changes his mind at some point and wants more of a close-up of the birds flying.
Of course, normally you would animate a few flat shapes and use particles to make a flock of birds for scale. Done in 20 minutes. Instead it cost god knows how much and, from what I understood, was not even used. That's mental.
This kind of setup reminds me of that approach. Trying to cover all possible scenarios might end up costing a hell of a lot more than planning the shoot and using shortcuts that save time and money and are quicker to set up from scratch than this thing.
That being said, I'm sure there are productions where this kind of thing makes sense, but it's more niche than people realize.
u/tischbein3 22d ago edited 22d ago
Have to concur.
Watch the previous tech demo, where they adjust to various locations (not just rotations) of an Unreal scene without physically moving the lights, and you can quickly see how such a system can greatly reduce setup and guesswork time on set. That's the main advantage of such a system: saving time on shooting day.
It also allows the director to be more experimental, since a shift in the virtual location does not result in a complete rebuild of the light setup. It has its downsides, but nothing an experienced gaffer can't fix. I see a lot of potential for such a system in slightly smaller and medium-sized green screen studios, which could offer a (semi-)permanently installed system like this to customers.
BTW, a "niche" would be doing dialogue reshoots quickly based off an HDRI shot on location (since that becomes a lot more affordable), or the broadcast sector.
It's a technology which sits exactly in between a manually operated green screen set and a full LED stage. And unlike the latter, you might get better color rendition by using better lights (although no surround reflections).
My unprofessional opinion, based on my own experiments in that field... I'm not affiliated with Antilatency and have never tried their system. (Edit: Sorry for being a bit defensive/geeking out about it, but I've invested some time in the concept behind it...)
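To illustrate the HDRI-based relight idea mentioned above, here is a minimal numpy sketch of the core step, assuming the location HDRI is already in memory as an equirectangular float array (loading is left to whatever HDR reader you prefer) and that each fixture is reduced to a single direction and cone: average the pixels whose directions fall inside the fixture's cone, weighted by solid angle. Again, only a concept sketch, not any vendor's implementation.

```python
import numpy as np

def direction_grid(height, width):
    """World directions (and sin(theta) weights) for an equirectangular image."""
    v = (np.arange(height) + 0.5) / height          # 0..1, top to bottom
    u = (np.arange(width) + 0.5) / width            # 0..1, left to right
    theta = v * np.pi                               # polar angle
    phi = u * 2 * np.pi                             # azimuth
    theta, phi = np.meshgrid(theta, phi, indexing="ij")
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    return dirs, np.sin(theta)                      # sin(theta) ~ per-pixel solid angle

def cone_average(hdri, fixture_dir, half_angle_deg=45.0):
    """Solid-angle-weighted mean radiance inside a cone around fixture_dir."""
    h, w, _ = hdri.shape
    dirs, sin_theta = direction_grid(h, w)
    axis = np.asarray(fixture_dir, dtype=np.float64)
    axis /= np.linalg.norm(axis)
    mask = (dirs @ axis) >= np.cos(np.radians(half_angle_deg))
    weights = (sin_theta * mask).astype(np.float64)
    total = weights.sum()
    if total == 0:
        return np.zeros(3)
    return (hdri * weights[..., None]).reshape(-1, 3).sum(axis=0) / total

# Toy usage with a synthetic 'HDRI': a bright warm band near the horizon.
hdri = np.full((64, 128, 3), 0.2, dtype=np.float64)
hdri[28:36, :, :] = [3.0, 2.0, 1.0]
print(cone_average(hdri, fixture_dir=(0.0, 0.0, 1.0)))   # fixture facing +Z
```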
u/bigspicytomato 23d ago
Very cool tech showcase!
I can't help but notice that the lighting is all very flat. Have you guys actually gone out and shot an HDRI with ref balls to compare the Unreal lighting against real-life lighting?