Dimension has teamed up with DNEG, Unreal Engine, ARRI, Mo-Sys, 80six, ROE, Brompton Technologies and Malcolm Ryan to explore using LED stages and real-time engines for virtual production. This test, featuring live actors and creative teams, builds on the work Dimension is already doing in virtual production using Unreal Engine.
This technology was previously used on Disney's The Mandalorian, where it largely replaced traditional location shoots.
While the workflow for using real-time engines in virtual production is still evolving, the technology is already helping to unlock productions at a time when it is difficult to get shoots underway.
The cinematic, photo-realistic, live 3D worlds created by Dimension are ideal for virtual production. Creating a virtual set in real time supports every stage of virtual production, from pre-viz to principal photography. Scenes running in Unreal include realistic environments based on real-life locations, recreated using our photogrammetry pipeline. Using real-time scenes reduces the need for on-location filming.
“The ability to control all the aspects of reality within a set that in every other respect can look completely convincing and photo-realistic. I think this is going to be a bit of a revolution in film-making,” said Paul Franklin, DNEG Co-Founder and Creative Director.
LED screens run realistic, complex scenes created in the Unreal real-time game engine. Building vast sets in real time lets us adjust lighting and add CGI or animation, effects which are time-consuming and risky on a live set. Multiple sets, environments and landscapes can be captured on a single stage, and creative decisions can be made and actioned in real time.
The LED stage features screens provided by 80six, including a 2mm pixel-pitch screen along the back and a 3mm lightweight carbon product along the ceiling. Side screens for adjustable lighting and reflection are a 5mm outdoor product, brighter than the other screens. Mo-Sys VP Pro real-time tracking works with Unreal Engine and ARRI cameras to render the correct perspective on the LED screens for the camera's current position.
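The core idea behind this tracking-to-render loop is off-axis (generalized) perspective projection: the tracked position of the physical camera becomes the virtual eye point, and the frustum is fitted to the fixed LED wall rather than to a symmetric on-axis view. As a minimal illustrative sketch (not Mo-Sys's or Unreal's actual implementation), the standard construction from a screen's corner positions and a tracked eye position looks like this:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Projection matrix for a tracked eye at pe viewing a physical screen
    with corners pa (lower-left), pb (lower-right), pc (upper-left).
    Follows the well-known generalized perspective projection construction."""
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))

    # Orthonormal screen basis: right, up, and normal (pointing at the viewer).
    vr = pb - pa; vr /= np.linalg.norm(vr)
    vu = pc - pa; vu /= np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)

    # Vectors from the eye to each screen corner.
    va, vb, vc = pa - pe, pb - pe, pc - pe

    # Perpendicular distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents, scaled onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style asymmetric frustum matrix.
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])
    return P, (l, r, b, t)
```

When the eye sits centred in front of the wall the frustum is symmetric; as the camera moves off-centre the frustum skews so the rendered image stays correct from the camera's viewpoint, which is exactly why in-camera parallax on the LED wall looks right through the lens.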
Actors are filmed performing in the space created by the surrounding screens. This technology means there is no need for a green screen that has to be replaced in post-production and no rear projection.
Teams are able to join remotely from anywhere in the world, viewing feeds from multiple cameras. In a world where production has been heavily disrupted, these new ways of working are driving both efficiency and brand-new possibilities.
The results of this creative test will be shared in the coming weeks, when we will also explore the transmedia possibilities offered by combining virtual production and volumetric capture.