Polymotion Stage is a partnership between Dimension, MRMC, and Nikon, providing the world’s first multi-solution mobile capture studio. The Polymotion Stage can capture volumetric video, volumetric images, and avatars. It is also equipped with state-of-the-art motion and tracking systems, so you can bring incredible accuracy to prop tracking should your capture require it.
The fixed capture solution at Dimension launched in London in 2017 and was the first commercial Microsoft Mixed Reality capture studio globally. The mobile solution, ‘Polymotion Stage’, comes in two forms: the Stage Dome, and the Stage Truck, which transforms into a self-contained capture space. Both the Stage Dome and Stage Truck can provide 2K and 4K resolutions.
The Polymotion Stage Truck and the London fixed studio have an 8-foot-diameter capture volume, while the Polymotion Stage Dome’s is 10 feet. Most scenes can be accommodated within these capture volumes: we can comfortably film one, two, or three people together, or film performers individually and composite them into the scene afterwards. For example, Dimension filmed over 30 actors and reassembled them in a Viking longboat for ‘Virtual Viking - The Ambush’.
Volumetric video capture is a technique in which a human performance is filmed from all angles to create a three-dimensional (3D) video, allowing the user to view the performance from any angle. Dimension-operated volumetric capture stages use Microsoft’s Mixed Reality Capture technology.
Once processed, volumetric video content can be placed into any AR, MR, VR, 360, or 2D video application. Volumetric content is also compatible with broadcast software packages, providing extra depth and richness to augmented content in virtual studios.
Green-screen capture records images of people for inclusion in 2D media such as film or print, and the result is viewable only from the captured plane. Volumetrically captured content provides a 3D asset that lets the viewer see the performance or talent from all angles, whether on broadcast or in AR, VR, and XR. It also gives creatives and directors free-viewpoint virtual cameras.
As soon as your processed performance has been imported into real-time 3D software, you can position the model at any angle, anywhere in the scene.
All Dimension-operated volumetric capture stages have a minimum of 106 synchronised cameras: 53 RGB cameras, which record the colour required for the .png texture map, and 53 infrared (IR) cameras, which record depth and position in space for creating the mesh. Our mobile stages have an additional four cameras (two IR and two RGB) on the floor shooting upwards, to ensure greater detail when capturing movements that require the head to face down.
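The camera configuration above can be tallied as a quick sanity check. A minimal sketch, using only the counts stated in the text:

```python
# Camera counts per stage, as described above.
FIXED_STAGE = {"rgb": 53, "ir": 53}
# Mobile stages add 2 RGB + 2 IR floor cameras shooting upwards.
MOBILE_STAGE = {"rgb": 53 + 2, "ir": 53 + 2}

def total_cameras(stage: dict) -> int:
    """Total synchronised cameras in a stage configuration."""
    return sum(stage.values())

print(total_cameras(FIXED_STAGE))   # 106 cameras in the fixed stage
print(total_cameras(MOBILE_STAGE))  # 110 cameras in the mobile stages
```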
In our fixed capture solution we offer a 2K capture resolution. For our mobile solution we can provide either a 2K or 4K capture resolution.
Eight microphones are rigged inside the stage for recording sound, and lavalier mics can also be incorporated to capture broadcast-quality sound. Directional sound recording is also available.
As standard we estimate for and shoot at 30fps, and we can shoot at up to 60fps. We are happy to test 90fps and above if you have a specific requirement.
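Combining these frame rates with the camera counts mentioned earlier gives a feel for the throughput involved. Illustrative arithmetic only, assuming the minimum of 106 synchronised cameras:

```python
# Images generated per second of capture: every camera fires on every frame.
CAMERAS = 106  # minimum synchronised cameras per stage

def images_per_second(fps: int, cameras: int = CAMERAS) -> int:
    """Number of individual camera images produced per second of capture."""
    return fps * cameras

print(images_per_second(30))  # 3180 images/s at the standard 30fps
print(images_per_second(60))  # 6360 images/s at 60fps
```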
Our software relies on light and colour to join up the images captured by each camera. Different materials absorb infrared at different levels, and we rely on this information to create a detailed mesh. To ensure the quality of our captures, we carry out quick but rigorous wardrobe tests prior to the shoot to flag any potential issues, so that shoot day is all about the performance, not the technology.
Capture times can range anywhere from seconds to minutes. At 2K resolution, footage is captured at 10GB per second. We can capture up to an hour of footage to be processed in one day, after which the data needs to be transferred onto a local server farm.
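These figures imply substantial raw data volumes. A minimal sketch, assuming the quoted constant rate of 10GB per second at 2K (real rates will vary with resolution and scene content):

```python
# Rough storage estimate for raw volumetric capture data,
# assuming a constant 10 GB/s capture rate at 2K resolution.
CAPTURE_RATE_GB_PER_S = 10

def raw_data_gb(duration_s: float) -> float:
    """Approximate raw data volume in GB for a capture of the given length."""
    return duration_s * CAPTURE_RATE_GB_PER_S

print(raw_data_gb(30))           # 300 GB for a 30-second take
print(raw_data_gb(3600) / 1000)  # 36 TB for a full hour of capture
```

At this rate even a short take produces hundreds of gigabytes, which is why data is offloaded to a local server farm after each capture day.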
We prefer no more than three people to be captured in the stage at once, depending on the actions required by the scene. To ensure that the cameras capture all of your performers’ details without camera occlusion, our technical team can advise on the best scenario for the creative output required. Scenes that require more people can be composited together after filming the characters separately.
The first example of volumetric capture being used in a live broadcast was Madonna’s groundbreaking performance at the 2019 Billboard Music Awards, where she performed ‘Medellín’ on stage alongside four volumetrically captured holograms of herself, integrated into the choreography.
Volumetric capture was also used by Sky Sports at the 148th Open, transforming their ability to show viewers golfers’ swings in incredible detail during analysis.
We can process content quickly and generally require one to three weeks depending on the length of footage required. For content needed quickly, we have developed a rapid pipeline that can turn around assets from capture to broadcast in 48 hours; this is exactly the process that enables Sky Sports to capture the world’s leading golf talent and share their swing analysis on broadcast days later. Depending on the creative brief, our team will work with you to calculate accurate timings for delivery.
Supported formats and systems include Brainstorm, Pixotope, Frontier and Vizrt, plus MP4, WebAR, UE4 and Unity. We can also export for VFX and NLE pipelines such as Maya, 3DS Max, Cinema 4D, Houdini and Nuke. Please get in touch for more details.
As well as entertainment, sports, and games, volumetric video is already being used in advertising and marketing campaigns (such as Michael Bublé’s AR selfie app), in museums, and at UNESCO Heritage Sites (‘If these walls could talk’). Volumetric video has also been used by Pearsons to create virtual patients for training medical professionals.
Examples of volumetric content we have created:
Live sports broadcast: Golfer’s swing analysis on Sky Sports’ coverage of The 148th Open
Live performance: Madonna’s Medellín live performance at the 2019 BBMAs
Theme park rides: Ridley Scott’s ‘The Ambush’ at The Viking Planet
Immersive theatre: Jeff Wayne’s ‘The War of the Worlds’; ‘All Kinds of Limbo’ at the National Theatre
Interactive Apps: Michael Bublé’s O2 app
Games: Andy Murray’s Champion’s Rally at Wimbledon
Historical installations: ‘If these walls could talk’ at the UNESCO Heritage Site
Medical training: Pearsons’ Holographic Healthcare training
Fashion: Nike’s Air Jordan Future campaign