On The Road To Glory

After eight gruelling months of shooting in Namibia’s Namib Desert, Margaret Sixel sifted through 470 hours of footage to create a 120-minute extravaganza that shocked, awed and surpassed cinematic expectations, and landed her the ‘Best Film Editing’ Award at the 88th Academy Awards.

‘Mad Max: Fury Road’ is the fourth film in veteran director George Miller’s Mad Max franchise and the first Mad Max film in 30 years. Set in a stark desert landscape, the film stars Tom Hardy as Max and Charlize Theron as Furiosa. The two team up to face a relentless enemy as they try to escape the desert wasteland on the Fury Road.

Margaret Sixel with director George Miller

The film was well worth the wait—the post-apocalyptic thriller is a technical and creative feat, receiving universal acclaim from audiences and critics alike. But the creative journey wasn’t easy. An intense eight-month shoot in the desolate Namib Desert in southern Africa was followed by more than two years of post-production. To cut the film, Miller’s long-time editor Margaret Sixel and her team had to weave together more than 470 hours of location footage into what would eventually become the final 120-minute film.
Not to mention the 2,700 individual shots that make up this 120-minute movie; a single misjudged frame in any of them could jar the entire viewing experience.
To achieve this monumental task, Sixel and her team relied on creative tools built upon the Avid MediaCentral Platform. By embracing Avid Everywhere, they were able to overcome immense challenges, including harsh production environments, limited Internet bandwidth, and great distances between creative teams to deliver one of the most innovative and exciting films of this generation.

The team used up to 20 cameras at one time – from Arri Alexas to Canon 5Ds – that were often fixed to moving vehicles to capture the action scenes

Staggering production challenges
The majority of ‘Mad Max: Fury Road’ revolves around an intense road battle, with dozens of vehicles, hundreds of extras, and mind-boggling stunts. To create these sequences, the team shot with up to 20 cameras at one time—from Arri Alexas to Canon 5Ds—that were often fixed to moving cars and trucks.
The editorial team on location needed to process 10 to 20 hours of footage on a daily basis—from ingest through transcoding to clearance—before finally sending media to the cutting rooms in Sydney. Furthermore, the more complex stunts had to be processed first to meet the shooting schedule and ensure they had been captured properly.
“All of the footage had to be flown back to Australia, which took at least three days, with essential material uploaded to Sydney daily,” explained Matt Town, Post Production Supervisor. “Unfortunately, the limited and inconsistent Internet bandwidth in Namibia made this extremely difficult. With over 470 hours of film, including 90 minutes of complex stunt work, the complexity of managing and sharing such a massive amount of media was staggering.”

Dealing with on-ground challenges
In order to meet these challenges, the team turned to Media Composer and ISIS shared storage. The system comprised nine Avid Media Composer stations running off a 44 TB ISIS system, including three Media Composer | Nitris DX systems to provide editors with the highest performance for editing real-time effects. Assistant editors used six additional stations to cut the film, with one running remotely in an in-house DI suite to facilitate greater integration with the DI process.
“For the most part the film was shot chronologically, which proved very useful,” Sixel explained. “Throughout the eight-month shoot in Southern Africa, I was in Australia building the cut. It was important to create decent cuts of the film early, so I could give George feedback and discuss pickups.”

All footage had to be flown back to Australia,
which took at least three days

The Media Composer system on location had its own separate ISIS system containing a copy of all the footage, which helped the teams overcome the difficulty of uploading cuts. Avid bins containing sequences were sent via email, which allowed them to turn around sequences in as little as an hour—from the cutting rooms in Sydney to the desert of Namibia. The Avid system held up well over the two-year post-production process, including the time spent in the heat and dust of the Namib Desert.
“Throughout the entire process, Media Composer and ISIS allowed us to handle a massive amount of footage with ease,” said Town. “The ability to share bins enabled us to spread the load of sorting and arranging the footage across multiple assistants, so it could be easily reviewed. This is the bedrock upon which the whole editorial process stands. If you can’t find footage easily, you simply cannot edit it.”
“Our Avid workflow handled it brilliantly. It’s really the whole Avid package that makes it the only choice for professional editors,” said Sixel.

Working the material to death
Once the dust settled on the Namibia shoot, the real challenge for Sixel and her team began. Sixel spent countless hours experimenting, re-examining material, and searching for overlooked gems to provide Miller with as many creative choices as possible. The team created an enormous number of comps to build hundreds of shots, often combining up to eight layers. For example, they added missing vehicles, flames, muzzle flashes, extra War Boys on the War Rig, and different backgrounds.
“I try to be as prepared as possible before Miller steps into the room, and I know what he expects,” said Sixel. “This was not an easy film to cut, and the overall orchestration was more challenging than any one scene. I created multiple versions of each scene and ordered them in terms of my favourites. Some of it looks deceptively simple, but the variables were enormous. Every sequence had many hours put into it, and every scene had to earn its place. No fat. No repetition.”
Once Miller was happy with a scene, Sixel asked the assistant editors to string together every available option for every shot, including numerous comp options. To create the alternate comp versions, the editors varied the timing of elements, used different takes, and experimented with alternate backgrounds. Miller and Sixel would then review all the alternatives and refine the cut further.

The team created an enormous number of comps to
build hundreds of shots, often combining up to eight layers

An intuitive editing process
“It was wonderful having all of this stunning material, but it could be mind-boggling at times,” Sixel recalled. That is always a challenge when you have visually incredible footage, like the screaming War Boys on the War Rig or other action sequences. At times like these, Sixel had to stick strictly to the plot and drop some scenes, because too much action can leave the viewer feeling jaded. It meant looking at the same scenes repeatedly.
“The number of possible variations was daunting, and we constantly returned to the original scene bins. When you first put a scene together, you look at a particular piece of footage one way. But as the film matures, you might look at the same material differently and see a new way to use it. We worked the material to death,” Sixel said.
The team’s relentless work ethic paid off, and ‘Mad Max: Fury Road’ became a blockbuster success. Sixel believed that although the process was very time-consuming, it was deeply satisfying to see the amazing results. She compared the process to creating a mosaic work of art or a musical composition: subtle rearrangements can have a potent effect on the final cut.
Sixel felt that editing the film on Media Composer made the creative process very straightforward and intuitive. “I could sleep easily at night knowing that our Avid systems would keep track of the hundreds of edit decisions we made every day,” she added.
Ask her the secret of perfect editing and she reveals that it is not merely a technical exercise. Instead, she said, editing is intellectual, emotional and, ultimately, like all artistic endeavours, intuitive. And as Jonas Salk said, the intuitive mind will tell the thinking mind where to look next. The same principle applies to editing, too.

Make it real

Elia Petridis explains the power and potential of VR to shake up traditional storytelling

Virtual reality appears to hold much potential for filmmakers to reach audiences in an immersive way, but so far the medium has been used more as a means to put the user in the shoes of another person in a real-life setting – such as a refugee or a person caught up in a warzone – which lends itself more to documentary and news filming.
But what about VR as a medium for fictional storytelling? Hollywood studios are already dipping their toes in the water by releasing VR experiences to promote feature films such as Ridley Scott’s The Martian. And while the future trajectory of VR in film is far from clear, it looks set to become part of the arsenal of filmmakers around the world, whether for short films, advertising-related content or news and documentary footage.
One filmmaker keen to tap the potential of VR is Elia Petridis, who is best known for directing The Man Who Shook the Hand of Vicente Fernandez, which starred US legend Ernest Borgnine in his final role. Petridis, who heads up his own film production company Filmatics, and splits his time between Dubai and Los Angeles, sees VR as a medium with huge potential to engage audiences and bring a new dimension to film.
Indeed, Petridis says that many of the current early exponents of VR are working “in the scope of empathy in VR”, which is all about making the user feel an immense connection with the subject matter, such as putting the viewer in a refugee camp in VR. While this undoubtedly has a strong purpose, Petridis is keen to counter this ‘empathy’ aspect of VR with ‘delight’ – that is, producing immersive entertainment and telling stories that require VR as a medium.
As Digital Studio went to press, Petridis was at the Sundance Film Festival to show his latest project, a VR horror film called Eye for an Eye, a collaboration between Wevr, a Los Angeles-based company specialising in VR film, and Filmatics. The project came about after Petridis was approached by Wevr, which was keen to tell a whole story with a beginning, middle and end in VR, and asked Petridis to pitch an idea.
“I was tasked to tell a story by legitimising the VR as the medium for that story, that it could only have been told in Virtual Reality, not another medium,” Petridis says. “They sprinkled on the little prompt of ‘maybe a horror thing could be good’, so I said ‘OK, cool’.”
Petridis went away and thought about Wevr’s brief, the merits of VR, and horror as a film genre. He wrote a script for a short film that was tailored for VR. “So when it came to VR, I really wanted to think of something that should be VR because the story deserves VR,” he said.
Eye for an Eye follows the story of three teenagers in an LA coastal town who are looking for their lost friend. Without giving away any spoilers, the teenagers end up being reluctant participants in a séance led by a mysterious elderly lady called Henrietta. And the viewer – as one of the participants in the séance – is also forced to sit at the table for the experience. The film further taps into what Petridis terms the “horror of VR” by featuring eyes as a prominent theme. The ghost who is contacted through the séance lost his eyes in a stabbing attack and is looking for a new pair to replace them – a search that could well have the viewer ripping off their Samsung Gear headset! For Petridis, the theme of eyes also fitted well with the medium, as when watching a VR film, the viewer is effectively looking through somebody else’s eyes.
The film was appearing at the Sundance Film Festival as part of Wevr’s library of content as Digital Studio prepared to go to press.

Canon introduces 4K UHD broadcast lens series

Canon last month introduced its range of 4K UHD 2/3-inch mount BCTV lenses. Marking the introduction of the UHD-DIGISUPER series of broadcast lenses for 4K cameras employing 2/3-inch sensors, the two new lenses support a variety of applications ranging from sports telecasts in expansive stadiums to large-scale events and concert telecasts, enabling the capture of panoramic views of venues as well as high-impact close-up shots.

With the launch of the new UHD-DIGISUPER series, Canon will continue to satisfy the advanced needs of a wide range of industry users.

4K UHD Field Box Lenses
The UHD-DIGISUPER 86 and UHD-DIGISUPER 90 are high-magnification, long-focal-length field zoom lenses for 4K broadcast cameras. The 86x-zoom UHD-DIGISUPER 86 delivers a level of resolution that exceeds 4K, while the UHD-DIGISUPER 90 features an impressive 90x zoom ratio.

The 4K Premium UHD-DIGISUPER 86 features optimal lens positioning along with advanced levels of component and body assembly precision, helping to minimize the occurrence of aberrations to less than half that of conventional HDTV lenses. In addition to achieving a level of resolution surpassing that of 4K from the center to the peripheral areas of the image field, the lens employs a built-in 2x extender that allows users to instantly increase the zoom range two-fold—from 9.3–800 mm to 18.6–1600 mm—simply by pushing a button on the controller.

Even at its maximum focal length of 1600 mm at the telephoto end, the UHD-DIGISUPER 86 makes possible superior imaging expression with outstanding resolution performance exceeding that of 4K.
The UHD-DIGISUPER 90 realizes exceptional optical performance and features an impressive 90x zoom ratio and 810 mm focal length at the telephoto end. Even when using the built-in 2x extender, which increases the focal range from 9–810 mm to 18–1620 mm, the UHD-DIGISUPER 90 delivers exceptional resolving power supporting use with 4K broadcast cameras, even at its maximum focal length of 1620 mm at the telephoto end.

Both the UHD-DIGISUPER 86 and 90 employ lens coating technologies and internal lens-barrel designs that prevent reflections to minimize the occurrence of ghosting and flaring, enabling the capture of High-Dynamic Range (HDR) video, which has recently been growing in popularity. Both lenses can also be used to capture HDTV content.

Despite its impressive imaging performance, the UHD-DIGISUPER 86 realizes exceptional usability, measuring approximately 250.6 mm (w) x 255.5 mm (h) x 637.4 mm (l) and weighing only approximately 27.0 kg. The UHD-DIGISUPER 90, despite its high zoom ratio, measures approximately 250.6 mm (w) x 255.5 mm (h) x 610 mm (l) and weighs approximately 23.2 kg, achieving a body size and weight on a par with conventional HDTV broadcast field zoom lenses.

4K UHD Portable Zoom Lenses
With the expansion of its 4K broadcast portable lens lineup, which includes the CJ20ex7.8B and CJ12ex4.3B, Canon is responding to a diverse range of user needs.
Through optimal lens positioning and advanced levels of component and body assembly precision, both the CJ12ex4.3B and CJ20ex7.8B achieve high-quality 4K-resolution images from the center to the peripheral areas of the image field. Delivering superior color reproducibility for outstanding expressive power, the lenses contribute to the creation of compelling video brimming with realism.

Both the new CJ20ex7.8B and CJ12ex4.3B include a built-in 2x extender, which doubles the lens’s focal range while maintaining superior 4K optical performance.

Calling the shots through zLense Studio

At the BES Show 2016, zLense, a specialist provider of a standalone real-time depth-sensing and modelling platform to the film, broadcast and gaming industries, released ‘zLense Studio’, an innovative turnkey solution that enables the creation of 3D effects without the need for any additional special production environment. Using ‘zLense Studio’, TV studios of any size can now access affordable, cinema-quality on-air graphics capabilities for a modest investment.

‘zLense Studio’ handles complex production techniques, applying state-of-the-art visual environments that until now were unattainable without special studio set-ups. The breakthrough solution opens the door even for small and mid-sized broadcasters to take the production values of their studio-based or OB live transmissions to a new level.

“Operating in real 3D, ‘zLense Studio’ generates a Z composite with appropriate rendering engines to enable the creation of shots that cannot be achieved with traditional layer-keying solutions,” said Bruno P. Gyorgy, President of zLense. “In addition, the workflow becomes much simpler – the consequence of which is cost savings that create a level playing field when it comes to creating eye-catching visual effects that transform the viewing experience.”

“This game-changing development goes beyond simply increasing the resolution of the 2D picture and works with high-resolution 2K, 4K and beyond,” he continued. “This technology is revolutionary from both a technical and commercial perspective, delivering a much more straightforward workflow – in which virtual reality (VR) and augmented reality (AR) are the same – to make 3D graphics accessible to almost every player in the broadcasting industry.”

“Now even local and community TV stations will be able to serve up augmented reality (AR), interactive simulations and visualizations that elevate production values and usher in a new era of creativity that will delight audiences.”

The zLense platform is also able to integrate with other rendering engines, if needed, provided they are layer-based and can support the delivery of tracking and depth (Z) information.
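
To see why per-pixel depth matters, consider a minimal, purely illustrative sketch of a depth (Z) composite in Python with NumPy; the synthetic arrays below are assumptions for demonstration and have nothing to do with zLense’s actual implementation. At each pixel, whichever source is nearer to the camera wins, so a rendered graphic can pass both in front of and behind real objects, something a flat stack of keyed layers cannot express.

```python
import numpy as np

# Minimal illustration of a depth (Z) composite: for every pixel, whichever
# source is nearer to the camera wins, so rendered graphics can appear both
# in front of and behind real objects. All data here is synthetic and purely
# illustrative; this is not zLense's actual pipeline.
h, w = 4, 4
camera_rgb = np.zeros((h, w, 3)); camera_rgb[..., 0] = 1.0   # live footage (red)
camera_depth = np.full((h, w), 3.0)                          # metres from camera
cg_rgb = np.zeros((h, w, 3)); cg_rgb[..., 2] = 1.0           # rendered graphic (blue)
cg_depth = np.full((h, w), 5.0)                              # graphic starts behind the footage...
cg_depth[:, 2:] = 1.0                                        # ...but is nearer on the right half

nearer_is_cg = cg_depth < camera_depth                       # per-pixel occlusion test
composite = np.where(nearer_is_cg[..., None], cg_rgb, camera_rgb)

print(composite[0])  # left pixels keep the footage, right pixels show the graphic
```

A traditional keyer, by contrast, can only stack whole layers in a fixed front-to-back order, which is why depth data opens up shots that layer keying cannot reproduce.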

Putting high-quality real-time CGI into the hands of the production team, zLense’s solution requires no special studio modifications to enable augmented reality simulations and visualizations that result in graphics that are visually exciting and totally immersive for viewers.

Needed: An act to regulate kids’ content

Essential for broadcasters and content creators to develop content that balances education and entertainment
Digital Studio News Bureau

The right kind of education is more important than the Right to Education. Children make up almost one-third of India’s population, hence it is critical for broadcasters and content creators to develop content that adds value and balances education and entertainment. India, therefore, needs an Act that regulates and monitors content viewed by children, and that also provides parenting guidelines.

This was stated by media and entertainment industry veterans at a session on ‘India needs a kids content act’ during FICCI Frames 2016, held in Mumbai. Kids’ programming in India needs customization and original content, rather than just a replication of international content. An effective Act would ensure that India’s large child population gets the right kind of content and education through entertainment, while preserving Indian cultural values and ethos by striking the right balance between global content and the Indian context.

Filmmaker Subhash Ghai said that for children to grow into the real architects of the nation, it was essential that they received extensive and in-depth knowledge about their culture and values. He pointed out that the state of education in India was mediocre and that there was immense scope for many great Indian stories to be explored for creating content for kids.

M Srinivasan, Founder, GEAR Education, added that the personality of a child develops in the first 10 years, making it essential that children are exposed to quality content. Hence, some regulations were needed to ensure that kids received quality exposure. He added that a comprehensive guideline for parents was also needed, as the environment around a child shapes his or her character.

Rajiv Chilaka, Creator of Chhota Bheem, noted that there were 630 million kids below the age of 16 years in India and kids’ content creators needed to add value to their properties. He suggested that to reach kids in the rural areas, there was a need for a dedicated channel like Doordarshan Kids as these regions may not have access to cable TV.

Mukesh Khanna, Chairman, CFSI, said that children were watching non-Indian content because producers were scared of making products for kids, fearing they were bound to lose money. Therefore, a regulation was needed that could encourage producers to develop original content for kids.

Citing the example of the film Baahubali, Nishith Takia, Producer of Delhi Safari, said that Indian content can be successful. He observed that children today have lost respect for their own culture, and that there was a need to promote kids’ content rich in Indian values. He added that kids must be taught to value their heritage and should not be ashamed of their roots.

Nina Jaipuria, EVP & GM, Sonic & Nickelodeon India, Viacom18 Media, said that when Viacom ventured into the kids’ segment, 100% of its content was from overseas, but today almost 65% is Indian. She added that with digitization, sharper segmentation has taken place, which is helping to reach more kids as well.

The moderator of the session was Ashish Kulkarni, Chairman of FICCI’s Animation & Gaming Forum.

Journey Of An Epic’s Global Release

Vinita Bhatia learns how Qube Wire issued over 18,000 key delivery messages to release Kabali across 3,500 global cinema screens

‘Say ‘No’ to piracy; do not download the internet from Kabali’ – this was one of the many jokes doing the rounds on the internet before the movie opened to sold-out shows in India and abroad. While several companies in southern India gave employees a day off to watch it, some AirAsia flights even had actor Rajnikanth’s face painted on them. Such is the euphoria that the actor evokes amongst his fans.

With so much riding on the much-anticipated movie, it was important for its makers to ensure that Kabali was released globally in a uniform, scalable, cost-effective and secure manner. Director Pa Ranjith and producer Kalaippuli S Thanu therefore decided to rely on Real Image Media Technologies’ new innovation, Qube Wire, to distribute Digital Cinema Packages (DCPs) and issue over 18,000 Key Delivery Messages (KDMs) worldwide to over 3,500 screens across 2,400 sites. This mechanism manages the global theatrical distribution cycle from a single point with multi-party collaboration, mirroring the existing distribution agreements between the production studio, global distributor, regional distributors and other intermediaries.

Impressed by the technology, Kabali’s producer, Thanu, said, “I was amazed at Qube Wire’s ease of use and how well it handled the scale of this movie’s release. Kudos to the Qube team for this pioneering product, which is quite revolutionary.”

SECURE FILM CONTENT DISTRIBUTION

Evolving from another Qube Cinema product called KeySmith, Qube Wire simplifies and expands the original technology to enable producers and distributors to distribute their content globally using a stronger web interface with a simplified front end. To do this, the backend security had to be strengthened and a global database of screens built, work that a dedicated team undertook over more than two years.

Are you wondering what necessitated the creation of Qube Wire? Earlier, to distribute an Indian movie worldwide, the production company would enter into an agreement with a service provider like Real Image Media Technologies, which would digitally master the movie based on the producer’s instructions, and the rights to distribute it in specific territories within India or in various countries would be assigned to other companies. Each of these distribution rights holders would then approach Real Image for keys for their release theatres, and Real Image would manually verify the chain of rights and release keys if the requests were valid for the individual theatres. “The biggest disadvantage of this manual system was the time taken for the verification process and the lack of scalability, since a highly trustworthy internal team was needed at Real Image to verify and cross-check everything,” stated Senthil Kumar, co-founder of Qube Cinemas.

He further explained that Qube Wire’s patent-pending feature lets content owners pick specific territories where a Distribution Key (DK) works. A company can issue a DK to other companies within the Qube Wire ecosystem with restricted rights to issue DKs only to theatres within certain cities, states or countries, with the added ability to specify exclusions within a territory. Together with multi-level approvals, territorial restrictions allow one company to provide DKs to another company that mirror the contractual distribution terms between them.

“When a content owner issues a DK to another company, they can specify whether they need to approve each booking made by the other, or can ask to simply audit what the other company is doing. This continues to work even if the new company in turn issues a DK to another company. So when a content owner distributes internationally, they get to retain overall control of their movie while ceding the part that requires local knowledge – picking the actual theatres and creating a booking,” Jayendra Panchapakesan, Qube Wire’s co-founder, added.
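
The territorial logic Kumar and Panchapakesan describe can be pictured with a small, purely hypothetical sketch. The class, field and territory names below are illustrative assumptions rather than Qube Wire’s actual data model or API: a Distribution Key records which territories it covers, which sub-territories are excluded, and whether the issuer must approve each booking or simply audits it, and a KDM is generated only for theatres that fall inside that grant.

```python
# Hypothetical sketch of territory-restricted Distribution Keys (DKs).
# Class names, fields and territory labels are illustrative assumptions,
# not Qube Wire's actual data model or API.
from dataclasses import dataclass, field

@dataclass
class DistributionKey:
    issuer: str                                     # company that granted this DK
    holder: str                                     # company allowed to book theatres
    territories: set = field(default_factory=set)   # granted territories, e.g. cities
    exclusions: set = field(default_factory=set)    # carved-out sub-territories
    requires_approval: bool = True                  # issuer approves each booking, or only audits

    def covers(self, theatre_territory: str) -> bool:
        """A booking is allowed only inside the granted territories and outside any exclusions."""
        return (theatre_territory in self.territories
                and theatre_territory not in self.exclusions)

def issue_kdm(dk: DistributionKey, theatre_id: str, territory: str) -> str:
    """Generate a Key Delivery Message for one theatre if the DK permits it."""
    if not dk.covers(territory):
        raise PermissionError(f"{dk.holder} holds no rights for territory {territory}")
    status = "pending issuer approval" if dk.requires_approval else "issued (audit only)"
    return f"KDM for {theatre_id}: {status}"

# Example: a regional distributor granted three cities, with one excluded
dk = DistributionKey(issuer="GlobalDistributor", holder="RegionalDistributor",
                     territories={"Chennai", "Madurai", "Coimbatore"},
                     exclusions={"Coimbatore"})
print(issue_kdm(dk, "theatre-001", "Chennai"))   # allowed, awaits issuer approval
```

Because a holder can in turn issue further, narrower DKs to sub-distributors under the same rules, a chain of keys can mirror the contractual chain of rights at every level.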

WHO STANDS TO GAIN?
Since this service is cinema-centric, it is aimed at producers and filmmakers. However, it can be used by other content creators who want a secure and auditable distribution platform for their media assets. And the good news is that they can currently use it for free by registering on Qube Wire, where they can manage their company and user information. However, they will soon be charged based on the number of sites for which keys are generated or assets are delivered.

The most notable benefit for companies using Qube Wire is the simplicity it affords in executing a complex set of business rules – covering all critical agreements to enforce rights management within arbitrary territory definitions – while delivering a complete audit trail to the rights owner and maintaining a comprehensive database of worldwide theatre information. For exhibitors in particular, this affords reliable delivery methods for both content and authorization keys, ensuring theatre uptime and no loss of shows.

Besides exhibitors, Qube Wire also eases matters for the movie’s producer and distributor, as they no longer have to worry about the minutiae of distribution logistics. Once the movie is mastered, they can upload it to Qube Wire’s cloud network and assign someone to manage the keys. The process has been simplified, digitised and made more secure.

Perhaps on the film’s release day, these stakeholders could even consider going to theatres to enjoy it, without having to worry about discrepancies in its release across various screens – knowing that Qube Wire has that part covered. Of course, the anxiety about the movie’s performance will continue to haunt them, but that is part of the job!

Barco Escape to debut at Egypt’s Galaxy Festival City Cinemas

Barco announced a partnership with Egypt’s largest multiplex, Galaxy Festival City Cinemas, to bring the Barco Escape premium theater experience to Cairo – a first in the Middle East. The cinema exhibitor will premiere the three-screen, panoramic Escape format with Star Trek Beyond on 27th July, 2016.

Located at Cairo Festival City, Galaxy Festival City Cinemas is Egypt’s biggest cinema multiplex. “Galaxy Festival City Cinemas aspires to bring audiences the most technologically advanced movie-going experience, which is why they are proud to introduce Barco Escape to the region for the first time,” said Peter Khammas, CEO of CineTech. “Escape is what an engaging, immersive cinema experience really feels like – something you can’t get while watching a film at home.”

More immersive than ever

Fully DCI-compliant, Barco Escape utilizes three Barco digital cinema projectors to display the movie on three screens that span the theater’s front and side walls. It offers a panoramic canvas that captures the audience’s entire field of vision. Galaxy Festival City Cinemas is so enthusiastic about the concept that it is planning to make the first Egyptian movie for the Barco Escape format.

Since Barco Escape debuted in 2014 with 20th Century Fox’s movie ‘The Maze Runner’, Barco has been working with leading studios and filmmakers to create projects that offer audiences an unmatched theatre viewing experience. JJ Abrams’ Bad Robot Productions and Paramount recently revealed that ‘Star Trek Beyond’ will be re-mastered for the format for its summer 2016 debut. Director Scott Waugh (“Act of Valor,” “Need for Speed”) is currently directing the Josh Hartnett-led ‘6 Below,’ the first feature shot entirely for Barco Escape. Additionally, Barco has multi-picture deals with 20th Century Fox, Cross Creek Pictures, Fundamental Films, and Jerry Bruckheimer to create movies for the format.

Besides the set-up at Galaxy, Barco expects to have several more Escape-equipped theater installations completed in time for the Star Trek Beyond release in July 2016. The number of Escape installations will then gradually expand over the coming months across the United States, Europe, Mexico and China.