Hollywood Filmmaking in 2030 - According to the Studios

Released Friday, 8th September 2023
Episode Transcript

Back in 2019, Hollywood studios told us how they wanna make movies for you in 2030. What, you didn't get the memo? Not to fear, I've got you covered. Next on this episode of 5 THINGS.

Hello and welcome my friends to another episode of 5 THINGS - a series dedicated to answering the 5 burning tech questions that you have about technologies and workflows in the media and technology realm. Plus, tech stuff I dig, and how it's used. I'm still your host Michael Kammes.

In 2019, MovieLabs, a not-for-profit group founded by the 5 major Hollywood studios, released a whitepaper called "The 2030 Vision". This paper detailed the 10 Principles that each of the studios agrees on as the way they want to usher in the next generation of Hollywood content creation. What are these 10 principles, why should you care, and who the heck is MovieLabs?

Let's start there. Paramount Pictures, Sony Pictures Entertainment, Universal Studios, Walt Disney Pictures and Television, and Warner Bros. Entertainment are part of a joint research and development effort called MovieLabs. As an independent, not-for-profit organization, MovieLabs was tasked with defining the future workflows of the member studios and developing strategies to get there. This led to their authoring of the "2030 Vision Paper". As MovieLabs is also charged with evangelizing these 2030 goals, they've become the Pied Pipers for the future media workflows of the Hollywood studio system.

Now, what most folks don't realize is that the vast majority of Hollywood is risk averse. Changing existing, predictable (and budgetable) workflows doesn't happen until a strong case can be made for significant time and cost savings. Without a defined goal to work towards, studio productions would most likely continue on the path they've been on, iterating only incrementally. So, defining these goals and getting buy-in from each studio meant that there would be a joint effort to realize this 2030 vision. It also gave all the companies who provide technology to the studios a broad roadmap to develop against. Essentially, everyone sees the blinking neon sign of the 2030 Vision Paper, and everyone is making their way through the fog to get to that sign. The 2030 Vision Paper is directly influencing the way hundreds of millions of dollars are - and will be - spent.

So, what exactly does this manifesto say? The 10 core principles of the MovieLabs 2030 Vision Paper can be organized into 3 broad categories: a New Cloud Foundation, Security & Access, and Software-Defined Workflows.

The first half of these 10 principles relates directly to the cloud. Starting with Principle #1: "All assets are created or ingested straight into the cloud and do not need to be moved." This means that any audio and video being captured goes directly to the cloud - and stays there. And it's not just captured content, it's also supporting files like scripts and production notes. Captured content can either be beamed directly to the cloud or saved locally and THEN immediately sent to the cloud. This media can be a mix of camera- or DIT-generated proxies and high-resolution camera originals, plus any audio assets.

As of now, an overwhelming majority of productions are saving all content locally. Multiple copies and versions are then made from these local camera originals. Then, selected content is moved to the cloud. The cloud is usually not the first, nor the primary, repository for storage.
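To make Principle #1 a little more concrete, here's a minimal sketch of what a camera-to-cloud ingest step might look like, assuming an AWS S3 bucket and the boto3 SDK. The bucket name, paths, and metadata fields are hypothetical examples, not anything the 2030 Vision Paper prescribes.

```python
# Minimal camera-to-cloud ingest sketch (assumes AWS S3 + boto3).
# Bucket name, paths, and metadata fields are hypothetical examples.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "studio-production-originals"  # hypothetical production bucket

def ingest_take(local_file: str, production: str, scene: str, take: str) -> str:
    """Upload a camera original (or proxy) straight to cloud storage,
    tagging it with basic production metadata so it never needs to move again."""
    key = f"{production}/{scene}/{Path(local_file).name}"
    s3.upload_file(
        Filename=local_file,
        Bucket=BUCKET,
        Key=key,
        ExtraArgs={"Metadata": {"scene": scene, "take": take}},
    )
    return f"s3://{BUCKET}/{key}"

# Example: a DIT cart pushes a proxy the moment it's generated.
# cloud_uri = ingest_take("A001_C002_0908.mov", "ff27", "sc042", "take03")
```

The point of the sketch is the workflow shape: capture, push once, and from then on everything downstream references the cloud copy rather than shipping drives around.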
Having everything in the cloud is done for one big reason: if all the assets - from production to post production - are in a place that anyone in the world with permission can access, then we no longer have islands of the same media replicated in multiple places, whether it be on multiple cloud storage pools or sitting on storage at some facility. We'd save a metric [bleep]-ton of time lost by copying files, waiting for files to be sent to us, and the cost to each facility for storing everything locally.

Of course, having the media in the cloud does present some obvious challenges. How do we edit, color grade, perform high-res VFX, or mix audio with content sitting in the cloud? That's where Principle #2 - "Applications come to the media" - comes into play. Most local software applications for professional creatives are built on the concept that the media assets are local, either on a hard drive connected to your computer or on some kind of network shared storage. Accessing the cloud for all media introduces increased lag and reduced bandwidth that most applications are just not built for.

Despite this, one of the many tech questions answered by the pandemic was "what is the viability of creatively manipulating content when you don't have the footage locally?" The answer is that in many cases, it's totally doable. Companies like LucidLink thrived by enabling cloud storage with media to be used and shared by remote creatives in different locations. LucidLink was the glue that enabled applications to come to the media, and not the other way around. Now, in most cases we still need to use proxies, rather than the high-res files, for most software tools. Editing, iterating, and viewing high-res material that's sitting in the cloud is still difficult unless you have a tailored setup. Luckily, we still have 7 years before 2030 - and advancements on this front are constantly evolving.

Working with everything while it's in the cloud also means that when it's time to release your blockbuster, you can simply point to the finished versions in the cloud - no need to upload new versions. That's where Principle #3 comes in: "Propagation and distribution of assets is a 'publish' function". This means no more waiting for the changes you've made on a local machine to export and then re-upload, then generate new links and metadata, as well as the laundry list of other things that can go wrong during versioning. The cloud will process media faster in most cases, as you can effectively "edit in place". The promise is time savings and less room for error.

Moving on, Principle #4 gets a bit wordy, but it's necessary: "Archives are deep libraries with access policies matching speed, availability, and security to the economics of the cloud." Archives generally house content that is saved for long-term preservation or for disaster recovery, and should rarely, if ever, be accessed. This is because archived material is typically saved on storage that is cheaper per terabyte, but with the caveat that restoring any of that archived content will take time, and thus money. This philosophy is as true for on-premises storage as it is for cloud storage. It can simply be cost prohibitive to have massive amounts of content on fast storage when the access to it is infrequent.
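As one illustration of that economics trade-off, here's a minimal sketch of a cloud archive tiering rule, assuming an AWS S3 bucket managed with boto3. The bucket name, prefix, and day counts are hypothetical; a real production would tune them to its own access patterns.

```python
# Archive tiering sketch (assumes AWS S3 + boto3).
# Bucket name, prefix, and day counts are hypothetical examples.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="studio-production-originals",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-finished-masters",
                "Filter": {"Prefix": "masters/"},
                "Status": "Enabled",
                "Transitions": [
                    # Rarely touched after delivery: move to infrequent access...
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # ...then to deep archive: cheap per terabyte, slow to restore.
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```

The cheaper the tier, the longer and costlier the restore, which is exactly the balance Principle #4 asks archives to set deliberately.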
So, Principle #4 is saying that not everyone uses archives the same way, and if you do need to access that content in the cloud somewhere down the road, your archive should be set up in a way that makes financial sense for the frequency of access, while also ensuring that secure user permissions are granted to those who understand this balance.

Principle #4 also dovetails nicely into a major consideration with archived assets. Ya know, making more money! Sooner or later, an executive is gonna get the great-and-never-thought-of-before idea to re-monetize archived assets. Whether it's a sequel, reboot, or super-duper special anniversary edition, you're going to need access to that archived content. Principle #5 - "Preservation of digital assets includes the future means to access and edit them" - provisions for this. Having content in a single location that you're allowed to see is one thing, but how to accurately search for and then retrieve what you need is, for now, a challenging problem.

Let's take this one step further: years or even decades may have passed since the content was archived, and modern software may not be able to read the data properly. Some camera RAW formats, as the paper points out, are proprietary to each camera manufacturer and need to be debayered using their proprietary algorithms. So, do we convert the files to something more universal (for now) or simply archive the software that can debayer the files? We simply don't have a way to 100% futureproof every bit of content. What I'm getting at here is that the content has to be fully accessible and actionable - all in the cloud.

OK, great, stuff is in the cloud, but how do I make sure that only the right people have access to it? Slow down, take a breath, as Security and Access is one of the 3 areas of attention in the 2030 Vision Paper. Starting with Principle #6: "Every individual on a project is identified and verified, and their access permissions are efficiently and consistently managed." Peeling the onion layers back, this will require the industry to devise and implement a mechanism to identify and validate every person who has access to an asset on a production, which we'll call a "Production User ID". This would be for all creatives, executives, and anyone else involved in the production. This "Production User ID" would allow that verified person to access or edit specific assets in the cloud for that production. This also paves the way for permissions and rights that are based on the timing of the project.

As the 2030 Vision Paper calls out, "A colorist may not need access to VFX assets but does need final composited frames; a dubbing artist may need two weeks' access to the final English master and the script but does not need access to the final audio stems; and so on." As Hollywood frequently employs freelancers - and executives change jobs - this also allows for users to have their access revoked when their time on the production is over. In theory, tying this to a "Production User ID" means all permissions for any production you work on are administered to your single ID. You wouldn't have to remember even more usernames and passwords, which should make all of you support and I.T. folks watching a bit happier.

As you can imagine, accessing the content doesn't mean it's 100% protected while you're using it.
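To picture how a time-boxed, per-production permission like the dubbing-artist example might be modeled, here's a minimal sketch in Python. The record fields and the check are hypothetical illustrations, not anything MovieLabs has specified.

```python
# Hypothetical sketch of a time-boxed "Production User ID" grant.
# Field names and the check below are illustrative, not a MovieLabs spec.
from dataclasses import dataclass
from datetime import date

@dataclass
class AccessGrant:
    production_user_id: str   # one identity per person, across productions
    production: str           # which production the grant applies to
    asset_scope: str          # e.g. "final-english-master", "vfx-comps"
    can_edit: bool
    starts: date
    ends: date                # access lapses automatically after this date

def is_allowed(grant: AccessGrant, asset_scope: str, on: date, edit: bool) -> bool:
    """A grant is valid only for its asset scope, its time window,
    and (optionally) edit rights - mirroring the dubbing-artist example."""
    return (
        grant.asset_scope == asset_scope
        and grant.starts <= on <= grant.ends
        and (grant.can_edit or not edit)
    )

# Example: a dubbing artist gets two weeks of read-only access to the master.
grant = AccessGrant("puid:ana-77", "ff27", "final-english-master",
                    can_edit=False, starts=date(2030, 3, 1), ends=date(2030, 3, 14))
assert is_allowed(grant, "final-english-master", date(2030, 3, 5), edit=False)
assert not is_allowed(grant, "final-audio-stems", date(2030, 3, 5), edit=False)
```

The upside of modeling it this way is that revoking a freelancer's access at wrap is just letting the window expire, rather than chasing down accounts across a dozen systems.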
The 2030 Vision Principle #7 states that "All media creation happens in a highly secure environment that adapts rapidly to changing threats." This is where I hope a biometric approach to security takes off sooner rather than later, as the addition of new security layers on top of aging precautionary measures is often incredibly frustrating. Multiple verification systems also increase the chance of systems not working properly with one another, which is a headache for everyone…and your IT folk. MovieLabs suggests "security by design," which means designing systems where security is a foundational component of system design - not a bolt-on after the fact. There is also the expectation that safeguards will be deployed to handle predictive threat detection, and that hopefully this will negate the need for 3rd party security audits, which are a massive headache and time suck. The security model also assumes that any user at any time could be compromised. Yes, this includes you, executives. You are not above the law. This philosophy is also known as "Zero Trust", and if you had my parents, you'd totally understand the concept of "zero trust".

The last Principle in the "Security and Access" section, Principle #8, is that "Individual media elements are referenced, accessed, tracked, and interrelated using a universal linking system". If you've done any kind of work with proxies, or managed multiple versions of a file, then you've dealt with the nightmare that is relinking files. Changes in naming conventions, timecode, audio channels, and even metadata can cause your creative application du jour to throw "Media Offline" gang signs. And this isn't just for media files. Supporting documents like scripts, project files, or sidecar metadata files can frequently have multiple versions. This means that in addition to needing a "Production User ID" per person, we also need a unique way to identify every single media asset on a production; and every user with permission needs to be able to relink to that asset inside the application they're using. That's a pretty tall order.

The upside is that you shouldn't have to send files to other creatives - you simply send a link to the media, or even just to the project file, which already links to the media. The goal is also to have this universal, relinking identifier functioning across multiple cloud providers. The creative application would then work in the background with the various cloud providers to use the most optimized version of the media for where and what the creative is doing. Oh yeah, all of this should be completely transparent to the user.

The last 2 principles are centered around using software to define our production and post production workflows. The first, Principle #9, is that "Media workflows are non-destructive and dynamically created using common interfaces, underlying data formats and metadata." This Principle covers two huge areas: a common "ontology" - that is, a unified set of terminology and metadata - as well as a common industry API. Consider this: you've been hired to work on The Fast and Furious 27, and there is, unpredictably, a flashback sequence. You need access to all media from the 26 previous movies to find clips of Vin Diesel, err, Dom, sitting in a specific car at night, smiling. Have the 26 previous movies been indexed second by second, so you can search for that exact circumstance? How do you filter search results to see if he's sitting in a car or standing next to it?
How do you filter out whether the car is in pristine shape or riddled with bullet holes after a high-speed shootout with a fighter jet? As an industry, we don't have "connected ontologies," meaning we can't even agree that the term for something in a production sense is the exact same thing in a CGI sense. If we can't even agree on unified terms for things, how can we label them so you can then search for them while working on the obvious Oscar bait Fast and Furious 27?

The second ask is a common industry API, or "Application Programming Interface", that all creative software applications use. This allows software to be built in such a way that it can be slotted into modular workflows. This is meant to combat compatibility issues between legacy tools and emerging workflows…and reduce downtime due to siloed tech solutions. While each modular software solution can be specialized, there will also be a minimum set of data, metadata, and format support that other software modules in the production workflow can understand. This also means that because of this base level of compatibility, all creative functions will be non-destructive. Wait, you may be thinking…how in the world is this accomplished? All changes made during the production and post production process will be saved as metadata. This metadata can be applied against the original camera files. This provides not only the ultimate in media fidelity, but also the ability to peel back the metadata layers at any time to iterate.

We wrap up the 10 Principles with Principle #10: "Workflows are designed around real-time iteration and feedback." I get this, right? No one likes to wait. When and if Principle #10 is realized, it means no more waiting for rendering, whether it be for on-set live visual effects, game engines, or even CGI in post. The processing in the cloud will negate the need to wait for renders. Feedback and iteration can be done in a significantly shorter amount of time. Creative decisions on set can be made in the moment rather than based on renders weeks or months later. The 2030 Vision Paper also suggests this would limit post-production time - as well as budget - as more of each would be shifted towards visualizations in pre-photography.

Yes…ish? There are a ton of moving parts - with both business and technology partners - so MovieLabs annually checks the industry's progress towards the 2030 vision and provides a gap analysis of the current state of the industry and the work that remains. Without question, our industry has made advancements in areas like camera-to-cloud capture, VFX turnovers, real-time rendering, and creative collaboration tools that are all cloud native. The Hollywood Professional Association - or HPA - Tech Retreat this year, which has become the de facto must-attend industry conference, showcased many Hollywood projects and technology partners, like AWS, Skywalker, and Disney Marvel.

The 10 principles need a foundation to build upon, and that means getting this cloud thing handled first. And that's just what we saw. Principles 1, 2, and 3 were the predominant principles that large productions attempted to tackle first, with a smattering of "Software-Defined Workflow" progress. However, these case studies also reflected where we do need to make more progress, as we are already 4 years into this 2030 vision. These gaps include interoperability gaps, where tasks, teams, and organizations still rely on manual handoffs that can lead to errors and inefficiencies.
Custom point-to-point implementations dominate, making integrations complex. Open, interoperable formats and data models are lacking. Standard interfaces for workflow control and automation are absent. Metadata maintenance is inconsistent, and common metadata exchange is missing.

We also have gaps in operational support, where workflows are complex and often span multiple organizations, systems, or platforms. There's a gap between understanding cloud-based IT infrastructures and media workflows. Not all files are created equal, and media needs specialized expertise beyond traditional IT. Support models need to match the industry's unique requirements, considering its global and round-the-clock nature.

And then there's change management: we do run into fundamental problems with a move of this scale. This technology is new, constantly evolving, and it's only being spearheaded by the studios and supporting technology partners. This means few creatives have actually tried and deployed new "2030 workflows." Managing this change involves educating stakeholders, involving creatives earlier, demonstrating the 2030 vision's value, and measuring its benefits.

I'm sure you have some input on one or more of these 5 THINGS. Let me know in the comments section. Also, please subscribe and share this tech goodness with the rest of your techie friends. 5 THINGS is also available as a podcast, so search for it on your podcast platform du jour. Look for the red logo!

Until the next episode: learn more, do more - thanks for watching.
