When Kathleen Kennedy addressed the role of artificial intelligence in filmmaking at the MPSE Golden Reel Awards, she positioned herself firmly in the “human creativity first” camp, calling for clear boundaries on AI’s role in the production process. The Variety interview was widely reported as a straightforward declaration of principle. What was not reported is the subtext: Kennedy is not primarily afraid that AI will produce bad films. She is afraid that AI will expose how much of Hollywood’s economic model depends on gatekeeping labor that AI can now partially replicate.
What the Experts Are Deliberately Not Saying
The expert class in Hollywood — showrunners, producers, studio executives — has developed a careful public vocabulary for discussing AI risk. Terms like “human creativity,” “authentic storytelling,” and “artist protection” dominate the discourse. What these terms tend to obscure is the economic dimension: who benefits financially from the labor that AI threatens to automate, and who is positioned to capture the value that AI will generate as it becomes embedded in production pipelines.
Kennedy’s call for “AI boundaries” comes from the perspective of a studio executive who built her career managing the labor of writers, directors, animators, and composers — and whose authority derives substantially from standing between creative labor and the commercial enterprise that funds it. AI-generated scripts, storyboards, concept art, and music do not simply threaten jobs. They threaten the organizational infrastructure that has historically concentrated creative power (and compensation) in a small number of institutional gatekeepers.
The 2023 Strikes Showed the Real Battle Lines
The 2023 WGA and SAG-AFTRA strikes placed AI at the center of their contract demands — not because writers and actors were philosophically opposed to technology, but because they understood that unregulated AI integration would let studios renegotiate labor agreements from a fundamentally different power position. The strikes secured some protections. They did not resolve the underlying dynamic: studios are actively exploring AI applications that reduce their dependence on union labor, while simultaneously issuing public statements about “human creativity” designed to manage the optics of that exploration.
Kennedy’s statement fits neatly into this category: principled-sounding, strategically ambiguous, and carefully positioned to avoid any specific commitment. “Human creativity must remain paramount” is a sentence that means nothing without specifying which humans, in which roles, protected by what contractual mechanism.
What This Actually Means
The film industry’s AI conversation is being conducted almost entirely in the register of aesthetics — will AI produce authentic art? — when the actual conflict is economic. Who controls AI-generated content? Who receives residuals from it? Who is displaced and who is insulated? Kennedy’s MPSE intervention is a data point in that conflict, not a contribution to an aesthetic debate. Until Hollywood executives are willing to have the economic conversation explicitly rather than laundering it through appeals to artistic integrity, their AI boundary talk will remain strategically meaningless.
Background
The 2023 WGA strike lasted 148 days, making it the second-longest writers’ strike in Hollywood history, behind only the 153-day strike of 1988. AI regulation was a central demand, with final agreements establishing disclosure and payment requirements for AI-generated material — though enforcement mechanisms remain contested.