How should the government respond to AI?
With AI technology advancing rapidly, new and emerging screenwriters may wonder whether there will be a place for human screenwriters in the near future. Much of that depends on whether the government sets ethical guidelines for the use of AI.
What concerns need to be addressed?
Language models like ChatGPT are trained to imitate human speech and different styles of writing, generating convincing facsimiles of almost any type of content. ChatGPT has been lauded for its capabilities as an idea generator. Professional writer and blogger Christopher Kokoski said on his writing blog that the app can generate outlines and even entire scenes for potential scripts.
It can generate characters, scenes and dialogue that help you get started on a fleshed-out story; all you have to do is give ChatGPT a prompt, such as a genre and a script structure. This faster turnaround on major parts of the scriptwriting process understandably raises concerns about how much human screenwriters would be owed for work co-authored by AI.
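For readers curious about what “giving ChatGPT a prompt” looks like in practice, here is a minimal sketch using OpenAI’s Python SDK, which exposes the same family of models that power the ChatGPT app. The model name and example prompt are illustrative assumptions, not drawn from Kokoski’s blog or from this article.

```python
# Minimal sketch: asking a language model for a script outline.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name and prompt are placeholders chosen for illustration.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a three-act outline for a feature-length noir thriller "
    "set in Melbourne, with two lead characters and a twist ending."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": prompt}],
)

# Print the generated outline returned by the model
print(response.choices[0].message.content)
```

A single short prompt like this is enough to return structural material a writer could build on, which is exactly why questions of credit and compensation arise.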
Marc Guggenheim, a film and TV writer with several credits, is concerned that AI will adopt the biases of its programmers. The Guardian quoted him as saying, “If it’s going to tell stories from the perspective of its programmers, essentially, then you have to worry about the lack of diversity among the programmers.”
The Federal Government established the National AI Centre in 2021 to develop guardrails for adopting AI across all industries. The centre has since joined forces with industry partners to create the Responsible AI Network, which promotes the responsible use of AI in Australia.
Possible solutions to the problems with AI
The Labor government announced in May that it would take the first steps toward regulating AI, releasing a detailed discussion paper and opening a public consultation on the benefits and risks of AI systems.
With the United States government already developing a regulatory framework for ethical AI use, looking at what other countries have proposed could provide a foundation for protecting the employment of human screenwriters.
For an example of how the government could stop AI from eroding human screenwriters’ wages, look to the Writers Guild of America. On 27 September, the WGA ended its strike against American production companies with a tentative agreement stating that an AI-written script is not copyrightable unless it is expanded upon or revised by a human.
Copyright would require crediting the human writer of a script, so the writer would be entitled to compensation for their work. Since AI is currently better at providing structures or individual scenes than full-length scripts, human writers would inevitably be involved in some way or another.
As for addressing the potential for bias in AI-written scripts, a review board could be an effective solution. In its 2019 discussion paper on an AI ethics framework, CSIRO proposed review processes to ensure that AI programs developed overseas adhere to Australian ethics and legislation.
A review board would serve screenwriters well: it could reflect the diverse voices of Australian screenwriters while letting producers and studios confirm that AI tools meet their ethical guidelines.
However, establishing such a review panel would be costly while AI is developing so rapidly.
Is the government doing enough?
Labor MP Julian Hill has called for ethicists and philosophers to be brought onto a consultative body, rather than only technical and industry experts. Hill himself used ChatGPT to co-write a speech discussing the potential risks of AI technology.
The section of the speech written by ChatGPT accurately addresses the potential for “job loss” and for AI to “perpetuate existing biases and discrimination”, concerns Hill himself shares. The full speech has been posted to Julian Hill’s YouTube channel; the ChatGPT-authored content runs from 2:22 to 2:40.
Video: Julian Hill MP delivers his speech co-written by ChatGPT, uploaded to YouTube by Julian Hill MP.
As generative AI’s imitative capabilities continue to evolve, it will become harder to distinguish human contributions from AI contributions in an AI-generated script. Even if a human screenwriter claimed copyright by way of editing, producers could point to the AI’s involvement to justify buying the script for a lower-than-average upfront fee.
This would be a particular problem for first-time screenwriters, who don’t have years of experience to justify higher pay or the negotiating expertise to secure a contract that ensures they retain intellectual property rights.
When the government is ready to finalise legislation on AI governance, it should include a board that oversees how industries manage AI. That board could establish ethical and legal boundaries for the use of AI, which production companies and writers’ guilds could then develop into industry guidelines. Production companies mustn’t abandon human screenwriters, lest our stories lose their souls.