Generative AI’s evolving role in content production has significant implications for the creative industries. Its potential to create text, images and video indistinguishable from human-made work, and the data-scraping practices that make much of this possible, have sparked conversation across the industry in recent months.
At the same time, the UK government identified the creative industries as one of five priority sectors in its 2023 Spring Budget, and its Creative Industries Sector Vision describes the generation of intellectual property as “the engine behind the sector’s sustained growth”.
In August, the government backtracked on a proposal made by the Intellectual Property Office last year, which would have allowed AI developers free use of copyrighted content to train AI models. A report from the culture, media and sport committee said the initial proposal showed a “clear lack of understanding of the needs of the UK’s creative industries”.
Co-operative music festival and conference Beyond the Music, held in Manchester last month, hosted a session entitled Rage With the Machines: Taking Charge of the AI Evolution to discuss the challenges and opportunities posed by this technology within the music industry.
Speaking on the panel was the British Phonographic Industry’s Sophie Jones, who emphasised two key considerations when it comes to AI: “the two words for me are about intent and control.” Jones warned of human-made content being “taken for granted by people who are a long way away from the music industry, and just see music as an expendable commodity”.
Rachel Lyske, CEO of artist-led ‘meta-composition’ software company DAACI, said that, like any technology, AI can be “either here to enhance and empower us, or to replace.”
Lyske said that, as well as taking over mundane and time-consuming processes, AI can be used to do things that would otherwise be impossible, such as writing a personalised score for every player of a video game. “But the only way I can do that is with an empowering assistant tool to help me do what I do more of,” she added.
Lyske highlighted the need for creators to consider which parts of AI technology they want to bring into their workflow. However, this depends on workers maintaining a considerable degree of autonomy in the workplace, at a time when many are seeing opportunities shrink as technology advances.
A survey conducted among members of performing arts and entertainment trade union Equity found that 65% of performers saw AI as a threat to employment opportunities, rising to 93% for audio artists.
Liam Budd, Equity’s official for audio and new media, urged people to be mindful that “this is a profession delivered by professionals”. He added that though the opportunities presented by AI are exciting, “we also want to make sure that any musician, performer, dancer, whoever is working as a creator, is actually engaged under ethical terms and conditions on fair contracts that are negotiated by their trade union with the producers.”
Key takeaways from the Beyond the Music session included the need for education to ensure that all creators are capable and comfortable using new tools, and a focus on the ethical use of AI through copyright law and workers’ rights.
At the level of individual co-ops, some are proceeding with caution when it comes to adopting generative AI.
Creative Coop is a workers’ co-op that designs brands, websites and digital tools. It was established by creatives seeking an alternative to traditional agency or freelance work, one where they could have more say over their output and who they choose to work with.
So far, Creative Coop has experimented with using generative AI in a number of business areas, such as producing first drafts of company policies, which then had to be manually adjusted.
It has also experimented with using it as a code assistant, with mixed results, says technical director Alan Peart.
“While it wrote interesting snippets of code, and often came up with unexpected ideas for tackling a problem, none of the code it wrote actually worked, and when queried on this or asked to adjust, it would write entirely different code that also didn’t work,” he tells Co-op News.
“Maybe this will all improve over time but for now we would struggle to find an area of our business where we’re comfortable using AI – both from an ethical standpoint and because the quality of work isn’t good enough.”
Rowena Leanne, creative lead at the co-op, is also mindful of AI’s limitations.
“AI will always need a human to oversee production. AI machine learning systems are flawed just as the systems they learn from are flawed. Social issues we encounter in everyday life can be replicated and misrepresented in all forms of creativity – from portraying harmful stereotypes of people created via image generation tools, through to copy generated from popular yet uncited sources.”
For this reason, Leanne suggests that AI tools should be treated more as “the intern in need of guidance from its more knowledgeable peers” than infallible or perfect systems.
Like Lyske, Leanne draws a distinction between two different types of AI on offer to creatives: “timesaving and supplantation”.
“The former offers workers the opportunity to increase productivity of individual tasks within their job role,” she says.
“The latter aims to replace whole job roles and shorten the general production process. One is aimed at adding value to the individual worker, the other is about removing it. It’s the latter we should be cautious of in the first instance.”
This distinction is also useful when looking at AI’s role in the journalism industry, where an already collapsing business model is leading to shrinking newsrooms and the rise of quickly produced ‘churnalism’ over longer-term investigative work.
While AI offers significant opportunities for journalists to save time in their work, it also presents an option for the publication of completely AI-generated news, cutting out the human reporter entirely and increasing the risk of bias and inaccuracy.
Eliz Mizon of local media co-op Bristol Cable describes this as the “logical next step” for an industry that is becoming increasingly reliant on cost-cutting measures, with AI likely to take over from humans currently producing ‘churnalism’.
However, writes Mizon in the Cable, AI will find it more difficult to replace the kind of public interest and investigative journalism the Cable produces, as this work requires empathy and relies on human relationships.
Mizon questions whether a robot with the ability to produce an article based on the transcript of a council meeting would ever be capable of gaining the trust of council whistleblowers to reveal secrets that were never minuted in the first place.
“Again, this is a business model problem,” says Mizon, encouraging readers to become members of the Cable in order to support an alternative model based on collective ownership.
What both the multistakeholder model of the Bristol Cable and the worker co-op model of Creative Coop offer is an element of control. Members of the Cable are able to steer the direction of the publication through democratic participation in the co-op, meaning they would be able to directly scrutinise any widespread adoption of AI technology by the outlet.
Similarly, Peart says that if Creative Coop were to officially incorporate AI into its workflow, that “would definitely trigger a co-op discussion.”
Leanne echoes this, and points to the co-op’s wider community of socially conscious clients as a possible source of support should it decide to stick with more “handcrafted”, human-made processes.
Returning to Sophie Jones’s focus on intent and control, the co-operative model’s values and principles offer a route by which consumers and creators can collectively weigh up the pros and cons of any new technology; generative AI is simply the latest example.