Tech Companies Want Copyright Law Relaxed To Aid AI Adoption
Tech companies want Australia’s copyright law eased to give generative AI systems access to online information for free.
In its submission to the federal government’s “Supporting Safe and Responsible AI” inquiry, the Tech Council of Australia says Australia’s voice “risks being left out of a cornerstone new technology of the 21st century” unless the law is relaxed.
It’s a controversial stand, given that authors, musicians and photographers want firms to pay when their copyrighted works are used to train generative AI systems.
In July, the Australian Society of Authors told Books + Publishing that the large-scale scraping and exploitation of works without regard to authors’ and illustrators’ rights was “outrageously unfair”, and that technology companies may be profiting from the “intellectual and creative labour of creators without transparency”.
In the US, 8000 authors have signed a letter demanding payment, and multiple class actions seeking payment are under way. Getty Images is suing Stability AI, creator of an AI art generator, for breaches of copyright, The Verge reports.
But Meta’s president of global affairs, former UK deputy prime minister Nick Clegg, argues that training generative AI systems using copyrighted materials is lawful under “fair use” provisions of copyright law.
Japan meanwhile will not enforce copyright on data used to train AI models.
Some news organisations are seeking payments that would allow generative AI models to train on their news stories. In July, Associated Press announced it had licensed OpenAI to access its archive of news stories, while News Corporation says it is negotiating similar deals with big tech firms.
Others, such as The New York Times, CNN and Australia’s ABC, are reported to have blocked OpenAI’s GPTBot web crawler from accessing their content. The Guardian reports that The Canberra Times and The Newcastle Herald have also blocked the crawler.
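Blocking of this kind is generally done through a site’s robots.txt file, which compliant crawlers such as GPTBot consult before fetching pages. The minimal sketch below, using Python’s standard urllib.robotparser and a hypothetical example.com site, shows how a “User-agent: GPTBot / Disallow: /” rule would be read by a well-behaved crawler; the publishers named above have not detailed the exact directives they use.

```python
# Sketch: how a compliant crawler checks robots.txt before fetching a page.
# Assumes the site publishes the opt-out rule OpenAI documents for GPTBot:
#
#   User-agent: GPTBot
#   Disallow: /
#
# example.com is a placeholder; real publishers' robots.txt files may differ.

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # hypothetical site
parser.read()  # fetch and parse the live robots.txt

# A crawler identifying itself as GPTBot asks whether it may fetch a story.
allowed = parser.can_fetch("GPTBot", "https://example.com/news/some-story")
print("GPTBot may crawl this page:", allowed)  # False if "Disallow: /" is set
```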
Tech Council members include Google, Adobe, Accenture, Salesforce, Microsoft and IBM along with a swag of Australian firms including Atlassian, Canva, Telstra, Optus, Carsales and Foxtel Group.
The council’s submission says continuing with current copyright law would disadvantage Australia.
“We note that Australia’s current copyright regime is amongst the most stringent in the world … and is outdated in many regards,” the submission says.
“It is also an area that lacks clarity around its application to AI, particularly generative AI, and machine learning systems.
“This may limit our potential to design, develop, and train AI/ML models locally, while also potentially posing barriers to inward AI investment and adoption.
“It may also pose barriers to the adoption of Australian culture, language and values being embedded in generative AI systems – risking Australia’s voice being left out of a cornerstone new technology of the 21st century.”
The submission calls for “more detailed consideration and consultation” around how “our copyright framework … should apply to AI/ML systems and whether reforms are required”.
Technology research firm Telsyte, which is researching the impact of AI on Australian work and life, says content owners should be compensated if their data is used to train models.
“AI training systems should compensate content owners by utilising a small fraction of their AI infrastructure budgets,” says managing director Foad Fadaghi, speaking at this year’s Tech Leaders’ Conference in Pokolbin.
“They should treat it as part of the cost of training a model up to a certain standard.”
Mr Fadaghi said it didn’t make sense for content owners to limit their reach and discoverability by opting out, particularly if their content didn’t sit behind a paywall.
“However content owners such as artists might be concerned that AI could copy their individual style and techniques and distribute works to their financial detriment.”
Mr Fadaghi says it will take wider action by content owners to change big tech’s approach to these payments.
The Tech Council submission draws a distinction between content used to train an AI model versus content that forms part of the outputs of a generative AI model.
The submission also argues against Australia adopting a specific AI Act, as is occurring in Europe, saying Australia should instead retain its existing legal regime.
“Diverting from this model, which has worked well for decades, would risk producing laws and rules that are quickly outdated, that hinder regulators’ ability to respond to dynamic or domain specific developments … and that are confusing for businesses and consumers accustomed to the current regulatory model and regulators.
“Extra layers of technology-focused regulation on top of our existing technology-neutral laws could drive capability and investment offshore, particularly if combined with even more restrictive laws in key policy areas like copyright/IP.”
The submission backs a government target of 1.2 million tech jobs by 2030, upskilling the Australian workforce, increasing investment in AI research and AI start-ups, and increasing digital literacy and responsible AI awareness.
Chris Griffith is attending the Tech Leaders’ conference at Pokolbin courtesy of MediaConnect.