Amazon, Meta And Google “Exploiting” Oz Users’ Data To Train AI
Big tech companies including Amazon, Google and Meta have been severely criticised by a Senate select committee inquiry over how they use Australian user data to train their AI models.
Labor senator Tony Sheldon, the inquiry’s chair, took to X to announce the release of the report, saying that the three big companies were “exploiting Australians’ data, culture, and creativity while dodging accountability.”
He singled out each of them. “Amazon refused to disclose how it uses data recorded from Alexa devices, or published on Kindle or Audible, to train its AI, saying it ‘doesn’t disclose specific sources of our training data.’ Translation: they won’t admit to what we already know.”
He then challenged Google’s justification for training on copyrighted Australian works. “Google defended its use of copyrighted works by saying excluding them would lead to ‘bias or ignorance.’ What about respecting the rights of Aussie creators? A pathetic excuse for exploiting creators’ rights,” said Sheldon.
During the inquiry, Facebook parent company Meta admitted that its millions of Australian users cannot opt out of having data from their public posts used to train the company’s artificial intelligence models. “Meta admitted it scraped every Aussie Facebook and Instagram profile back to 2007. When asked how you could consent in 2007 to tech that wouldn’t exist for another decade, they said: ‘I can’t speak to what people did or did not know.’ What absolute rubbish,” added Sheldon.
The report found that some general-purpose AI models – such as OpenAI’s GPT, Meta’s Llama and Google’s Gemini – should automatically default to a “high risk” category and be subject to mandated transparency and accountability requirements.
Among the report’s key recommendations is that the government introduce legislation to regulate high-risk uses of AI.
The report also recommended that the government require developers of AI products to be transparent about the use of copyrighted works in their training datasets, and that the use of such works be appropriately licensed and paid for.
“These tech giants aren’t pioneers; they’re pirates – pillaging our culture, data, and creativity for their gain while leaving Australians empty-handed,” said Sheldon.
The report called for the government to be “managing” the growth of AI infrastructure in Australia to ensure that the growth delivers value for Australians and “is in the national interest”.
However, there were dissenting voices within the committee over the scope of the reforms required.
Two Coalition members of the committee, senators Linda Reynolds and James McGrath, said that while AI posed a greater threat to Australia’s cybersecurity, national security and democratic institutions than to the creative economy, the mechanisms to manage it must be implemented “without infringing on the potential opportunities that AI presents in relation to job creation and productivity growth”.
They did not accept the report’s conclusion that all uses of AI by “people at work” should automatically be categorised as “high-risk”.