As Apple Basks In New Product Afterglow, Parents Demand Change
A survey finding that one in three children had either interacted with a stranger with harmful intentions or been exposed to nude images or pornography through a downloaded app has driven parents and caregivers to demand that Apple boss Tim Cook instigate change.
As Apple basks in the afterglow of its much-anticipated launch today in Cupertino, California, of new products including the iPhone 16 and Apple Watch Series 10, the survey has led to a petition from thousands of people who are fed up with kids unwittingly gaining access to unsuitable material, or being approached by strangers.
In a petition by US advocacy groups Parents Together and Heat Initiative, more than 7,800 parents, caregivers and other individuals urged Cook to “implement independent, third-party review and verification of app age ratings in the Apple App Store”, per the New York Post.
And although the story originated in the US, it is replicated around the globe, as those responsible for children grapple with ways to protect them from corporate greed and people with nefarious intentions. App stores know few boundaries, so it doesn’t matter where you are.
The survey “found that parents are very concerned that devices like tablets and smartphones pose a risk to children due to potential exposure to inappropriate content and unsafe interactions with peers, adults, and strangers online – and they shared the negative experiences their own children have had”.
“Parents overwhelmingly want to see smartphone and tablet manufacturers and social media platforms invest more to protect children from harmful content, applications, and interactions on their devices.”
The poll surveyed 1,007 parents of kids in kindergarten through 12th grade with access to a smartphone or tablet. It was conducted by Bellwether Research from August 17 to 23, 2024, via text and phone questionnaires.
It found that one in three parents or caregivers reported that their child “has at least one negative experience through the app store on their device: 1-in-3 either interacted with a stranger with harmful intentions or were exposed to nude images or pornography through a downloaded app”.
“Nearly 1-in-3 received an ad for an adult game or app. A significant number (28%) downloaded apps not approved for their age. Nearly 4-in-10 either spent excessive money on in-app purchases or fell victim to scams.”
One parent of a child with an Apple iPhone said: “Some apps simply aren’t what they claim to be. Both I and my children have downloaded apps that we thought were one thing that they were simply disguised, and they ended up being pornographic, adult themed, or spam.”
The National Center on Sexual Exploitation, a US non-profit, has been pushing Apple to step up its efforts to remove dangerous or exploitative materials, and said recently it had a victory when four “nudifying” apps were removed by Apple from its store.
The apps – one of which was rated age 4+ – were being used to bully children by making genuine photos or videos pornographic with AI.
“The undress websites operate as businesses, often running in the shadows – proactively providing very few details about who owns them or how they operate,” said Wired in August. “Websites run by the same people often look similar and use nearly identical terms and conditions. Some offer more than a dozen different languages, demonstrating the worldwide nature of the problem. Some Telegram channels linked to the websites have tens of thousands of members each.”
In April Apple published a statement: “In Australia, you can report unlawful or harmful content that you find on any of Apple’s products or services, or any breach of online safety codes.
“Unlawful or harmful material includes illegal and restricted online content, and other content that violates Apple’s terms of use.
“Illegal and restricted online content refers to online content that includes the most seriously harmful content, such as images and videos showing the sexual abuse of children or acts of terrorism, as well as content that should not be accessed by children, such as simulated sexual activity, detailed nudity or high impact violence.
“Other types of content that you can report include:
- Cyberbullying targeted at a child or cyber abuse targeted at an adult
- Non-consensual intimate images
- Content that promotes, incites, instructs in or depicts abhorrent violent conduct
“Under the Online Safety Act in Australia, you can report unlawful or harmful content, including illegal and restricted online content that you find on any of Apple’s products or services, or any breach of online safety codes.
“If the unlawful or harmful online content appears on a third-party app, content service or unsolicited marketing communication, report it directly to the third-party provider.
“To report unlawful or harmful content that you find on any of Apple’s products or services – including illegal and restricted online content or any breach of online safety industry codes – to Apple: Call 133 622 / Email [email protected]
“When you submit a report of unlawful or harmful content to Apple, do not submit the actual content (including any images). Your report should provide only a description of the content.
“You can report illegal or harmful content, including an unresolved complaint that you made to a third-party provider, to the eSafety Commissioner online: www.esafety.gov.au/report.