Old video about Apple update unrelated to new security fix
If Your Time is short
- A video about a planned update Apple announced in August to scan U.S. iPhones for images of child sexual abuse is being shared online after a separate Apple update to fix a security flaw. The updates are unrelated, and Apple said on Sept. 3 that it was delaying plans to scan iPhones for images amid privacy concerns.
"New Apple update," reads the text in a video shared on Instagram on Sept. 15. It shows what looks like an Aug. 6 Associated Press headline that says "Apple to scan U.S. iPhones for images" and text from an Aug. 23 article in the Independent.
"Apple is now going to be able to see all of your photos and here’s what you need to know," a voice in the video says. "The new Apple update means that Apple will be scanning your photos, messages and anything you tell Siri. And if Apple finds anything bad or related to CP a real human will manually review your stuff."
This post was flagged as part of Facebook’s efforts to combat false news and misinformation on its News Feed. (Read more about our partnership with Facebook.)
Let’s untangle what’s going on here.
First of all, the information in the video is unrelated to the emergency software updates that Apple issued on Sept. 13 after security researchers found a flaw that left devices vulnerable to spyware.
Second, the AP headline shown in the video appears to have been truncated.
When we searched for that Aug. 6 headline, we found a similar one with the same bylines: "Apple to scan U.S. iPhones for images of child sexual abuse." The earliest archived version of this story that we found, captured on Aug. 5, shows the same headline.
The AP story reported a plan that Apple had announced to scan U.S. iPhones for images of child sexual abuse. Under the plan, if the tool designed to detect such images found a match, the images would then be reviewed by a human. If the human confirmed that the images were child pornography — presumably the "CP" referred to in the video — the user’s account would be disabled and the National Center for Missing and Exploited Children would be notified.
Apple also planned to scan users’ encrypted messages for sexually explicit content as a child safety measure, the AP said. Why? According to the AP, Apple had "been under government pressure for years to allow for increased surveillance of encrypted data," and "coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children while keeping its high-profile commitment to protecting the privacy of its users."
On Aug. 23, the Independent reported that Apple had released "yet more details on its new photo-scanning features."
This paragraph from the story appears in the video: "Earlier this month, Apple announced that it would be adding three new features to iOS, all of which are intended to fight against child sexual exploitation and the distribution of abuse imagery. One adds new information to Siri and search, another checks messages sent to children to see if they might contain inappropriate images, and the third compares photos on an iPhone with a database of known child sexual abuse material (CSAM) and alerts Apple if it is found."
As both the AP and Independent reported, the plans were controversial, alarming some privacy experts. And on Sept. 3, nearly two weeks before this video was posted on Instagram, Apple announced that it had "decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple initially said that these changes would be part of updates to its operating software, but those changes have been delayed.
They are also separate from the update that was announced and widely reported on Sept. 13.
This post conflates the two by sharing an older video with old news on the heels of the announcement of a new, separate update.
While Apple did plan an update that would scan iPhone images, as the video in the Instagram post says, that plan has been delayed and is unrelated to the more recent update issued to improve security on Apple devices.
That context is important, and missing in this case.
We rate this post Half True.
Our Sources
Instagram post, Sept. 15, 2021
Independent, Apple gives more detail on new iPhone photo scanning feature as controversy continues, Aug. 23, 2021
Apple, Expanded protections for children, updated Sept. 3, 2021
Associated Press, Apple to scan U.S. iPhones for images of child sexual abuse, Aug. 6, 2021
NPR, Apple is delaying its plan to scan U.S. iPhones for images of child sexual abuse, Sept. 3, 2021