Apple Child Safety update will scan photos for abusive material, warn parents


Apple has announced a raft of new measures aimed at keeping children safe on its platform and limiting the spread of child sexual abuse images.

As well as new safety tools in iMessage, Siri and Search, Apple is planning to scan users' iCloud uploads for Child Sexual Abuse Material (CSAM). That's sure to be controversial among privacy advocates, even if the ends may justify the means.

The company is planning on-device scanning of images that will take place before a photo is uploaded to the cloud. Each image will be checked against known 'image hashes' that can detect offending content. Apple says this will ensure that everyday users' privacy is protected.

Should the tech discover CSAM images, the iCloud account in question will be frozen and the images will be reported to the National Center for Missing and Exploited Children (NCMEC), which can then be referred to law enforcement agencies.

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple writes in an explainer.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
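The core idea of hash matching against a known database can be illustrated with a simplified sketch. Note the heavy caveats: Apple's actual system uses a perceptual hash (NeuralHash) so that resized or slightly edited copies of an image still match, and the comparison itself happens inside a private set intersection protocol so neither side learns more than the match result. The function names and the plain SHA-256 comparison below are illustrative assumptions, not Apple's implementation.

```python
import hashlib


def hash_image(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image content.

    Illustrative only: a cryptographic hash like SHA-256 changes
    completely if a single pixel changes, which is why real systems
    use perceptual hashes instead.
    """
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of known hashes. In Apple's design the device
# holds only a blinded form of this set and cannot read it directly.
KNOWN_HASHES = {hash_image(b"placeholder known image")}


def matches_known_content(image_bytes: bytes) -> bool:
    """Check an image's hash against the known-hash database.

    In the real protocol this membership test is performed via
    private set intersection, and matches are only revealed to the
    server once a threshold number of them accumulates.
    """
    return hash_image(image_bytes) in KNOWN_HASHES
```

A photo identical to a database entry matches; any other photo, including near-duplicates, does not under this naive scheme, which is precisely the gap perceptual hashing is meant to close.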

Elsewhere, the new iMessage tools are designed to keep children safe from online exploitation. If a child receives an image that the on-device image-detection tech deems inappropriate, it will be blurred and the child will be warned and “presented with helpful resources, and reassured it is okay if they do not want to view this photo.”

Depending on the parental settings, parents will be informed if the child goes ahead and views the image. “Similar protections are available if a child attempts to send sexually explicit photos,” Apple says. “The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.” Again, on-device image detection tech is used.

Finally, new guidance in Siri and Search will provide iPhone and iPad owners with help on staying safe online and filing reports with the relevant authorities.

“Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

These updates are coming in iOS/iPadOS 15.

