Apple in the iOS 15.2 beta introduced a feature designed to keep children safer online by protecting them from potentially harmful images. The feature automatically detects nudity in images sent or received in Messages, blurs them, and displays a warning.
Apple stated that children would be provided with resources and reassured that it is alright not to view the image if they do not want to. A similar process occurs if the child tries to send a picture that contains nudity. In either case, the child is given the option to contact someone they trust for help. The feature is opt-in and is currently only available in the iOS beta, meaning it has not yet rolled out to everyone.
Not For Adults
Communication Safety is a Family Sharing feature designed exclusively for Apple ID accounts owned by people under the age of 18, so there is no option to activate it on a device owned by an adult. Adults do not need to be concerned about it unless they are parents managing it for their children. In a Family Sharing group consisting only of adults, the Communication Safety option does not appear, and no scanning of Messages photos takes place on an adult's device.
For Communication Safety, images sent and received in the Messages app are scanned for nudity using Apple's machine learning technology. Scanning is done entirely on device, and no content from Messages is sent to Apple's servers or anywhere else. If a child is warned about a nude photo and views it anyway, parents are not notified; full autonomy remains with the child. Apple removed the planned parental notification option after criticism from advocacy groups, which worried it could put children at risk in situations of parental abuse.
Separately, the Cupertino giant announced a feature aimed at detecting child sexual abuse material in iCloud Photos. The launch of that feature was delayed, however, as Apple said it would first address the complaints raised by privacy advocates.