Over the past few years, Apple has emphasized that our private information should remain securely under our control, whether that means messages, photos, or other data. Two privacy changes intended to reduce harm to children will roll out to iCloud Photos and Messages in the iOS 15, iPadOS 15, and macOS 12 Monterey releases in the United States.

The first relates to preventing the transmission and possession of photos depicting the sexual abuse of minor children, formally known as Child Sexual Abuse Material (CSAM) and more commonly called “child pornography.” (Since children cannot consent, pornography is an inappropriate term to apply, except in certain legal contexts.) Before photos are synced to iCloud Photos from an iPhone or iPad, Apple will compare them against a local, cryptographically obscured database of known CSAM.

The second gives parents the option to enable on-device machine-learning-based analysis of all incoming and outgoing images in Messages to identify those that appear sexual in nature. It requires Family Sharing and applies to children under 18. If enabled, kids receive a notification warning of the nature of the image, and they have to tap or click to see or send it. Parents of children under 13 can additionally choose to receive an alert if their child proceeds to send or receive a “sensitive” image.

Apple will also update Siri and Search to recognize unsafe situations, provide contextual information, and intervene if users search for CSAM-related topics.

As is always the case with privacy and Apple, these changes are complicated and nuanced.
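At its core, the iCloud Photos check described above is a set-membership test: a fingerprint of each photo is compared against a database of fingerprints of known CSAM before upload. The sketch below illustrates only that basic idea. It is a deliberate simplification with hypothetical names (`image_hash`, `KNOWN_HASHES`): Apple's actual system uses a perceptual hash that survives resizing and recompression, a blinded database, and threshold reporting, none of which this models.

```python
import hashlib

# Hypothetical stand-in for a perceptual hash. A real perceptual hash
# matches visually similar images; SHA-256 matches only exact bytes.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for the local database of fingerprints of known CSAM
# that ships with the device (obscured in the real system).
KNOWN_HASHES = {image_hash(b"known-flagged-example")}

def matches_known_database(image_bytes: bytes) -> bool:
    """Membership test run on-device before a photo syncs to iCloud Photos."""
    return image_hash(image_bytes) in KNOWN_HASHES

# An ordinary photo does not match the database:
print(matches_known_database(b"family-vacation-photo"))  # False
```

The design point worth noting is that only fingerprints of already-known images can match; the comparison happens on the device, and novel images are invisible to this mechanism by construction.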