iOS 15.2 Beta Adds iMessage “Communication Safety” for Kids


Despite announcing a delay in its plans for several new child safety initiatives, it looks like Apple is moving forward with at least one of the new features, albeit with at least one significant change from the original plan.

Today, the company released the second beta of iOS 15.2 for developers, and while the release notes have yet to be updated, the latest beta actually includes the new Communication Safety feature for Messages that was found in the code of the first iOS 15.2 beta.

To be clear, there is no evidence that Apple plans to roll out the much more controversial CSAM detection feature that it announced in August. This latest beta only includes the more benign Communication Safety feature, designed to give parents the ability to filter inappropriate content out of their children’s Messages app.

Unfortunately, Apple chose to announce the two new initiatives at the same time, which has led to a great deal of misunderstanding about how the two features work. Many have confused CSAM detection, which was designed to scan photos uploaded to iCloud Photos for known child sexual abuse material, with Communication Safety, an opt-in Family Sharing feature that helps filter out sexually explicit material in Messages.

While CSAM detection would notify Apple if photos depicting child sexual abuse were uploaded to iCloud Photos, it only makes direct comparisons of a user’s photos against a known and verified database of CSAM images; an innocent photo of your own child is not going to trigger it. It also has nothing to do with Messages.
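To make the distinction concrete, here is a minimal sketch of the kind of database lookup described above. The type names and the single-value fingerprint are assumptions for illustration only; Apple’s proposed system relies on its own image-hashing technology, and there is no public API for it.

```swift
import Foundation

// Illustrative sketch only: CSAM detection, as described above, compares a
// fingerprint of each uploaded photo against a fixed, verified database of
// known material. The names and the simple UInt64 fingerprint are assumptions,
// not Apple's actual implementation.

struct ImageFingerprint: Hashable {
    let value: UInt64   // stand-in for a perceptual hash of the photo
}

struct KnownCSAMDatabase {
    private let knownFingerprints: Set<ImageFingerprint>

    init(fingerprints: Set<ImageFingerprint>) {
        self.knownFingerprints = fingerprints
    }

    /// A photo "matches" only if its fingerprint is already in the verified
    /// database. An ordinary family photo produces a fingerprint that simply
    /// is not in that set, so it can never match.
    func matches(_ fingerprint: ImageFingerprint) -> Bool {
        knownFingerprints.contains(fingerprint)
    }
}
```

Under this simplified model, a photo that isn’t already in the verified database can never produce a match, which is why the feature is unrelated to the kind of general-purpose photo scanning many people feared.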

CSAM detection, however, remains entirely off the table for now. While Apple hasn’t completely canceled the plan, there’s no evidence it will arrive anytime soon.

Communication Safety, on the other hand, applies to the Messages app and uses machine learning to identify sexually explicit photos – not just CSAM, but any photo that may show things like nudity or other inappropriate content. It only applies to messages sent and received by children under the age of 18 who are part of a Family Sharing group – and parents must explicitly opt in to activate the feature.

How Communication Safety works

Once activated, Communication Safety analyzes every photo sent or received by a child or teenager to determine whether it contains sexually explicit content. If it does, the photo is blurred and accompanied by a warning telling the user that they probably shouldn’t be viewing or sending it.

  1. Family members aged 13-17 can bypass this and view or send the photo anyway. In that case, they simply have to acknowledge the warning and then choose to ignore it. No notification is sent to parents for teens.
  2. As initially proposed, Communication Safety behaved differently for children under the age of 13. The same initial process would be followed, except that if the child chose to view the photo despite the warning, they would receive a second warning informing them that their parents would be notified if (and only if) they chose to continue.

The actual implementation in iOS 15.2, however, makes a significant change to this. In response to concerns that such a notification could create problems for children at risk of parental abuse, Apple has removed the parental notification component from Communication Safety.

Instead, sexually explicit material sent or received by children under the age of 13 will be treated the same way as it is for teens – they will be warned about the photos, along with guidance on how to get help from a trusted adult if they receive photos that make them feel uncomfortable.
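Putting the pieces above together, the decision flow in the iOS 15.2 beta might look roughly like the following sketch. The types, function names, and classifier stand-in are hypothetical; Apple does not expose a public API for this feature.

```swift
import Foundation

// Hypothetical sketch of the Communication Safety flow described above, as it
// stands in the iOS 15.2 beta. All names here are illustrative assumptions.

enum ScreeningResult {
    case deliverNormally
    case blurWithWarning(canBypass: Bool, showHelpResources: Bool)
}

struct IncomingPhoto {
    let isSexuallyExplicit: Bool   // stand-in for the on-device ML classifier
}

func screen(_ photo: IncomingPhoto,
            childAge: Int,
            communicationSafetyEnabled: Bool) -> ScreeningResult {
    // The feature is opt-in and only covers minors in a Family Sharing group.
    guard communicationSafetyEnabled, childAge < 18 else {
        return .deliverNormally
    }
    // Photos the on-device classifier does not flag are delivered as-is.
    guard photo.isSexuallyExplicit else {
        return .deliverNormally
    }
    // Flagged photos are blurred with a warning that can be bypassed.
    // Children under 13 additionally see guidance on reaching a trusted adult.
    // As shipped in the 15.2 beta, no parental notification is sent at any age.
    return .blurWithWarning(canBypass: true, showHelpResources: childAge < 13)
}
```

The key change from the original proposal is visible at the end: there is no longer a branch that notifies parents.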

Under no circumstances will Communication Safety send notifications to Apple or anyone else. The analysis of photos for sexually explicit content happens entirely on the device, and none of the data ever leaves the user’s iPhone. This is strictly a way to protect young children from sending or receiving pictures that they really shouldn’t be.

Naturally, when Apple announced Communication Safety and CSAM detection together, it led many to assume the worst – that Apple would use machine learning to flag any photo on their iPhone that looked even slightly suspicious and report it to law enforcement.

Even Apple’s software engineering chief, Craig Federighi, frankly admitted that Apple got it wrong in the way it communicated these features.

I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion. It is really clear that a lot of the messages got mixed up pretty badly. I believe the little phrase that came out early was, “Oh my god, Apple is scanning my phone for pictures.” This is not what is happening.

Craig Federighi, Senior Vice President of Software Engineering at Apple

With the controversy and negative publicity, Apple said it was delaying its plans in order to “take additional time over the coming months to collect input and make improvements.” In its statement at the time, Apple referred to “features designed to help protect children from predators who use communication tools to recruit and exploit them,” which clearly appears to be a reference to Communication Safety, and to limiting “the spread of Child Sexual Abuse Material,” which is obviously the CSAM detection feature.

This suggested that both features were being delayed at the time. From Apple’s initial announcement, however, it seemed they had never been planned for iOS 15.0 in the first place; in fact, iOS 15.2 looks like the point where they probably would have landed anyway.

Based on the changes to Communication Safety, it’s clear that Apple has listened to feedback from at least some advocacy groups, which had expressed concerns that parental notifications could end up creating a dangerous environment for young children. While the feature as originally proposed warned the child that a parental notification would be sent and gave them the chance to back out, Apple has evidently decided it was better not to risk triggering an abusive parent at all.

Ultimately, it’s still not certain that this feature will arrive in the final release of iOS 15.2. It’s not at all uncommon for Apple to add features in iOS betas that get pulled before the public release, so it’s possible the company is just testing the waters here to see what the reaction is.
