
Apple relies on Communication Safety to counter a surge in sextortion and CSAM

Apple's safety features can be enabled by parents on a child's account.

Following reports of teenage boys dying by suicide since 2021 after being targeted by sextortion, Apple is leaning on its safety features to help protect potential victims, but some critics say it is not enough.

The number of teens targeted by sextortion scammers, who either trick victims into providing explicit images or simply fabricate fake images using AI, is growing. The results are often tragic.

A new report from The Wall Street Journal profiles a number of young victims who ultimately took their own lives rather than face the humiliation of real or fake explicit images being made public. Such content is known as child sexual abuse material, or CSAM.

The report cites teens' implicit trust in exchanging iPhone messages as part of the problem.

The WSJ article includes the story of Shannon Heacock, a high school team coach, and her 16-year-old son Elijah. After texting his mother about the next day's events, Elijah went to bed.

Hours later, Heacock was awakened by her daughter. Elijah had been found in the laundry room, bleeding from a self-inflicted gunshot wound. He died the next morning.

He and two other teenagers featured in the article were victims of criminals who connect with adolescents on social networks, often posing as teenage girls. After a period of friendly chatting to gain trust, the sextortionists send fake explicit photos of the "girls" they are posing as, and ask for similar images from the victim in return.

The criminals then demand payment from the victims in the form of gift cards, wire transfers, and even cryptocurrency in exchange for not sharing the images publicly.

Payments are demanded immediately, putting the victim under time pressure to either pay the blackmailer, which they often cannot do, or face humiliation in front of family, school, and the public.

Teenagers faced with what they see as impossible alternatives can go as far as suicide to escape the pressure of the blackmail. The scam is referred to as "sextortion," and it has claimed or destroyed many young lives.

CSAM detection and Apple's current efforts

Sextortion cases have taken off alongside the rise of social media. The National Center for Missing &amp; Exploited Children in the US published a study last year showing a 2.5-fold increase in sextortion cases between 2022 and 2023. The group advises parents of teens to discuss the possibility of online blackmail threats in advance to minimize their impact.

When the problem first became widespread in 2021, Apple announced a set of tools it would use to detect possible CSAM images and protect underage users.

This included features in iMessage, Siri, and Search, along with a mechanism that scanned hashes of iCloud Photos against a database of known CSAM images.

Also included was a mechanism for users to flag inappropriate messages or images, reporting the sender to Apple. The image scanning applied only to iCloud Photos, not to images stored locally.
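For illustration only, the core idea of known-image matching is to compare a fingerprint of each photo against a list of fingerprints of already-identified images. Apple's abandoned design used a perceptual hash (NeuralHash) combined with cryptographic private set intersection so non-matches stayed hidden from both sides; the minimal Swift sketch below substitutes a plain SHA-256 digest and an in-memory set, which only catches byte-identical files. All names, paths, and hash values here are hypothetical.

```swift
import Foundation
import CryptoKit

// Highly simplified illustration of known-image hash matching.
// Apple's actual 2021 proposal used a perceptual hash plus private set
// intersection, not a plain cryptographic digest like this.
struct KnownImageDatabase {
    // Hex-encoded digests of known flagged images (placeholder values).
    let knownHashes: Set<String>

    func matches(imageData: Data) -> Bool {
        // SHA-256 only matches byte-identical files; a perceptual hash would
        // also match resized or re-encoded copies of the same image.
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage sketch: scan a local folder of images against the database.
let database = KnownImageDatabase(knownHashes: ["<hex digest of a known image>"])
let folder = URL(fileURLWithPath: "/tmp/photos")  // hypothetical path
if let files = try? FileManager.default.contentsOfDirectory(at: folder,
                                                            includingPropertiesForKeys: nil) {
    for file in files {
        if let data = try? Data(contentsOf: file), database.matches(imageData: data) {
            print("Match found for \(file.lastPathComponent)")
        }
    }
}
```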

After a massive outcry from privacy advocates, child-safety groups, and governments, Apple abandoned its plans to scan iCloud Photos against the CSAM database. Instead, it now relies on its Communication Safety features and parental controls to protect its users.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," said Erik Neuenschwander, Apple's director of user privacy and child safety. "It would also inject the potential for a slippery slope of unintended consequences."

"Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types," he continued. Some countries, including the United States, have pressured Apple to provide exactly that kind of surveillance.

"We decided not to proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago," Neuenschwander added. "We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users."

Apple now uses Communication Safety to help protect children.

If a child account user receives or attempts to send an image containing nudity, the child is presented with helpful resources and reassured that it is okay not to view the image.

The child is also given the option to message someone they trust for help.

Parents are not immediately notified when the system detects that a message contains nudity, because of Apple's concern that premature parental notification could pose a risk to the child, including the threat of physical violence or abuse.

However, the child is warned that their parents will be notified if they choose to open a blurred image containing nudity, or to send an image containing nudity after the warning.
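Apple does not expose the Messages intervention itself to developers, but since iOS 17 and macOS 14 it has offered the SensitiveContentAnalysis framework so third-party apps can run a similar on-device nudity check before showing an image. The sketch below is a minimal, hypothetical example assuming an incoming image saved to a local file and an app that carries the com.apple.developer.sensitivecontentanalysis.client entitlement; it is not how Apple's own Communication Safety feature is implemented.

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical sketch: decide whether to blur an incoming image before display,
// in the spirit of Communication Safety. Requires iOS 17+/macOS 14+ and the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
@available(iOS 17.0, macOS 14.0, *)
func shouldBlurIncomingImage(at fileURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The user (or a parent, for child accounts) controls the policy in Settings;
    // if it is disabled, analysis is unavailable and the image is shown normally.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // On-device analysis; the image itself never leaves the device.
        let analysis = try await analyzer.analyzeImage(at: fileURL)
        return analysis.isSensitive
    } catch {
        // Hypothetical fallback: on failure, err on the side of blurring.
        return true
    }
}
```

Because the check runs entirely on device, no image data is sent to Apple, which mirrors the privacy stance the company describes above.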

Although some groups have called for further measures from Apple and other tech firms, the company is clearly trying to strike a balance between user privacy and harm prevention. Apple continues to investigate ways to improve Communication Safety.
