Child Safety Versus Privacy Rights: What Safe Adults Can Do Now

According to a news report, the Heat Initiative, a child advocacy group, "has raised two million dollars for a new national advertising campaign calling on Apple to detect, report and remove child sexual abuse materials from iCloud."

In 2021, Apple announced its plan to scan iPhones for child sexual abuse materials. However, privacy experts warned that governments could abuse the system, and the company eventually abandoned the plan after severe backlash.

By 2023, Apple had been criticized by child safety crusaders and activist investors for not doing more to protect children from online abuse. The Heat Initiative is one of the organizations fighting for online child safety.

During the second week of September 2023, the Heat Initiative released digital advertisements on websites popular with policymakers in Washington, such as Politico. It also put up posters across San Francisco and New York that read: "Child sexual abuse material is stored on iCloud. Apple allows it." The campaign was timed ahead of Apple's annual iPhone unveiling, which was scheduled for September 12.

Apple has made protecting privacy a central part of its iPhone pitch to consumers. Unfortunately, in making that promise, the company also helped make its services and devices useful tools for sharing child sexual abuse materials.

Investors have also called on Apple to publicly report the number of abusive images that it catches across its devices and services.

According to Matthew Welch, an investment specialist at Degroof Petercam, "Apple seems stuck between privacy and action." His firm, together with Christian Brothers Investment Services, a Catholic investment firm, decided to submit a shareholder proposal that would require Apple to provide a detailed report on how effective its safety tools were at protecting children.

Mr. Welch said Apple responded to an email from the Heat Initiative with a letter defending its decision not to scan iCloud.

Erik Neuenschwander, Apple's director of user privacy and child safety, said in the company's letter that Apple had concluded that "it was not practically possible" to scan iCloud photos without "imperiling the security and privacy of our users." The letter added: "Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems."

Apple has created a new default feature for all child accounts that intervenes with a warning if the child receives or tries to send nude images. It was designed to prevent the creation of new child sexual abuse material and to limit the risk of predators coercing and blackmailing children for money or nude images. These tools have also been made available to app developers.

Apple has been praised by both privacy and child safety groups for its efforts to blunt the creation of new nude images on iMessage and other services. But Alex Stamos, the director of the Stanford Internet Observatory at the Cyber Policy Center, who applauded Apple's decision not to scan iPhones, said the company could do more to stop people from sharing problematic images in the cloud.

Governments around the world have also been putting pressure on Apple to take proactive measures to prevent child sexual abuse. For example, the eSafety Commissioner in Australia issued a report criticizing Apple and Microsoft for failing to do more to proactively police their services for CSAM (child sexual abuse material).

Tripp Mickle, "In Monitoring Child Sex Abuse, Apple Is Caught Between Safety and Privacy," https://www.nytimes.com/2023/09/01/technology/child-sex-abuse-imagery-apple-safety-privacy.html (Sep. 01, 2023).

Commentary and Checklist

According to the U.S. Department of Justice, "The market for CSAM among individuals with a sexual interest in children drives the demand for new and more egregious images and videos." Because of that demand, the number of abused and exploited child victims continues to grow as new children are drawn into abuse.

CSAM is child pornography, and its production, distribution, and possession are federal crimes. According to the Department of Justice, child pornography "is a form of child sexual exploitation. Federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor (persons less than 18 years old). Images of child pornography are also referred to as child sexual abuse images. Federal law prohibits the production, distribution, importation, reception, or possession of any image of child pornography. A violation of federal child pornography laws is a serious crime, and convicted offenders face fines and severe statutory penalties." https://www.justice.gov/criminal-ceos/child-pornography

Apple's promise to protect its customers' privacy has also made its devices and services a "safe place" for individuals who want to store CSAM beyond the reach of law enforcement and other authorities.

While companies like Apple continue to work on how to protect children from predators and CSAM, what can safe adults do to help protect children from online sexual exploitation?

 

·      Set boundaries for Internet use.

·      Require that the child never give personal identifiers or information about themselves, family, or friends, including name, address, email, phone number, school, or employer, to anyone online.

·      Keep devices in public areas of the home and monitor online activity.

·      Explain what personal information is and instruct the child never to share it.

·      Require use of privacy settings on social media sites.

·      Encourage the child to show you inappropriate emails, texts, and social media posts.

·      Remind the child that anything posted on the Internet stays on the Internet forever.

·      Advise the child to report online bullying, threats, or online sexual activity and never to respond to it.

·      Eliminate accounts and change phone numbers if threats are made.

·      Instruct the child never to meet someone he or she met online unless a trusted adult is present.

·      Approve all images, videos, and blogs before the child posts them.

·      Look for signs and signals that something is wrong. Keep lines of communication open and encourage the child to share bad online experiences.
