Apple promotes privacy through a video advertisement. Photo illustration by Josh Barde, original image, Privacy on iPhone / Apple / youtube.com/apple / CC BY-SA (Josh Barde)


Apple gets to the core of privacy rights

* Trigger Warning: The following material contains discussions of child exploitation and sexual abuse

January 18, 2022

Every 10 seconds, a report of child abuse is made. Apple recently introduced new child safety features to stop the spread of Child Sexual Abuse Material (CSAM) in hopes of creating an environment that is safe and accessible to children around the world. 

Later this year, the new CSAM system will affect all Apple users. The intended goal of the system is to help protect children from online predators and limit the spread of CSAM while still creating an environment that inspires and enhances users’ lives. 

The Apple policy has sparked controversy within the community, as many have expressed fear that the system could eventually expand beyond its stated purpose. 

However, some Carlmont students do not see an issue with the new policy.

“I can definitely see why people would be worried about privacy, but I think so far Apple has a pretty good track record for protecting its user’s information, so personally, I am not too worried,” said Andrew Ghazouli, a senior.

There are three significant changes that Apple is introducing for children’s safety. The first safety measure is a communication tool that gives parents a greater role in navigating their children’s communication online. When parents or guardians opt in to the feature on their Family Sharing accounts, a child who receives or sends a sexually explicit image will be shown a blurred image and helpful resources. For children ages 12 and younger, parents will receive a notification only if the child proceeds to send or view the image after being warned. 
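As a rough sketch of that opt-in logic, the example below expresses the described behavior in code. The function name and fields are hypothetical and are not Apple’s actual API; the real feature runs on-device inside the Messages app.

# A minimal sketch of the communication-safety flow described above,
# using a hypothetical handle_explicit_image helper (not Apple's API).
def handle_explicit_image(feature_enabled: bool, child_age: int,
                          proceeds_after_warning: bool) -> dict:
    """Return what the device does for a sexually explicit image."""
    if not feature_enabled:
        # Parents or guardians must opt in through Family Sharing.
        return {"blur_image": False, "show_resources": False, "notify_parents": False}

    # With the feature on, the child sees a blurred image plus helpful resources.
    action = {"blur_image": True, "show_resources": True, "notify_parents": False}

    # Parents are notified only for children 12 or younger who choose
    # to send or view the image after being warned.
    if child_age <= 12 and proceeds_after_warning:
        action["notify_parents"] = True
    return action

# Example: a 10-year-old who views the image anyway triggers a parent notification.
print(handle_explicit_image(feature_enabled=True, child_age=10,
                            proceeds_after_warning=True))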

This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.

— Apple

For its second safety measure, Apple is using new cryptography to limit the spread of CSAM online. One concern among Apple customers was that Apple would download known CSAM photos onto users’ phones for comparison. Instead, Apple works with unreadable strings of numbers, called hashes, derived from known CSAM images and stored on its own servers. These image hashes are validated by at least two child safety organizations, and the new cryptography allows Apple to match them against images in iCloud Photos accounts. The system is not designed to flag images that merely contain child nudity, only known CSAM images validated by child safety organizations. 
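The matching step can be pictured as comparing a “fingerprint” of each photo against a list of fingerprints of already identified images. The sketch below is a simplified illustration only: it uses an ordinary cryptographic digest and an invented matches_known_csam helper, whereas Apple’s real system uses a perceptual hash (NeuralHash) and cryptographic matching designed so that non-matching photos stay hidden from Apple.

# A simplified sketch of hash-based matching, assuming a plain SHA-256
# digest. Apple's actual system uses a perceptual hash and private
# matching protocols that are not reproduced here.
import hashlib

# Hypothetical database of hashes of known CSAM images, supplied by
# child safety organizations (the value below is a placeholder: the
# SHA-256 digest of an empty byte string).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest standing in for the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """Check whether an image's hash appears in the known-hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES

# Example: empty bytes match the placeholder hash above, so this prints True;
# any real photo would print False.
print(matches_known_csam(b""))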

The final measure expands guidance in Siri and Search, providing additional resources to help both children and parents stay safe online. 

For others, these safety measures and the expansion of Apple’s online oversight to limit child abuse material spark fear because the government is not checking Apple’s actions. 

“All of this data and tracking of information being entirely unchecked by the government and run at the discretion of big businesses, I worry that we are being taken advantage of and we don’t even know it,” said Alexis Romanowsky, a senior. 

Apple has marketed itself differently from other large tech companies like Microsoft and Facebook by focusing on a commitment not to violate its users’ privacy. With this new policy, some of Apple’s users question whether the company is breaking that promise and falling into the industry’s standard disregard for customer privacy.

Apple and its reputation

Apple rolls out the new iPhone 13 Pro on Sep. 24. (Josh Barde)

Apple sets itself apart from the competition by constantly campaigning and advertising its system as the most secure and private on the market. During most launches and keynotes, the underlying message has been privacy.

With Apple’s widely publicized history as a trailblazer in privacy, rolling out systems that hide users’ email addresses and information from companies whose sole purpose is to sell people’s information, the confidence it has built is hard to shake. Despite this storied history, many of Apple’s customers are wondering whether the CSAM system violates their privacy.  

“Your Apple ID and all Apple services are designed from the ground up to protect your privacy. We work hard to collect only the data we need to make your experience better. When we do collect data, we believe it’s important for you to know what we’re collecting and why we need it, so you can make informed choices,” Apple said in a statement.

Although there is a wide variety of opinions on the new policy, Ghazouli said that many users will not notice any change on their devices. 

“To me, it doesn’t really seem like your average person will notice any change when they are on their Apple device. It seems like the only people who should be worried are the ones who have done something wrong,” Ghazouli said.  

Apple has taken considerable time to consider the system’s effect on consumers and has been explicitly open about how the system will work and the probability of an image being flagged by the CSAM system.

The system’s purpose, stopping the spread of child exploitation and increasing child safety on a constantly growing internet, also supports the need for procedures like CSAM detection.

Companies have struggled to find the right way to help the internet become a better place while making sure consumers’ private information is kept entirely confidential. 

Child sexual abuse and exploitation

Child sexual abuse, any sexual activity involving a child committed by an adult, adolescent, or another child, is a tragic and common reality in today’s society. It is not restricted to one demographic but spans races, cultures, and socioeconomic backgrounds, affecting millions of children worldwide. 

There are two main types of child sexual abuse: touching and non-touching. Touching sexual abuse includes touching a child’s genitals, making a child touch someone else’s genitals, and similar acts. Non-touching abuse includes showing pornography to a child, prostituting or trafficking a child, exposing a person’s genitals to a child, or photographing a child in sexual activities. 

For both types of child sexual abuse, the most common predators are people known to the victim, including family members, friends, and coaches. A 2003 National Institute of Justice report found that around 75% of victims who were sexually abused knew their abuser. 

The effects of child sexual abuse are endless and affect each individual differently. Shelley Bustamante, the leader of Students Offering Support at Carlmont, went into detail on the impact of child sexual abuse.

“The psychological effects include but are not limited to PTSD, depression, dissociative disorder, schizophrenia, personality disorder, and suicidal ideation,” Bustamante said. 

Although all children are at risk of sexual abuse, girls, children with low self-esteem, children from single-parent households, and children in developing countries face a higher risk. 

With sexual abuse, clinical experience and research have shown that time alone does not heal traumatized children. Although each individual is different, their psychological trauma can diminish their self-esteem, personality, and quality of life if not treated properly. 

Impact on protecting children

With the expansion of technology in today’s society, children are at a higher risk of coming into contact with sexual predators and being sexually exploited because of how few regulations large social platforms face. 

Apple is taking a step forward in addressing these issues and helping children stay safe online. Given Apple’s large consumer base and influence on society, this could lead more companies to change their policies to protect children, curbing child abuse and the spread of CSAM.

“I think other companies will definitely start to consider implementing similar policies because I feel like once Apple does it and everyone gets used to it, it’ll be a lot easier for them to do it as well,” Ghazouli said. 

Apple is looking to build a safer future for the next generation of children, and with this new policy, it is providing a safer environment and ecosystem for children and parents.

About the Writers
Niamh Marren, Staff Writer
Niamh Marren, a current senior, is in her second year of Journalism at Carlmont High School. She is excited to write for the student body and inform others and herself of the ever-changing world around her. She plays softball for both her school and her club team during her free time and enjoys spending time with her friends and family. To check out her portfolio, click here.

Twitter: @marren_niamh
Josh Barde, Production Editor
Josh Barde is a senior at Carlmont High School. He plays on his school's soccer and lacrosse teams, so he knows quite a bit about sports and is passionate about them. He also enjoys photography and creating videos for ScotCenter.

Twitter: @joshbarde

Portfolio: Josh Barde Photography

