Facebook whistleblower Frances Haugen’s message about Instagram’s impact on teenage girls was unequivocal: Facebook’s own studies found that 13% of British teens said Instagram prompted thoughts of suicide, and 17% of teen girls said Instagram made their eating disorders worse.
These statistics, however, are only one part of the bigger picture when it comes to the general safety of teenagers online.
It’s estimated that there are over 500,000 sexual predators active on the internet each day. In 2020, there were over 21.7 million reports of suspected child sexual exploitation made to the National Center for Missing & Exploited Children’s CyberTipline. Online enticement reports — which detail when someone is communicating with a child via the internet with the intent to exploit them — increased by more than 97% from the year before.
Reports of online predators are on the rise, but predatory behavior online is as old as Netscape.
My family got our first PC in 1999. I started on gaming platforms like Neopets and Gaia Online. Soon, I was posting thoughts and communicating with other users on Myspace and Tumblr. As my online world expanded, I encountered old men pretending to be preteens. At one point, I began a “relationship” with a 17-year-old boy when I was just 12 years old. Of course, I didn’t talk about any of this, mostly out of shame. I didn’t know I was being groomed — I had never heard the word used until I started doing gender-based violence work myself.
Grooming is subtle, and for a teen unfamiliar with it, undetectable. A groomer builds trust and an emotional connection with a child or teen in order to manipulate, exploit and abuse them. This can look like an older teen asking to webcam and slowly prodding a child or teen to do inappropriate things, such as spin around for them or change into something “cuter,” or a digital “friend” pressuring someone to engage in cybersex. Predators sometimes pretend to be a young person to obtain personal details such as photos or sexual history; they then weaponize this information for their own pleasure.
I only recently realized that there is CSAM — or child sexual abuse material — of me out there on the internet. Footage of me may still reside on someone’s old cell phone or on a hard drive collecting dust. It could one day be shared onto private Discords or Telegram channels.
My experience as a teen girl on the internet is part of what led me to build a nonprofit online background check service that allows anyone to see whether someone they are speaking with has a history of violence — ideally before the first in-person meeting. We recently decided to allow users as young as 13 to access our public records database in the future. While we may never be able to entirely stop children and teens from being exploited online, we can at least arm them with tools to learn whether someone they meet online has a record of bad behavior.
Of course, a background check is only one tool in the safety arsenal — people frequently lie about their names and identities. If a child is being groomed, or an adult is exploiting them, they are often doing so in ways that are anonymous, isolated and secret.
This is why educating young people about the dangers that lurk online is key. That can involve teaching them to identify early red flags like love bombing, extreme jealousy and boundary-pushing. We can also show young people what a healthy, safe, consensual relationship looks like — the “green flags,” as opposed to the red ones.
There are various practical skills that we can incorporate into kids’ education as well. Teach them to be selective about what photos they share and whose follow requests they accept and to bring an adult if they meet people they know online in real life.
When the adults in their lives discuss the dangers of online dating and internet communication openly and consistently, children and teens learn to recognize the risks. This can go a long way toward preventing serious trauma. Yet conversations about online safety, like sex education, are often left to parents by schools, while parents assume kids are having them at school. These discussions can be difficult to navigate, especially for parents who don’t understand online culture, so it is essential that parents seek out resources to educate themselves.
As Haugen pointed out, online platforms also have a responsibility. Trust and safety departments at online platforms are relatively new, and there’s still a lot to learn and improve on.
On most digital platforms, content moderators are understaffed, underpaid and undertrained. Online platforms need to put protection over profit and invest in additional training and mental health support for those responsible for keeping their platforms safe. Given the tools and time to think critically about questionable content, safety teams can execute their mandate effectively and with care.
Though the internet can create environments that lead to abuse, it can also be a powerful tool in educating young people about early warning signs and the realities of the world, including arming them with access to information about who they’re talking to online.
Reactive measures to combat abuse — from the criminal justice system to platform moderators — are a Band-Aid on a bleeding wound. Preventing sexual abuse before it happens is the best protection we can give our kids. By taking responsibility — whether as platforms, politicians or parents — for the potential harm caused online, we can begin to create a safer world for all of us.