Pedophilia Runs Rampant on Every Platform
It seems as though every few months there’s another breaking story about a social media influencer or celebrity caught preying on minors. The most recent example is the case of Tony Lopez, a massively popular creator on TikTok. In January of 2021, the news broke that two teenage girls had filed a lawsuit alleging that Lopez communicated with them over social media and text and asked them for nude photos. And these aren’t the first accusations against him. Back in August 2020, Lopez hosted a livestream on Instagram, saying he would “hold himself accountable” for similar allegations made by several girls, all of whom noted that they were underage at the time.
But the issue of pedophilia on social media doesn’t pertain to just attractive young influencers. There are even more instances of older men preying on children. One example is Buddy Haynes, who went by “thebudday” on TikTok. Though his case was originally reported on in 2018, it remains instructive. TikTok offers the option to “duet” with someone else’s video: when you duet with another TikTok, both videos play simultaneously, with yours reacting to or adding onto the other. Considering that the largest segment of TikTok’s U.S. user base is aged 10 to 19, it isn’t hard to find an account belonging to a child or young teen. Thebudday would duet with TikToks from young girls and react in sexually suggestive ways. He finally deleted his account in November 2018, after enough people recognized the predatory nature of his videos and notified local law enforcement and the FBI, who launched an investigation.
This issue with pedophiles on social media isn’t limited to TikTok; it characterizes many of the most commonly used platforms. HuffPost published an article in March of last year covering how easy it is for pedophiles to find and compile seemingly innocent videos of young children for their own sick enjoyment. On YouTube, a user can add videos to a playlist without notifying the channel that originally uploaded them. Especially for channels featuring children that are run by parents or other family members, the channel owner would have no idea that their child is part of a playlist fetishizing minors.
Another thing to note about YouTube, and many social media sites in general, is the algorithm. Broadly, the recommendation algorithms on platforms such as YouTube, TikTok or Instagram show you videos or posts similar to what you’ve already engaged with. For example, if YouTube notices you watch a lot of cute cat videos, it will keep suggesting them to you. Instagram chooses what’s on your Explore page based on the accounts you follow and interact with, as well as the accounts the people you follow interact with. So if you follow a lot of art accounts, you will be shown more art accounts on your Explore page. Makes sense, right? But think about how this can be exploited by pedophiles. One sees a video of a young child in a compromising position. They comment on it and add it to a playlist, and as the algorithm registers those interactions, it recommends similar videos. Social media algorithms want to keep showing you things they think you will like, and so the rabbit hole continues on and on. To clarify, the issue is not specifically rooted in the content, but in the algorithm’s constant suggestions for more. A harmless video of a minor is recommended to the wrong person, and they’ve found their pedophilic niche.
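To make those mechanics concrete, here is a minimal sketch of an engagement-driven recommender in Python. It is emphatically not YouTube’s or TikTok’s actual system; the data, tags and function names are all invented for illustration, and real recommenders are vastly more sophisticated. But the core logic it shows is the same: the ranking only asks what a user has already engaged with, never whether the niche being reinforced is harmful.

```python
from collections import Counter

# Hypothetical watch histories: user -> tags of videos they engaged with.
# (Invented data for illustration, not real platform data.)
histories = {
    "user_a": ["cats", "cats", "dogs"],
    "user_b": ["cats", "cooking"],
}

# Hypothetical catalog: video id -> tag describing its content.
catalog = {
    "v1": "cats",
    "v2": "cats",
    "v3": "dogs",
    "v4": "cooking",
    "v5": "cats",
}

def recommend(user, n=3):
    """Rank videos by how often the user has already engaged with each
    video's tag: the more you interact with one kind of content, the
    more of it you are shown."""
    tag_counts = Counter(histories[user])
    ranked = sorted(catalog, key=lambda vid: tag_counts[catalog[vid]],
                    reverse=True)
    return ranked[:n]

print(recommend("user_a"))  # ['v1', 'v2', 'v5'] -- cat videos dominate
```

Note the feedback loop: every recommendation the user accepts feeds back into their history, which strengthens the same ranking on the next pass. The rabbit hole is the loop itself.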
But pedophiles don’t only watch content made by or containing minors; they can also easily reach out to them. It isn’t always as public as Haynes duetting TikToks, or an account commenting on a YouTube video. Privacy settings on social media are a big issue when it comes to blocking pedophiles from directly reaching out to unsuspecting children, who don’t always understand who they are talking to. As YouTuber The Right Opinion, real name James Darcy, notes in his 2018 video about thebudday, “the default settings [on TikTok] appear to be completely public, and they don’t provide any real tutorial or disclaimer that could change it.” A child making a TikTok account would have to go into the settings, and then further into the privacy section, to even notice that the account is set to public. That setting controls who can duet with or react to their videos, and who can send them messages. TikTok changed the rules of direct messaging in 2020; according to TikTok’s help page section on direct messages, “As part of our commitment to improving safety on TikTok, [o]nly those aged 16 and older will be able to send and receive Direct Messages.” However, this does not stop others from commenting on or duetting with children’s videos.
This addition to TikTok’s rules on direct messaging raises the question of what social media companies are doing about the issue. In an interview with BuzzFeed News, Haynes stated, “They [FBI] said, ‘Look we’ve got complaints’” and “‘Maybe it’s best if you take a month or two off.’” Here is where the problem lies with getting pedophiles off of social media: it isn’t done easily. In the case of Haynes, he deleted his TikTok account himself; TikTok took no action to facilitate that. It was the complaints and rallying of other users on the platform that drove him off.
This situation isn’t unique to TikTok either. HuffPost ran an experiment in which its reporters went in search of videos featuring minors that could be attractive to pedophiles. What they found was that despite YouTube’s claim to HuffPost that it has “disabled comments and limited recommendations on hundreds of millions of videos containing minors in risky situations,” it was easy to find videos of partially clothed children. Many also had comments enabled, where pedophiles could post the timestamp of a particularly graphic part of the video. This indicates that YouTube’s machine learning and artificial intelligence systems are somehow unable to catch a large amount of exactly the content they should be working against. In fact, HuffPost notes in its article that the playlists it found containing compromising videos of children were only taken down after HuffPost flagged them in an email to YouTube.
It seems that social media companies rely on their users to actively report inappropriate content, rather than on the systems they have in place. Social media platforms have enormous user bases, and while that can help surface malicious content through reports, reporting shouldn’t be the main way of stopping it. A huge user base means these companies have power; they have the money and resources to improve their machine learning and artificial intelligence systems for exactly this purpose. Not only that, but there is no reason the algorithm itself can’t be changed so that it can’t be abused by pedophiles, as the sketch below illustrates. These social media companies need to actively work with those who are petitioning and proposing laws to stop this issue. In 2019, Senator Josh Hawley introduced a bill that would ban recommending videos that feature children, which “would apply only to videos that primarily feature minors, not videos that simply have minors in the background.” While it may seem aggressive to stop recommending videos featuring minors entirely, YouTube could work with Hawley to tweak the bill to its comfort. As of now, it doesn’t seem that this has happened. Currently, a petition on change.org with nearly 130,000 signatures calls for an end to the exploitation of children on social media, but no real change in policy has been reported.
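For what it’s worth, the kind of change Hawley’s bill describes is conceptually simple. Below is a hedged sketch, reusing the toy setup from the earlier example, of a candidate filter that keeps flagged videos out of the recommendation pool before any ranking happens. The `features_minor` flag and every name here are hypothetical assumptions, not any platform’s real fields; the genuinely hard part at platform scale is classifying videos accurately, which is exactly where these companies’ machine learning resources should go.

```python
# Hypothetical catalog entries: video id -> metadata, where
# "features_minor" would be set by a classifier or upload review.
catalog = {
    "v1": {"tag": "dance",   "features_minor": True},
    "v2": {"tag": "dance",   "features_minor": False},
    "v3": {"tag": "cooking", "features_minor": False},
}

def eligible_candidates(catalog):
    """Filter step run before any ranking: videos that primarily feature
    minors never enter the recommendation pool, no matter how strongly a
    user's history correlates with them. They remain reachable by direct
    link or search -- just not algorithmically promoted."""
    return {vid: meta for vid, meta in catalog.items()
            if not meta["features_minor"]}

print(list(eligible_candidates(catalog)))  # ['v2', 'v3']
```

Nothing about such a filter removes the videos themselves; it only stops the algorithm from serving them to strangers, which is precisely the distinction Hawley’s bill draws between featuring minors and merely including them.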
Megan Bostaph is a third-year English major who just wants platforms to hold their algorithm accountable. You can reach them at [email protected].