When Discord launched in 2015, it became a hub for online gamers, and the app grew more popular during the pandemic. Its approximately 150 million users chat about a wide variety of topics.
Unfortunately, some people have created hidden communities and chat rooms where adults groom children before abducting them. Others trade child pornography and extort minors whom they trick into sending nude photos.
NBC News found 35 cases over the past six years in which adults were prosecuted on charges of kidnapping, grooming, or sexual assault that allegedly involved communications on Discord. Of this total, 22 took place during or after the COVID-19 pandemic, and at least 15 of the prosecutions have resulted in guilty pleas or verdicts. The other cases are still pending.
In one case, a teen was taken across state lines, raped, and found locked in a backyard shed. She was groomed on Discord for months.
In another case, a 22-year-old man kidnapped a 12-year-old after meeting her in a video game and grooming her on Discord.
In 2020, a 29-year-old man gave a 12-year-old girl he met on Discord advice on how to kill her parents. He told her he would pick her up after she killed them and that she could be his "slave." Prosecutors said the girl attempted to burn down her house.
The same man encouraged a 17-year-old to cut herself and send him sexually explicit photos and videos. The man admitted to sexually exploiting the minors. He pled guilty and was sentenced to 27 years in prison.
NBC News identified an additional 165 cases, including four crime rings, in which adults were prosecuted for transmitting or receiving child pornography via Discord or for allegedly using the platform to extort children into sending sexually graphic images of themselves.
Other tech platforms are also used by sex offenders for online child exploitation. However, according to experts, Discord has become attractive to these criminals because of its "decentralized structure and multimedia communication tools, along with its recent growth in popularity."
The National Center for Missing & Exploited Children (NCMEC) reports that child sexual abuse material (CSAM) on Discord increased by 474 percent from 2021 to 2022.
The NCMEC also said the platform's responsiveness to complaints has slowed, from an average of around three days in 2021 to nearly five days in 2022. Other tip lines say that Discord's responsiveness can be unreliable.
Discord says it disabled 37,102 accounts for child safety violations in the last quarter of 2022.
According to John Redgrave, Discord's vice president of trust and safety, the company was "not proactive at all when I first started." He said the platform has since implemented several systems to proactively detect known child sexual abuse material and analyze user behavior.
According to Stephen Sauer, director of the tip line at the Canadian Centre for Child Protection (C3P), predators will sometimes connect with children on other platforms, like Minecraft or Roblox, and move them to Discord so that they can have direct, private communication.
Experts also found that child predators who use dark web forums share tips about how to effectively deceive children on Discord. One user wrote in a chat on one of these forums, "Try discord play the role of a generic edgy 15-year-old and join servers. I got 400 videos and 1000+ pictures."
Prosecutors have identified organized roles in these child sex abuse rings, including "hunters" who locate young girls and invite them into a Discord server. The "talkers" are responsible for chatting with the girls and enticing them. The "loopers" stream previously recorded sexual content and pose as minors to encourage the real children to engage in sexual activity.
In several servers, groups explicitly solicited minors to join "not safe for work" communities. Some indicate that they accept only people between the ages of 13 and 17. Others say they accept "little girls 5-17 only."
According to John Shehan, senior vice president of NCMEC, his organization frequently receives reports from other tech platforms mentioning users and traffic from Discord, which he says is a sign that the platform has become a hub for illicit activity.
Although Discord markets itself on its homepage to kids and teens for school clubs, many of the other users on the platform are adults, and the two age groups are allowed to mix freely.
Discord has been transparent about its limited oversight of the activities that take place on the platform: the company says it mostly waits for community members to flag issues, and only then does it investigate and take action. Ben Goggin "Child predators are using Discord, a popular app among teens, for sextortion and abductions" https://www.nbcnews.com/tech/social-media/discord-child-safety-social-platform-challenges-rcna89769 (Jun. 21, 2023).