Discord Transparency Reports

Child Safety

Discord has a zero-tolerance policy for anyone who endangers or sexualizes children. Child-harm content is appalling, unacceptable, and has no place on Discord or in society.

We work with industry peers, civil society, and law enforcement to ensure that this effort extends beyond Discord. We invest heavily in resources and education so parents know how our platform works and understand the controls that can contribute to creating a positive and safe experience on Discord for their children.

This year we launched our Family Center, which empowers parents and guardians to stay informed about their teens’ Discord activity, as well as Teen Safety Assist, a set of features, including safety alerts and sensitive content filters, that are enabled by default for teens. We also launched our Parent Hub, a resource on how to help teens stay safer on Discord.

We are an active supporter of the National Center for Missing and Exploited Children (NCMEC) and their efforts to ensure that grooming and endangerment cases are quickly escalated to law enforcement. Users who upload child sexual abuse material (CSAM) to Discord or who engage in high-harm activity toward children are reported to NCMEC and removed from the service.

Discord is a member of the Technology Coalition, a group of companies working together to end online child sexual exploitation and abuse. We have partnered with the technology non-profit Thorn to enable us to build the right protections to empower teens with the tools and resources they need to have a safe online experience.

We are a proud sponsor of the National Parent Teacher Association and ConnectSafely, and we partner with The Digital Wellness Lab to integrate their research on teen health and social media. Additionally, we are members of the Family Online Safety Institute, contributing to and learning from its important work.

Discord partners with INHOPE, the global network combating online CSAM, and has become a member of the Internet Watch Foundation, whose resources we will use to expand our ability to identify and prevent child sexual abuse imagery. We are a sponsor of events dedicated to increasing awareness of and action on child safety issues, such as the annual Dallas Crimes Against Children Conference.

You can read our Child Safety policies, developed with the latest research, best practices, and expertise in mind, here.

Discord disabled 116,219 accounts and removed 29,128 servers for Child Safety during the fourth quarter of 2023. We removed servers for Child Safety concerns proactively 96% of the time, and CSAM servers 97% of the time. We reported 55,955 accounts to NCMEC through our use of PhotoDNA and hashing systems such as our visual safety platform, PDQ, and CLIP, which you can read more about here.
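
To illustrate the kind of hash matching these systems perform, here is a minimal sketch in Python. It is a generic illustration under stated assumptions, not Discord’s actual pipeline: it assumes uploads have already been reduced to 256-bit perceptual hashes (the size PDQ produces) and are compared against a vetted hash list of known violative media; the hash value and distance threshold below are placeholders.

    # Generic illustration of perceptual-hash matching; not Discord's actual system.
    # Assumes uploads are already reduced to 256-bit perceptual hashes (the size
    # PDQ produces) and that a vetted hash list of known violative media exists.

    # Hypothetical vetted hash list (placeholder value, for illustration only).
    KNOWN_HASHES = {
        0x1F3A9C0D5E6B7A8812345678ABCDEF00FEDCBA98765432100011223344556677,
    }

    MATCH_THRESHOLD = 31  # illustrative maximum Hamming distance counted as a match


    def hamming_distance(a: int, b: int) -> int:
        """Count the bits on which two 256-bit hashes differ."""
        return bin(a ^ b).count("1")


    def is_known_match(upload_hash: int) -> bool:
        """True if the upload's hash is close to any hash on the vetted list."""
        return any(hamming_distance(upload_hash, h) <= MATCH_THRESHOLD
                   for h in KNOWN_HASHES)


    # A hash differing from a listed hash by a single bit still matches.
    example = next(iter(KNOWN_HASHES)) ^ 1
    print(is_known_match(example))  # True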

We have a longstanding team that focuses solely on child safety, as well as a dedicated engineering team supporting our safety efforts. Discord is committed to continually exploring new and improved safeguards that help keep younger users safe on our platform and online.

Deceptive Practices

Using Discord for the purpose of distributing malware, sharing or selling game hacks or cheats, stealing authentication tokens, or participating in identity, investment, or financial scams is a violation of our Community Guidelines. You can read more about our Deceptive Practices policy here.

During the fourth quarter of 2023, 6,470 accounts and 919 servers were removed for Deceptive Practices. While actions taken decreased overall, accounts disabled for malicious links and malware increased.

Exploitative and Unsolicited Content

It is a violation of our Community Guidelines to share or promote sexually explicit content of other people without their consent. You can read more about our Exploitative and Unsolicited Content policies here and here.

During the fourth quarter of 2023, 52,612 accounts and 1,967 servers were removed for Exploitative and Unsolicited Content. This was an increase of 264% in accounts disabled and a decrease of 9% in servers removed. We removed servers proactively 81% of the time for this issue, an increase from 66% in the prior quarter. While servers removed decreased by a moderate amount, accounts disabled increased substantially as the result of increased action against users seeking violative content.
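
For readers following the numbers, the two metrics that recur throughout this report, quarter-over-quarter percent change and the proactive removal rate, can be computed as in the short sketch below. The counts in the example are hypothetical placeholders, not figures from the report.

    # Sketch of the two metrics used throughout this report, with hypothetical
    # placeholder counts (not figures from the report).

    def percent_change(current: int, previous: int) -> float:
        """Quarter-over-quarter change, expressed as a percentage."""
        return (current - previous) / previous * 100


    def proactive_rate(proactive_removals: int, total_removals: int) -> float:
        """Share of removals made before any user report, as a percentage."""
        return proactive_removals / total_removals * 100


    # Hypothetical example: 15,000 accounts this quarter vs. 10,000 last quarter
    # is a 50% increase; 800 of 1,000 servers removed proactively is an 80% rate.
    print(percent_change(15_000, 10_000))  # 50.0
    print(proactive_rate(800, 1_000))      # 80.0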

Harassment and Bullying

Harassment and bullying have no place on Discord. Continuous, repetitive, or severe negative comments, circumventing bans, suggestive or overt threats, the sharing of someone’s personally identifiable information (also known as doxxing), and server raiding are violations of our Community Guidelines. You can read more about our Harassment and Bullying policy here, and our Doxxing policy here.

During the fourth quarter of 2023, 10,320 accounts and 542 servers were removed for Harassment and Bullying.

Hateful Conduct

Hate or harm targeted at individuals or communities is not tolerated on Discord. Discord does not allow the organization, promotion, or participation in hate speech or hateful conduct. We define “hate speech” as any form of expression that denigrates, vilifies, or dehumanizes; promotes intense, irrational feelings of enmity or hatred; or incites harm against people on the basis of protected characteristics. You can read more about our Hateful Conduct policy here.

During the fourth quarter of 2023, 3,313 accounts and 674 servers were removed for Hateful Conduct. The rate at which we removed servers proactively for Hateful Conduct was 96%.

Identity and Authenticity

Using Discord for the purpose of coordinating and participating in malicious impersonation of individuals or organizations is a violation of our Community Guidelines. You can read more about our Identity and Authenticity policy here.

During the fourth quarter of 2023, 156 accounts and 135 servers were removed for Identity and Authenticity violations.

Misinformation

It is a violation of our Community Guidelines to share false or misleading information that may result in damage to physical infrastructure, injury to others, obstruction of participation in civic process, or the endangerment of public health. You can read more about our Misinformation policy here.

During the fourth quarter of 2023, 19 accounts and 4 servers were removed for Misinformation.

Platform Manipulation

Spam, fake accounts, and self-bots are examples of platform manipulation that damage the experience of our users and violate our Community Guidelines. You can read more about our Platform Manipulation policy here.

We are focused on combating spam and minimizing users’ exposure to spammers and spam content on Discord. We have a dedicated cross-functional anti-spam team building sophisticated anti-spam measures. You can learn more here about how Discord combats spam, and here about AutoMod, a safety feature that enables server owners to automatically moderate certain abuse, including spam.
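
As a concrete illustration of server-level automatic moderation, the sketch below creates an AutoMod rule that blocks messages flagged as spam, using Discord’s public REST API. The endpoint path, field names, and enum values follow Discord’s developer documentation as we understand it and should be verified against the current docs; BOT_TOKEN and GUILD_ID are placeholders, and the bot needs the Manage Server permission in the target server.

    # Minimal sketch: create an AutoMod rule that blocks suspected spam via
    # Discord's REST API. Endpoint, fields, and enum values follow the public
    # developer documentation; verify against the current docs before relying
    # on them. BOT_TOKEN and GUILD_ID are placeholders.
    import requests

    BOT_TOKEN = "YOUR_BOT_TOKEN"  # placeholder; bot needs Manage Server permission
    GUILD_ID = "YOUR_GUILD_ID"    # placeholder server ID

    response = requests.post(
        f"https://discord.com/api/v10/guilds/{GUILD_ID}/auto-moderation/rules",
        headers={"Authorization": f"Bot {BOT_TOKEN}"},
        json={
            "name": "Block suspected spam",
            "event_type": 1,            # MESSAGE_SEND
            "trigger_type": 3,          # SPAM
            "actions": [{"type": 1}],   # BLOCK_MESSAGE
            "enabled": True,
        },
        timeout=10,
    )
    response.raise_for_status()
    print(response.json()["id"])  # ID of the newly created rule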

During the fourth quarter of 2023, 2,878 accounts and 724 servers were removed for non-spam-related platform manipulation issues. An additional 32,973,473 accounts were disabled for spam or spam-related offenses. This represents an increase of 157% in the number of accounts disabled for spam when compared to the previous quarter. This increase was the result of improvements to our proactive, scaled detection of fake spam accounts. 99% of accounts disabled for spam were disabled proactively, before we received any user report.

Regulated or Illegal Activities

Using Discord to organize, promote, or engage in any illegal behavior is a violation of our Community Guidelines. You can read more about our Regulated or Illegal Activities policies here and here.

During the fourth quarter of 2023, 32,633 accounts and 13,135 servers were removed for Regulated or Illegal Activities. This was a 31% decrease in the number of accounts disabled, and a 42% decrease in the number of servers removed. Our rate of proactively removing servers increased from 74% to 95%, which we attribute to more finely tuned approaches to detecting and removing this behavior.

Self-Harm Concerns

For those experiencing mental health challenges, finding a community that is navigating similar challenges can be incredibly helpful for support. That said, platforms have a critical role to play in ensuring that these spaces do not normalize, promote, or encourage others to engage in acts of self-harm.

We may take action on content that seeks to normalize self-harming behaviors, as well as content that encourages self-harm or discourages individuals from seeking help for it. These actions are only taken on accounts glorifying or promoting acts of self-harm, not on users seeking help or in need of medical attention. You may read more about this policy here.

We are proud to partner with Crisis Text Line, a nonprofit that provides 24/7 text-based mental health support and crisis intervention via trained volunteer crisis counselors. If a user reports a message for self-harm on Discord, they will be presented with information on how to connect with a volunteer Crisis Counselor. You can learn more here.

Crisis Text Line is currently available to those in the United States and is offered in both English and Spanish. You can read more about this partnership here. Since the launch of our partnership, there have been over 2,000 conversations started using Discord’s keyword.

During the fourth quarter of 2023, 1,070 accounts and 727 servers were removed for Self-Harm Concerns.

Violent and Graphic Content

Real media depicting gore, excessive violence, the glorification of violence, or animal cruelty is not allowed on Discord. You can read about our Violent and Graphic Content policy here.

During the fourth quarter of 2023, 23,264 accounts and 2,232 servers were removed for Violent and Graphic Content. We proactively removed servers for these issues 96% of the time, an increase from 92% in the previous quarter.

Violent Extremism

We consider violent extremism to be the support, encouragement, promotion, or organization of violent acts or ideologies that advocate for the destruction of society, often by blaming certain individuals or groups and calling for violence against them. You can read more about our Violent Extremism policy here.

By partnering and engaging in cross-industry work with the Global Internet Forum to Counter Terrorism (GIFCT), the European Union Internet Forum, the Christchurch Call, and other organizations, we’ve made progress in our tooling, policy, and subject matter expertise to ensure violent extremism does not have a home on Discord.

During the fourth quarter of 2023, 6,109 accounts and 627 servers were removed for Violent Extremism. We proactively removed servers for Violent Extremism 98% of the time. Our sustained ability to remove violent extremist content proactively can be attributed to our continued cross-industry work to bolster our tooling, policy, and awareness of emerging trends.