
June 3, 2022



FACILITATING POSITIVE ENVIRONMENTS



INTRODUCTION

The foundation of a server on Discord is the community that populates it.
Your community is what you engage with, protect, and grow over time. Engagement
is important to focus on, but it's just as important to make sure you are
facilitating positive and welcoming engagement.




WHY POSITIVE ENVIRONMENTS ARE IMPORTANT

Positive engagement can mean a lot of things, but in this article we will be
referring to the way in which moderation can affect the culture of the server
you are moderating. As moderators, your policies, your knowledge of your
community, and your deductive skills influence the way your community engages
with each other and with your team.

When you establish and nurture your community, you are growing a collective
group of people who all enjoy at least some of the same things. Regardless of
your server topic, you will undoubtedly have members of a variety of
ethnicities, sexual orientations, and identities from across the world.
Ensuring that your space on Discord is a space where they belong means making
it safe for them to feel like they can be themselves, wholly and without
reservation. Your members are all humans, all community members, all people who
deserve respect and deserve to be welcomed.


ESTABLISHING COMMUNITY BOUNDARIES IN MODERATION

When you are establishing your community, it's important to have a basic
understanding of what kind of environment you would like your server to be.
It's good to break down your general moderation philosophy: what content and
discussion you'd like your community to engage in, and what content would be
inappropriate for the space. Depending on the topic of your server, these goals
may be different, but some common questions you can ask to establish general
boundaries are:

 * What is the main topic of my server? When you're thinking about the community
   and their impact on the growth of your server, it's important to decide what
   kind of server you want to build on a basic conceptual level. If, for example,
   you are creating a politically driven server, you might have different limits
   and expectations for content and conversation than a server based on Tetris
   or pets.
 * What topics do I expect users to engage in? Some servers will have the
   expectation that members are allowed to discuss more sensitive, controversial,
   and thought-provoking topics, while others may feel that these kinds of heavy
   debates are out of place. Video game servers tend to have a no-politics rule
   to avoid negative debates and personal attacks that are beyond the scope of
   the video game(s) in question. Servers centered around memes, IRL interests,
   or social communities can allow a wider range of topics and have looser
   rules, while servers centered around mental health or marginalized
   communities may lean towards a stricter, on-topic-only policy.
 * What would I like to foster in my community? While knowing what to avoid and
   moderate is very useful, having an idea of what kind of atmosphere you'd like
   the server to have goes far in setting the mood for the rest of the community
   at large. If users notice moderators engaging in good-faith, positive
   conversations and condemning toxic or hateful discussion, they are more
   likely to join in and participate in that positive conversation. If they see
   that you and your mod team have taken the initiative to preserve the good
   atmosphere of the community, they are more likely to put in the effort to
   reciprocate.



MODERATING HATEFUL CONTENT

When it comes to the content you allow or moderate in your server, it’s
important to, again, reflect on what type of community you are. It’s also
important that you act quickly and precisely on this type of harmful behavior.
Some users will slowly push boundaries on what type of language they can ‘get
away with’ before being moderated.

When discussing moderation, a theory that often comes up is the broken windows
theory. It holds that visible signs of antisocial behavior, civil unrest,
disorder, and crime in an area encourage further antisocial behavior and crime.
Similarly, if you create an environment in which toxic and hateful behavior is
common, the cycle will perpetuate itself into further toxicity and hatefulness.




WHAT IS BAD-FAITH CONTENT VS. GOOD-FAITH CONTENT?

‘Bad-faith’ content describes behavior intended to cause mischief, drama, or
toxicity in a community. The users behind it are commonly referred to as bad
actors, and they are the type of people who should be dealt with swiftly and
directly.

‘Good-faith’ content describes user behavior with good intentions. When
good-faith users form a positive foundation in your community, the members who
join and interact with the established community will adapt and speak in a way
that continues the positive environment that has been fostered. It's important
to note that while ‘good-faith’ users are generally positive people, it is
still possible for them to say things that are wrong or even harmful. The
importance of this distinction is that these users can learn from their
mistakes and adapt to the behavior you expect of them.

When users deliberately test these boundaries, they are not acting in good
faith. As moderators, you should be involved enough to recognize bad-faith
content and remove it. On the other hand, education is important for the
long-term growth of a community. While you can focus on removing the bad
behavior of bad-faith users, reforming good-faith community members who don't
realize their rhetoric is harmful should also be a primary goal when crafting
your community. When interacting in your community, if you see harmful rhetoric
or a harmful stereotype, step back and meaningfully think about the
implications of leaving content that uses this kind of language up in your
channels. Does it:

 * Enforce a negative stereotype?
 * Cause discomfort to users and the community at large?
 * Make it harder for users to feel included in the community?




IDEAS TO HELP PRIORITIZE INCLUSIVITY

 * Allow users to display pronouns on their profile. Depending on your server,
   you may choose to offer pronoun roles that members can pick directly to
   display on their profile. This lets users express their pronouns in a way
   that doesn't single them out: when pronoun roles are part of a larger, more
   welcoming system, it is much harder to tell whether someone picked them
   because they are LGBTQ+, because they're an ally, or simply because it was
   part of setting up their roles. Building pronoun systems into a server can
   also encourage community-wide acceptance of pronouns and respect for other
   users' identities, and can deter transphobic rhetoric. (A simple role-picker
   sketch follows this list.)
 * Discourage the use of harmful terms. It's no secret that terms such as
   ‘retard’ and ‘trap’ are commonly used in certain social circles. As
   moderators, you can discourage the use of these words in your community's
   lexicon.
 * Create strong bot filters. Automated moderation of slurs and other forms of
   hate speech is probably your strongest tool for minimizing the damage bad
   actors can create in your server. Also add the variations people commonly use
   to slip past the filter (for example, a slur spelled with an added or removed
   letter). (See the filter sketch after this list.)
 * A good document to follow for bot filters and auto moderation as a whole can
   be found here!
 * Educate your community. Building a community without toxicity takes a lot of
   time and energy. The core of all moderation efforts should be educating your
   communities, rewarding good behavior, and making others aware of the content
   they are perpetuating.

A core part of any de-escalation is your approach. Users who are heated from a
frustrating or toxic discussion are easy to set off or to accidentally push
into more toxicity. The key is to respond calmly, and to make sure that however
you approach someone to de-escalate, you do it in a way that is clearly for the
benefit of everyone involved.





CLOSING

Creating a healthy community that leaves a lasting, positive impact in its
members is difficult. Moderators have to be aware, educated, and always on the
lookout for things they can improve on. By taking the initiative on this front,
your community can grow into a positive, welcoming place for all people,
regardless of their race, gender, gender identity, or sexual orientation.







Tags:
Moderation
Server Safety