Moderation

How we maintain a safe, supportive community while preserving authentic expression

Our Moderation Philosophy

We believe in the power of unfiltered expression, but we also recognize the need for safety and respect. Our moderation approach balances authenticity with community protection, intervening only when necessary to prevent harm while preserving the raw, honest nature of Bluntly.

How Moderation Works

1. Community Reports

Community members use the report button to flag content that may violate our guidelines.

2. Human Review

Trained human moderators review reported content in context against our community guidelines.

3. Contextual Action

Actions are taken based on content severity, community impact, and user intent.

What We Moderate

Immediate Action Required

Safety Threats:

  • Suicide threats or plans
  • Threats of violence toward others
  • Self-harm instructions
  • Child safety concerns

Illegal Content:

  • Illegal activities or content
  • Child exploitation material
  • Non-consensual intimate images
  • Doxxing and sharing of personal information

Review and Potential Action

Community Disruption:

  • Spam or repetitive posting
  • Trolling or inflammatory content
  • Harassment campaigns
  • Platform manipulation

Policy Violations:

  • Attempts to identify users
  • Commercial advertising
  • Explicit sexual content
  • Hate speech targeting groups

Generally Allowed (With Context)

Authentic Expression:

  • Raw emotions and frustrations
  • Controversial opinions
  • Difficult personal experiences
  • Mental health struggles

Critical Content:

  • Criticism of public figures
  • Social and political commentary
  • Personal conflicts (anonymized)
  • Workplace complaints

Moderation Actions

Range of Responses

Educational Guidance

For minor violations or misunderstandings, we provide guidance and education about our guidelines.

Content Warnings

For sensitive but allowable content, we may add warnings to help users make informed choices.

Content Removal

Content that violates our guidelines is removed, with an explanation to help users understand the decision.

Access Restrictions

For severe or repeated violations, we may temporarily restrict posting abilities or platform access.

Our Moderation Team

Who We Are

  • Trained professionals with mental health awareness
  • A diverse team reflecting our community
  • Regular training on crisis intervention
  • Clear escalation procedures for serious issues
  • A commitment to fair and consistent decisions

How We Protect Anonymity

  • Moderators cannot identify users
  • No access to personal information
  • Focus on content, not individuals
  • Anonymous alias systems for all interactions
  • Strict confidentiality protocols

Response Times & Priorities

Our Commitment to Timely Response

Crisis Content

Target: 15 minutes

Suicide threats, violence, child safety

Harmful Content

Target: 2 hours

Harassment, doxxing, other serious violations

General Reports

Target: 24 hours

Spam, policy violations, community concerns

Appeals Process

If You Disagree with a Decision

We understand that moderation decisions can sometimes feel unfair. Here's how to appeal:

Appeal Process:

  1. Use the report system to file an appeal
  2. Provide the specific content or action details
  3. Explain why you believe the decision was incorrect
  4. Wait for review by a different moderator

What to Include:

  • A URL or reference to the content
  • A clear explanation of your perspective
  • Any relevant context we might have missed
  • The specific guideline you believe applies

Community Self-Moderation

How You Can Help

Positive Community Actions:

  • Upvote supportive and helpful content
  • Provide empathetic responses to struggles
  • Share helpful resources when appropriate
  • Model respectful disagreement
  • Welcome new community members

Reporting Responsibilities:

  • Report content that violates guidelines
  • Report crisis situations immediately
  • Provide context when reporting
  • Don't abuse the reporting system
  • Trust the moderation process

Transparency & Accountability

Our Commitments

Regular Reporting:

  • Monthly community updates on moderation
  • Statistics on report types and actions
  • Policy clarifications based on feedback
  • Appeal outcomes and learnings

Continuous Improvement:

  • Regular review of moderation decisions
  • Community feedback integration
  • Moderator training updates
  • Policy refinements based on experience

Working Together

Effective moderation is a partnership between our team and the community. Together, we can maintain a space where people feel safe to share their authentic experiences while protecting everyone from harm.