Content Moderator

March 3, 2026
Applications close: June 1, 2026

Job Description

REQUIREMENTS

  • Minimum of 4 years of experience in Content Moderation, Trust & Safety Operations, or Community Management for a major tech/social media platform.
  • Ability to spot subtle violations that automated systems miss (e.g., hate symbols hidden in images).
  • Comfortable using moderation tools (e.g., Hive, Besedo, Salesforce) and Google Workspace.
  • Experience handling spikes in content volume during global events or viral challenges.
  • High level of emotional fortitude. Must be comfortable reviewing disturbing content (violence, hate speech, adult content) and have proven strategies for digital wellness.
  • Deep understanding of regional nuances, cultural sensitivities, and historical contexts. Ability to distinguish between hate speech and protected political speech, or between violent extremism and documentary/news content.
  • Proven track record of maintaining quality metrics while processing a high volume of content (e.g., 80-100+ pieces per hour).
  • Ability to stay focused during repetitive tasks without losing attention to detail.

RESPONSIBILITIES

  • Review and act on reported content, including text, images, and videos, ensuring it meets platform guidelines. Focus will be on high-priority queues and edge cases that require human judgment.
  • Monitor daily queues to identify new patterns of abuse (e.g., new spam techniques, coordinated hate campaigns) and escalate them to the Policy team immediately.
  • Provide feedback on moderation tool efficiency. Suggest workflow changes that increase review speed without sacrificing accuracy.
  • Maintain a high accuracy rate (95%+) on all moderation decisions. Participate in calibration sessions with the team to ensure consistency in applying policies.
  • Provide constructive feedback to Policy teams when guidelines are unclear or conflict with real-world context, helping to refine the rulebook for thousands of moderators.
  • Investigate cases where content was removed or accounts were suspended, making final determinations on reinstatement requests with a focus on fairness and due process.
  • Serve as a designated responder during “red alert” situations, such as graphic live-streamed events or coordinated harassment campaigns.

Are you interested in this position?


Apply by clicking on the “Apply Now” button below!
