Nachmoderation
Nachmoderation, or post-moderation, is a content moderation approach in which material that has already been published is subjected to additional evaluation against platform policies, community standards, and applicable law. It complements pre-moderation and real-time screening by providing a retrospective check that can respond to evolving concerns and new information.
Mechanisms include user reports, automated detection signals, and periodic audits by human moderators. Content flagged through these channels is queued for review, where moderators assess it against the relevant policies before deciding on an enforcement action.
Outcomes can range from removal or restriction to updated labeling, warning notices, or reinstatement of previously actioned content.
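A minimal sketch of how such review signals might be mapped to these outcomes, assuming a simple threshold-based triage; the FlaggedItem fields, the decide_action function, and the thresholds themselves are hypothetical illustrations, not any platform's actual policy:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    """Possible post-moderation outcomes described above."""
    REMOVE = auto()
    RESTRICT = auto()
    LABEL = auto()
    WARN = auto()
    REINSTATE = auto()
    NO_ACTION = auto()


@dataclass
class FlaggedItem:
    """Signals attached to an already-published piece of content (hypothetical)."""
    user_reports: int          # number of user reports received
    classifier_score: float    # automated detection confidence, 0.0-1.0
    confirmed_violation: bool  # result of human review, if performed
    previously_removed: bool   # whether the item was removed and is under appeal


def decide_action(item: FlaggedItem) -> Action:
    """Map review signals to one of the outcomes above using illustrative thresholds."""
    if item.previously_removed and not item.confirmed_violation:
        return Action.REINSTATE   # appeal succeeded: restore the content
    if item.confirmed_violation:
        return Action.REMOVE      # human review confirmed a policy violation
    if item.classifier_score >= 0.9:
        return Action.RESTRICT    # strong automated signal: limit reach pending review
    if item.classifier_score >= 0.6 or item.user_reports >= 5:
        return Action.LABEL       # moderate signal: attach a label or warning notice
    if item.user_reports > 0:
        return Action.WARN        # weak signal: notify the author, keep the content up
    return Action.NO_ACTION


# Example: a post with many reports and a high automated score is restricted.
print(decide_action(FlaggedItem(user_reports=12, classifier_score=0.93,
                                confirmed_violation=False, previously_removed=False)))
```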
Nachmoderation is applied in social networks, forums, and video platforms to mitigate harm from misinformation, harassment, and other policy-violating content.
Challenges include potential delays between posting and enforcement, reliance on the quality of detection signals, and the risk of inconsistent or erroneous enforcement decisions.