Trust and Safety
Trust and Safety (T&S), formerly known as Community Advocacy (CA) or SuSa, identifies, builds and, as appropriate, staffs processes that keep our users safe. The team designs, develops, and executes a strategy that integrates legal, product, engineering, research, and learning & evaluation to proactively mitigate risk and to manage the overall safety of our online and offline communities when incidents happen. Trust and Safety is part of the Community Resilience & Sustainability wing of the Legal Department. We aim to provide compassionate, credible, and comprehensive Trust and Safety services to the Foundation and the volunteer communities and affiliates it supports, though much of our time is also spent "fire-fighting". For more information on the team, see our Overview.
What we do
The Trust and Safety team supports staff, the public, and volunteers in our community through approximately 11 workflows in three broad areas. The team is composed of two sub-teams: Policy and Operations. You can find more details under Programs and Processes.
Trust and Safety
The Wikimedia Foundation aims to defer to local and global community processes to govern on-wiki interactions. However, at times, we must step in to protect the safety and integrity of our users, our contributors, and the public. We support a healthy environment on our projects through several work areas. Among other measures, we receive and handle reports of major safety issues on Wikimedia projects, including suicide threats, threats of violence, and child pornography. We also own the policies regarding Wikimedia Foundation bans of users from the projects and from Foundation-funded or supported events, and we work with other Foundation teams to address concerns about user privacy and freedom that do not necessarily rise to the level of bans.
As part of the Foundation's commitment to respect community autonomy, the Trust & Safety team does not handle general community or community-member disputes that may be addressed through community processes, nor does it serve as an appeal venue for community-made policies and decisions. While we are happy to assist community members in need, that help will often consist of directing the person to the right community venue to solve their problem.
Regular workflows include:
Direct community support
Other regular workflows include:
We provide guidance, advice, and support to Foundation staff, the Board, and committees. We routinely assist staff with community- and content-related concerns, including processing DMCA takedown and notification requirements and, where necessary, responding to search warrants and legally valid subpoenas. We manage requests for advanced user rights that staff members require to do their work by assessing needs and liaising with the stewards.
Regular workflows include:
Office actions workflow
More information: Office actions
The process leading up to an office action varies considerably based on the action and the circumstances surrounding it. The strongest actions in common use are those taken against users of the websites, typically in the form of global or event bans. These actions are the result of user conduct investigations undertaken by T&S Specialists, which go through a rigorous review cycle as documented in the flowchart to the right.
Other office actions can include deletion of illegal material, typically sensitive images of minors that violate the laws of the United States. T&S also performs deletions to satisfy the Foundation's DMCA Policy, an archive of which is maintained on Foundation wiki.
In September 2020, a case review committee was created to allow directly involved community members to request review, by a community committee, of Trust & Safety behavioral investigation outcomes. The committee is equipped to review certain office actions on appeal from individuals directly involved in the case (as the requesting or sanctioned party). For more information, see Office actions#Appeals. This arrangement will remain in place until a permanent process is created through the Universal Code of Conduct conversations in 2021.