Submissions:2018/Aligning technical solutions and policy to combat harassment on Wikimedia projects
- Aligning technical solutions and policy to better combat harassment on Wikimedia projects
- Theme (optional)
- Harassment, Civility, & Safety; Tech & Tools
- Type of submission
- Workshop
- Author
- Sydney Poore
- E-mail address
- Wikimedia username
- SPoore (WMF)
- Notes documents
- Affiliation(s) (optional)
- Wikimedia Foundation
Online harassment is a problem on virtually every website where users interact. In 2017, the Pew Research Center concluded that 41% of all internet users have experienced online harassment. In 2015, the Wikimedia Foundation conducted a Harassment Survey with 3,845 Wikimedia user participants to gain a deeper understanding of harassment occurring on Wikimedia projects: 38% of the respondents confidently recognized that they had been harassed, while 51% of respondents had witnessed others being harassed. Wikimedia's 2017 Community Engagement Insights Report found that 31% of all 4,500 survey respondents had felt unsafe in a Wikimedia online or offline space at some time during their tenure, 49% of 400 users had avoided Wikimedia because they felt uncomfortable, and 47% of 370 users indicated that they had been bullied or harassed on Wikipedia in the past 12 months. Furthermore, 60% of people who reported a dispute to functionaries said their issue was "not at all resolved," and 54% called their interaction with functionaries "not at all useful."
The Wikimedia Foundation Anti-Harassment Tools team is building technical tools that help contributors and administrators make more timely and better informed decisions when harassment and abuse occur. The Wikimedia Foundation Trust & Safety team is working with Wikimedia communities to ensure their user conduct policies are clear and effective, and that administrators are well prepared to enforce policy.
The English-language Wikipedia community (and most other Wikimedia projects) has conduct policies for its community to follow, including policies on civility, harassment, personal attacks, and dispute resolution, with established processes and workflows to address them. The introduction of new or improved technical features may require Wikimedia communities to review and rewrite existing policies, processes, and workflows.
This workshop will review the tools that the Anti-Harassment Tools team has already built to address harassment and abuse on Wikimedia websites. Additionally, this workshop will kick off a community-wide discussion about how new and forthcoming improvements to the software could alter on-wiki processes and workflows, and therefore generate the need for updated or new policies.
Attendees should leave with a better understanding of the new features and improvements to existing software, such as the Interaction Timeline; per-user page, namespace, and upload blocking; muting Echo notifications and Special:EmailUser; and restricting Special:EmailUser for new user accounts, and of how these changes could alter the workflows and processes in their home communities. Additionally, attendees should be better prepared to participate in discussions on their home wikis about rewriting relevant policy.
- Length of presentation
- 75 minutes
- Special requests
- Preferred room size
- Have you presented on this topic previously? If yes, where/when?
- If you will be incorporating a slidedeck during your presentation, do you agree to upload it to Commons before your session, with a CC-BY-SA 4.0 license, including suitable attribution in the slidedeck for any images used?
- Will you attend WikiConference North America if your submission is not accepted?
If you are interested in attending this session, please sign with your username below; this will help reviewers decide which sessions are of high interest. Sign with four tildes (~~~~).
- Minh Nguyễn 💬 06:14, 23 August 2018 (UTC)
- Katietalk 16:11, 12 September 2018 (UTC)
- Add your username here.