This submission has been accepted for WikiConference North America 2019.



Title:

Editing seems too hard

Theme:

Reliability of Information
+ Tech & Tools

Type of session:

Presentation

Abstract:

This presentation will give an overview of recent research conducted by the Wikimedia Research team [1]. Specifically, I will discuss several projects in the area of knowledge integrity [2], i.e., what interventions or technologies can help preserve the reliability of content on Wikimedia projects. A few of these recent projects are described below:

Citations are core to the verifiability of Wikipedia content, but little is known about how readers interact with them. To better understand when readers check citations, logging was implemented to capture the different ways in which Wikipedia readers might interact with citations [3]. While most readers do not click on citations, certain topics and references attract more clicks than others.
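As a rough illustration of the kind of analysis this logging enables, the sketch below aggregates citation-click events into a per-topic click rate. The event schema, field names, and topic labels are invented for the example and are not the instrumentation actually deployed in the study.

```python
# Illustrative sketch only: aggregate hypothetical citation-interaction events
# into a per-topic click rate. The fields "page_topic" and "action" are
# invented for this example, not the schema used in the actual study [3].
from collections import Counter

events = [
    {"page_topic": "medicine", "action": "pageLoad"},
    {"page_topic": "medicine", "action": "fnClick"},
    {"page_topic": "sports", "action": "pageLoad"},
    {"page_topic": "sports", "action": "pageLoad"},
    {"page_topic": "sports", "action": "fnClick"},
]

loads, clicks = Counter(), Counter()
for event in events:
    if event["action"] == "pageLoad":
        loads[event["page_topic"]] += 1
    elif event["action"] == "fnClick":  # reader clicked a footnote/citation
        clicks[event["page_topic"]] += 1

for topic, n_loads in loads.items():
    rate = clicks[topic] / n_loads
    print(f"{topic}: {rate:.2f} citation clicks per page load")
```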

Patrolling is essential to maintaining the quality of content on Wikimedia projects and protecting against issues such as vandalism, misinformation, and copyright infringement. We have begun research to better understand the needs, priorities, and workflows of editors who patrol new content on Wikimedia projects [4]. We hope this work will inform the development of technologies and other interventions that assist patrollers.

Sockpuppet accounts (multiple accounts used by the same person) can be used maliciously to push a point of view or to create large numbers of articles that do not meet Wikipedia's inclusion criteria. Sockpuppet accounts are difficult to detect, however, because their users go to great lengths to hide their actions, and there are many benign reasons why editors might edit the same articles or agree in discussions. To help the editor community detect these accounts, we have been exploring how machine learning models might assist with sockpuppet detection on Wikipedia [5].
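To make the idea concrete, the following is a minimal, hypothetical sketch of such a model: a logistic regression over hand-crafted pairwise features comparing two accounts (overlap of edited pages, similarity of editing hours). The features, data, and labels are invented for illustration and do not describe the actual models under development.

```python
# Illustrative sketch only: a toy classifier over hypothetical pairwise
# account features. The feature definitions and training data are invented
# and do not reflect the models explored in the research project [5].
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row compares one pair of accounts:
# [Jaccard overlap of edited pages, similarity of edit-hour histograms]
X = np.array([
    [0.85, 0.95],  # behaves like a sockpuppet pair
    [0.70, 0.90],
    [0.60, 0.80],
    [0.05, 0.30],  # behaves like unrelated editors
    [0.10, 0.20],
    [0.02, 0.50],
])
y = np.array([1, 1, 1, 0, 0, 0])  # 1 = same person, 0 = different people

model = LogisticRegression().fit(X, y)
new_pair = np.array([[0.75, 0.88]])
print("P(same person):", model.predict_proba(new_pair)[0, 1])
```

A model like this would, at most, surface candidate pairs for human review rather than make determinations on its own.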

[1] https://research.wikimedia.org/

[2] https://wikimediafoundation.org/news/2019/02/14/research-directions-towards-the-wikimedia-2030-strategy/

[3] https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Citation_Usage

[4] https://meta.wikimedia.org/wiki/Research:Patrolling_on_Wikipedia

[5] https://meta.wikimedia.org/wiki/Research:Sockpuppet_detection_in_Wikimedia_projects

Academic Peer Review option:

No

Author name:

Isaac Johnson

E-mail address:

isaac@wikimedia.org

Wikimedia username:

Isaac_(WMF)

Affiliated organization(s):

Wikimedia Foundation

Estimated time:

20-30 minutes

Preferred room size:

Preferably a room for at least 30 people

Special requests:

None; this will be a standard slide show

Have you presented on this topic previously? If yes, where/when?:

The Research team periodically reports on our research, but we constantly update what we discuss based on new projects. In this case, we are focusing specifically on our research around content integrity, as this aligns best with WikiConference North America's theme.

If your submission is not accepted, would you be open to presenting your topic in another part of the program? (e.g. lightning talk or unconference session)

Yes, I would consider arranging a meet-up around this topic as well, but in our experience we reach a narrower segment of the community when our sessions are more ad hoc.



