Submissions:2023/Credibility Bot Feedback and Brainstorming
This submission has been accepted for WikiConference North America 2023.
Title:
- Credibility Bot Feedback and Brainstorming
Theme:
- Credibility / Mis and Disinformation (WikiCred)
Type of session:
- Workshop
Abstract:
Help design Credibility Bot, a citation management and review system currently under development and expansion!
Wikipedia has guidelines intended to ensure articles are supported by reliable sources. How are these guidelines implemented in practice? Wikipedia editors currently try to review articles for quality issues such as inaccurate information or bias; however, since Wikipedia is a large online volunteer community, practices for discussing and maintaining quality are uneven, impromptu, and difficult to scale. Wikipedia editors struggle to get the signals they need to collectively discuss or prioritize their work improving articles. Even when volunteer editors do have the capability to assess sources in one topic area, they often lack the tools to do so across other topics, let alone in other language editions of Wikipedia.
Credibility Bot aims to build an easily installed credibility toolkit for Wikipedia editors. Working closely with committed WikiProjects, it will bring information about sources together with actionable reports and alerts. Credibility Bot can enable the efficient, ongoing evaluation of quality beyond any single WikiProject or subject domain. Credibility Bot data can be archived as a citable, timestamped snapshot and made globally available for others to use in openly accessible platforms like Wikidata. In this way, the credibility toolkit can have a scalable impact on the problem of source quality across the entire Wikipedia ecosystem.
This is an active user input, feedback, research, and design session where we'll ask questions and hear your thoughts on what Credibility Bot should do and how to build it at scale!
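To make the "actionable reports" idea concrete, here is a minimal sketch of the kind of per-article source tally such a toolkit might produce. This is an illustrative assumption, not Credibility Bot's actual implementation: the SOURCE_RATINGS table, the regex, and all names are hypothetical, loosely inspired by the English Wikipedia's perennial sources list.

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Hypothetical reliability ratings; a real toolkit would pull these
# from community-maintained data rather than a hard-coded dict.
SOURCE_RATINGS = {
    "nytimes.com": "generally reliable",
    "nature.com": "generally reliable",
    "dailymail.co.uk": "deprecated",
}

# Capture the first URL inside each <ref>...</ref> pair.
REF_URL = re.compile(r"<ref[^>/]*>.*?(https?://[^\s|<\]}]+).*?</ref>", re.DOTALL)

def source_report(wikitext: str) -> Counter:
    """Tally the domains cited in an article's wikitext."""
    domains = Counter()
    for url in REF_URL.findall(wikitext):
        host = urlparse(url).netloc.lower().removeprefix("www.")
        domains[host] += 1
    return domains

if __name__ == "__main__":
    sample = (
        "Fact one.<ref>{{cite news |url=https://www.nytimes.com/a}}</ref> "
        "Fact two.<ref>{{cite web |url=https://www.dailymail.co.uk/b}}</ref>"
    )
    for domain, count in source_report(sample).most_common():
        rating = SOURCE_RATINGS.get(domain, "unrated")
        print(f"{domain}: {count} citation(s), {rating}")
```

Run against the sample wikitext, this prints one line per cited domain with its citation count and rating; aggregating such tallies across a WikiProject's articles is one way the "reports and alerts" described above could be assembled.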
Author name:
E-mail address:
- jake@hackshackers.com
Wikimedia username:
- Jake Orlowitz, User:Ocaasi, WikiBlueprint (consulting)
Affiliated organization(s):
- Hacks/Hackers, NewsQ, WikiCred
Estimated time:
- 1 hour
Special requests:
Have you presented on this topic previously? If yes, where/when?:
- No
Okay to livestream?
- Do not livestream
If your submission is not accepted, would you be open to presenting your topic in another part of the program? (e.g. lightning talk or unconference session)
- Yes