Page values for "Submissions:2023/Introduction to Credibility Bot"
|Introduction to Credibility Bot
|Credibility / Mis and Disinformation (WikiCred)
Wikipedia has guidelines intended to ensure articles are supported by reliable sources. How are these guidelines implemented in practice? Wikipedia editors currently try to review articles for quality issues such as inaccurate information or bias; however, since Wikipedia is a large online volunteer community, practices for discussing and maintaining quality are uneven, impromptu, and difficult to scale. Editors struggle to get the signals they need to collectively discuss or prioritize their work improving articles. Even when volunteer editors can assess sources in one topic area, they often lack the tools to do so across other topics, and ultimately in other language versions of Wikipedia.

Credibility Bot aims to build an easily installed credibility toolkit for Wikipedia editors. Working closely with committed WikiProjects, it will bring together information about sources with actionable reports and alerts. Credibility Bot can enable efficient, ongoing evaluation of quality beyond any single WikiProject or subject domain. Its data can be archived as citable, timestamped snapshots and made globally available through openly accessible platforms like Wikidata. In this way, the credibility toolkit can have a scalable impact on the problem of source quality across the entire Wikipedia ecosystem.
Discussion/workshop following this talk is at Submissions:2023/Credibility_Bot_Feedback_and_Brainstorming
|Jake Orlowitz, WikiBlueprint (consulting)
|Hacks/Hackers, NewsQ, WikiCred