Submissions:2023/Credibility Bot Feedback and Brainstorming

From WikiConference North America
Bot info, slides, and discussion:

* [[w:User:Credibility_bot/WCNA]]
* [https://upload.wikimedia.org/wikipedia/commons/5/53/CREDBOT_Presentation_at_WCNA_2023.pdf Slides for presentation]
* The speaker requests feedback and discussion at this etherpad: https://etherpad.wikimedia.org/p/CREDBOT
 

Latest revision as of 17:46, 11 November 2023

This submission has been accepted for WikiConference North America 2023.



Title:

Credibility Bot Feedback and Brainstorming

Theme:

Credibility / Mis and Disinformation (WikiCred)

Type of session:

Workshop

Abstract:

Help design Credibility Bot, a citation management and review system currently under development and expansion!

Wikipedia has guidelines intended to ensure articles are supported by reliable sources. How are these guidelines implemented in practice? Wikipedia editors currently try to review articles for quality issues such as inaccurate information or bias; however, since Wikipedia is a large online volunteer community, practices for discussing and maintaining quality are uneven, impromptu, and difficult to scale. Wikipedia editors struggle to get the signals they need to collectively discuss or prioritize their work improving articles. Even when volunteer editors do have the capability to assess sources in one topic area, they often lack the tools to do so across other topics, and ultimately in other languages of Wikipedia.

Credibility Bot aims to build an easily installed credibility toolkit for Wikipedia editors. Working closely with committed WikiProjects, it will bring information about sources together with actionable reports and alerts. Credibility Bot can enable the efficient, ongoing evaluation of quality beyond any single WikiProject or subject domain. Credibility Bot data can be archived as a citable, timestamped snapshot and made globally available for others to use in openly accessible platforms like Wikidata. In this way, the credibility toolkit can have a scalable impact on the problem of source quality across the entire Wikipedia ecosystem.

This is an active user input, feedback, research, and design session where we'll ask questions and hear your thoughts on what Credibility Bot should do and how to build it at scale!

Author name:

Jake Orlowitz, WikiBlueprint (consulting)

E-mail address:

jake@hackshackers.com

Wikimedia username:

User:Ocaasi

Affiliated organization(s):

Hacks/Hackers, NewsQ, WikiCred

Estimated time:

1 hour

Special requests:

Have you presented on this topic previously? If yes, where/when?:

No

Okay to livestream?

Do not livestream

If your submission is not accepted, would you be open to presenting your topic in another part of the program? (e.g. lightning talk or unconference session)

Yes