2019/Grants/Disinformation and Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience



Title:

Disinformation and Its Discontents: Narrative Recommendations on Wikipedia's Vulnerabilities and Resilience

Name:

Jake Orlowitz

Wikimedia username:

Ocaasi

E-mail address:

jorlowitz@gmail.com

Resume:

Geographical impact:

Global

Type of project:

Research + Output

What is your idea?

This will be an investigative exploration: a narrative project with the goal of producing practical recommendations for implementation and expansion. The focus is on five core questions:

  • What policies, practices, and tools make some Wikipedias more robust against disinformation?
  • How do we vet the neutrality and reliability of sources on articles about polarizing or controversial topics across multiple languages?
  • How are smaller or less active Wikipedias and Wiki communities more vulnerable to disinformation?
  • What trainings and tools would improve our resilience to disinformation attempts?
  • How do we share our expertise about disinformation throughout the movement and beyond?

First, we must understand what disinformation means and how it operates. Then we must investigate the community members whose regular practice involves fighting disinformation (a preliminary list of interviewees can be provided to the grants committee to protect confidentiality). We must look to Foundation staff and outward to external experts who are actively planning ways to combat disinformation. Finally, we must compile, analyze, and recommend actionable interventions.

Targeted solutions range from appropriate policies to citation tagging to community noticeboards to project trainings. One of the fundamental outcomes of this project is to identify different approaches to fighting disinformation, speak with those pioneering the approaches, contextualize where each approach would have the most value, and work to create a vision for how the approaches can work together to address disinformation.

Which of these solutions are most effective? How are they currently employed, on which projects, and by whom? A major goal of this project is to explore each area of intervention to determine its usefulness and its relation to other approaches. The best way to do this is direct conversation with the people leading these efforts.

Why is it important?

Disinformation is an urgent topic of study and concern, sadly because it is eroding our civil society and our trust in media through viral activity on massive social networks. Misinformation is merely wrong information, and may or may not be intentional. Disinformation, however, is both intentionally wrong and harmful.

On Wikipedia, disinformation can come from good-faith editors who (mis)use sources that are actually (dis)information, single-purpose accounts that seek to skew and bias articles, paid editors with a financial stake in promoting their clients' interests, coordinated groups that look to "brigade" topics where they have an agenda, and state actors, where governments seek to undermine political dissent or insert propaganda.

When it comes to this massive existential problem, we don't know its scope and we haven't identified the range of solutions. We haven't spoken with enough experts, or synthesized their wisdom and guidance. We have yet to distill community knowledge and connect people working on the same or similar issues from different perspectives.

And for highly sophisticated disinformation campaigns by paid hackers or intelligence agencies funded by foreign governments, we are not yet in a position to say whether there have been serious attempts, or if so, at what scale.

Our vulnerabilities may be overstated; or we could be dangerously blind to what is happening already and what is coming. The nature of Wikipedia's anonymous or pseudonymous contributors could allow a well-funded, targeted effort to develop trusted users who could become influential, or to compromise existing trusted users (e.g. admins).

We need to do deeper research, because we don't know what we don't know, and covert sophisticated campaigns would be, well, covert and sophisticated.

Is your project already in progress?

I have applied for complementary but not sufficient funding through Wikimedia's Project Grants. I have not received any notification about whether that project has been accepted.

https://meta.wikimedia.org/wiki/Grants:Project/Ocaasi/Misinformation_And_Its_Discontents:_Narrative_Recommendations_on_Wikipedia%27s_Vulnerabilities_and_Resilience

The project, though proposed, has not yet begun.

How is it relevant to credibility and Wikipedia? (max 500 words)

We are at the frontier of a disturbing trend in our digital information age, and we are working blind. We don't know the nature of the problem, we don't have a roadmap for the fixes, and we don't have a story that ties them together.

Wikipedia is widely considered a generally reliable source, and it maintains this reputation despite a long history of edit wars at controversial articles. In many areas the encyclopedia has demonstrated remarkable resistance to attempts to introduce falsehoods and bias. In other cases, however, even simple hoaxes expose the gaps in our defenses and their broader negative impact.

Considering the wholesale havoc that misinformation and disinformation have wrought on sites such as YouTube, Facebook, Twitter, and WhatsApp, Wikipedia may even be in a relatively enviable position. At the same time, an encyclopedia holds itself to a higher standard of reliability than other social web properties, and the perception of trustworthiness is arguably more important for a project that prides itself on good information.

As information spreads with increasing speed through social networks, as news outlets struggle to combat lies and propaganda, and as political and government attempts to bias the information landscape become more widespread and complex, how do we as a neutral encyclopedia address the growing threat of disinformation on our own projects?

There are troubling recent examples: a made-up Nazi death camp, inserted pro-gun bias, whitewashed celebrity scandals, pro-Iranian networks, coronavirus disinformation, disappearing concentration camps, fake chemical plant explosions, misattributed blame for missile attacks on planes, state propaganda, hoaxes about famous accessories, false rap album release dates, cryptocurrency lies, government censorship, and historical revisionism.

Whatever defenses we have amassed, we are still under constant attack.

What is the ultimate impact of this project?

I will do research to provide a fuller picture of the problem of disinformation on Wikimedia projects: where it is most severe, and which current factors are making it less so. I will tie together multiple threads of media, research, and community knowledge to create an enlightened narrative to guide discussions and future interventions. As a result, our community will be better informed and better prepared to address a growing threat to our reputation and integrity.

I will investigate and distill broader expertise through targeted interviews with community members, staff, scholars, and organizations about disinformation. By deeply researching what the most experienced people on the subject know, I will capture the wisdom, innovations, and leadership residing inside and outside of our projects. As a result, we will highlight our community knowledge and transform that wisdom into actionable insights.

I will provide clear recommendations for addressing disinformation most effectively, with proposed pathways toward implementation. I will compare and connect different approaches, analyzing which are appropriate in different circumstances and how they can work together. As a result, community and staff should be better able to envision, plan, and execute the next steps for their interventions.

I will share knowledge gained from the project through blog posts and conference presentations. I will use social media to spread learnings and further invite commentary. As a result, more people will know about the findings and know they have contacts who can connect them to resources and other experts.

Could it scale?

The interviews in this project will draw from many Wikimedia projects, not just English or 'Global North' communities. There will be a significant focus on smaller and 'Global South' projects and community leaders.

The recommendations will be generally applicable to all communities as a menu of potential tactics, tools, and interventions that they could apply--and why they would apply them in varying contexts.

These potential fixes also need people to advance them to the next stages of viability and usefulness. That help could come from any corner of the global Wikimedia volunteer ecosystem.

The findings of this project will be widely disseminated, and relevant to every Wikipedia project, because none are free from the threats and risks of disinformation.

Why are you the people to do it?

I (User:Ocaasi) founded The Wikipedia Library and ran it from 2011 to 2019. By the time I left the program at the Wikimedia Foundation, TWL had a half-million-dollar budget and a six-person team on four continents. Through The Wikipedia Library, I developed partnerships with 70 leading scholarly publishers to provide free access to 100,000 scholarly journals and reference texts; 25,000 editors now have access to those sources through the Wikipedia Library Card Platform.

I created the viral #1Lib1Ref and #1Bib1Ref citation campaigns, which now add 10,000-20,000 new references to Wikipedia each year from librarians around the world. I started the Wikipedia Visiting Scholars program, the Books & Bytes newsletter, the Wikipedia + Libraries Facebook group, the Wikimedia and Libraries User Group, and the @WikiLibrary Twitter account.

I negotiated the collaboration with Turnitin to fix copyright violations on Wikipedia, started the collaboration with the Internet Archive to rescue 10 million dead citation links, integrated OCLC ISBN citation data into Wikipedia's reference autogeneration interface, and began a project to add Citoid to Wikidata. I developed the OAbot web app and am a founding member of the Open Scholarship Initiative. I co-released a dataset of Wikipedia's most cited sources and of the proportion of free-to-read sources on Wikipedia.

I created The Wikipedia Adventure interactive guided tutorial and facilitated the first-ever for-credit Wikipedia editing course at Stanford Medical School. I am an English Wikipedia administrator, a two-time Wikimedia Foundation grantee, a former Individual Engagement Grants Committee member, a founding board member of Wiki Project Med Foundation, a former Organizing Committee member for WikiCite, a Linked Data 4 Libraries Program Committee member, and the founder of the Wikimedia Foundation's Knowledge Integrity Program.

I have presented about Wikipedia, citations, and reliability at five Wikimanias, Stanford University, Internet Librarian, the American Library Association, OCLC, and IFLA. I am a primary author of "The Plain and Simple Conflict of Interest Guide", "Conflict of Interest editing on Wikipedia", "Librarypedia: The Future of Libraries and Wikipedia", the New Media Consortium Horizon Report for Libraries, "The Wikipedia Adventure: Field Evaluation", "Writing an open access encyclopedia in a closed access world", "The Wikipedia Library: The world's largest encyclopedia needs a digital library, and we are building it", "You're a researcher without a library: what do you do?", the Wikipedia "Research Help" portal, "Why Medical Schools Should Embrace Wikipedia", and the forthcoming Wikipedia @20 chapter "How Wikipedia Drove Professors Crazy, Made Me Sane, and Almost Saved the Internet."

I have been interviewed by Publishers Weekly in "Discovery Happens Here" and by the Tow Journalism School for "Public Record Under Threat", and I was featured in the documentary "Paywall: The Business of Scholarship".

What is the impact of your idea on diversity and inclusiveness of the Wikimedia movement?

Larger Wikipedias are likely more robust and resilient against misinformation than most, simply due to the size of their active contributor bases. Yet even our larger wikis are not immune to nefarious attempts to bias them, and the vulnerability of smaller wikis, with many fewer editors, patrollers, and tools, is presumably far greater.

Both large and small communities are vulnerable, and disinformation threatens the integrity and viability of every Wikipedia.

All Wikipedia communities need to be trustworthy and to have robust defenses against attempts to undermine their volunteers' diligent and dedicated labor.

Disinformation wastes volunteer time and ruins the reputation of projects that need to attract volunteers. The less people trust Wikipedia, the less willing they will be to give their effort to the project.

Some disinformation also perpetuates myths and systemic biases against marginalized individuals, nations, or topics.

Finally, many people who fight for radical transparency are from marginalized communities. They may be personal targets of disinformation efforts.

What are the challenges associated with this project and how will you overcome them?

Disinformation is a contemporary, popular, and sprawling topic. The main challenge of this grant is to remain focused on the area where disinformation intersects with Wikipedia, so that its findings are relevant, specific, and impactful.

A related challenge is to ensure this project is weighted toward actionable recommendations (action research) rather than a general overview of the subject (long-form journalism). Action research is an established methodology that combines the ethnographic work of understanding community practices with a direct intention to improve that community's effectiveness.

A last challenge, far less likely, is that by revealing our vulnerabilities and defenses, we could make it easier for bad actors to breach them. Still, we absolutely need to know where we are at risk so that we can address those risks. It is essential to open-access and open-source practice and philosophy that vulnerabilities be exposed so they can be fixed, not hidden in naive and ineffectual obscurity in the hope that no one discovers them.

How much money are you requesting?

$10,000

How will you spend the money?

  • Disinformation landscape review: $1,000
  • Interviews with Wikipedians: $3,000
  • Interviews with Wikimedia Foundation staff: $2,000
  • Interviews with external media experts: $1,000
  • Qualitative synthesis of interviews: $1,000
  • Review of potential interventions: $1,000
  • Recommendations for tools and training: $1,000

TOTAL: $10,000

How long will your project take?

6-12 months

Have you worked on projects for previous grants before?

I was a recipient of two Wikimedia Foundation Individual Engagement Grants:

1) The Wikipedia Adventure

2) The Wikipedia Library

I have also helped distribute grants:

  • I was a two-time committee member for the Wikimedia Foundation's Individual Engagement Grants
  • I was a project advisor for the Medical Translation Taskforce Wikimedia grant
  • I was a board member for Cochrane's $2.5 million Game Changer medical journals innovation project