Submissions:2014/Answering Big Questions With Wikidata

Title of the submission

Answering Big Questions With Wikidata

Themes (Proposal Themes - Community, Tech, Outreach, GLAM, Education)

Tech

Type of submission (Presentation Types - Panel, Workshop, Presentation, etc)

Workshop

Author of the submission

Max Klein

E-mail address

isalix@gmail.com

Username

w:User:Maxmilianklein

US state or country of origin

California

Affiliation, if any (organization, company etc.)

None

Personal homepage or blog

http://notconfusing.com

Abstract (at least 300 words to describe your proposal)

Wikidata is live and starting to fulfill its potential as a data repository, but it's also becoming more than the sum of its data. Wikipedia in its early stages found success as an encyclopedia, but then also proved invaluable to researchers as a way to understand online collaboration and how people interact around free-text content. Likewise, what Wikidata can tell us as a corpus, not just through its individual facts, is slowly emerging. From knowing which Wikipedias have the highest percentage of their biography articles about women, to visualizing the planet with geodata, a new world perspective is being uncovered. This workshop will be a hands-on session showing researchers and programmers how to answer their big questions with Wikidata. Its core points will be:

  1. What Wikidata is from a technical viewpoint:
    1. Native format
    2. Structure of a Wikidata Item (see the item sketch after this list)
    3. Data Types
  2. Example Questions already answered:
    1. Which language Wikipedias are the most unique?
    2. Which language's biography articles have the highest proportion of women?
    3. What is the name of every language in every language?
    4. What are the most popular book genres in each language Wikipedia?
    5. Denny's Coastline and Subway Maps
  3. How to use Pywikipedia to get the data live (see the live-query sketch after this list).
    1. Code walkthrough.
      1. Utilizing "What links here" along with the new classes in the pywikibot library.
  4. How to use WDA to work with the data offline (see the dump-parsing sketch after this list).
    1. WDA is a Python script that:
      1. Downloads incremental dumps.
      2. At minimum lets you do text parsing on a huge file.
    2. More elegant solutions have already received funding.
      1. Demo of the Java library.
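
To make the "structure of a Wikidata item" and "data types" points concrete, here is a simplified sketch, written as a Python dict, of how an item such as Douglas Adams (Q42) appears in the Wikidata JSON export. It is only an illustration: it omits aliases, qualifiers, references, and ranks, and the exact layout of the export has changed over time.

 # A simplified sketch of a Wikidata item as it appears in the JSON export.
 # Many fields are omitted (aliases, qualifiers, references, ranks).
 douglas_adams = {
     "id": "Q42",
     "type": "item",
     "labels": {"en": {"language": "en", "value": "Douglas Adams"}},
     "descriptions": {"en": {"language": "en", "value": "English writer and humorist"}},
     "claims": {
         "P21": [  # P21 = sex or gender
             {"mainsnak": {
                 "snaktype": "value",
                 "property": "P21",
                 "datavalue": {
                     "type": "wikibase-entityid",
                     "value": {"entity-type": "item", "numeric-id": 6581097},  # Q6581097 = male
                 },
             }}
         ]
     },
     "sitelinks": {"enwiki": {"site": "enwiki", "title": "Douglas Adams"}},
 }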
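For the live-query walkthrough, the following is a minimal sketch of what the pywikibot code could look like. The class names and call signatures assume the current pywikibot "core" library and may differ slightly from the exact version demonstrated in the workshop.

 # A minimal sketch of pulling live data with pywikibot.
 import pywikibot

 site = pywikibot.Site("wikidata", "wikidata")
 repo = site.data_repository()

 item = pywikibot.ItemPage(repo, "Q42")          # Douglas Adams
 data = item.get()                               # labels, descriptions, claims, sitelinks

 print(data["labels"]["en"])                     # 'Douglas Adams'
 for claim in data["claims"].get("P21", []):     # P21 = sex or gender
     print(claim.getTarget())                    # an ItemPage such as Q6581097

 # "What links here" from an item: every page that references it.
 for page in item.backlinks(total=10):
     print(page.title())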
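For the offline part, the sketch below is not WDA's actual interface; it is a generic illustration of the "text parsing on a huge file" idea, assuming the line-per-entity JSON dump and a hypothetical file name. It tallies items whose P21 (sex or gender) claim points to female or male, the kind of count behind the biography-gender question above.

 # Not WDA's actual interface -- a generic illustration of streaming through
 # a Wikidata JSON dump, assuming one entity per line.
 import gzip
 import json
 from collections import Counter

 FEMALE, MALE = 6581072, 6581097   # numeric ids of Q6581072 and Q6581097

 def gender_counts(dump_path):
     """Tally items whose P21 (sex or gender) points to female or male."""
     counts = Counter()
     with gzip.open(dump_path, "rt", encoding="utf-8") as dump:
         for line in dump:
             line = line.strip().rstrip(",")
             if not line or line in ("[", "]"):
                 continue
             entity = json.loads(line)
             for claim in entity.get("claims", {}).get("P21", []):
                 snak = claim.get("mainsnak", {})
                 if snak.get("snaktype") != "value":
                     continue
                 target = snak["datavalue"]["value"]["numeric-id"]
                 if target in (FEMALE, MALE):
                     counts[target] += 1
     return counts

 # e.g. gender_counts("wikidata-entities.json.gz")  # hypothetical dump file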


Length of presentation/talk (see Presentation Types for lengths of different presentation types)

75 minutes

This workshop could be done in 60 minutes.

Will you attend WikiConference USA if your submission is not accepted?

Yes, if I also receive a travel scholarship.

Slides or further information (optional)

I have given a similar talk at Wikimedia Foundation headquarters (video here).

And blog posts [1], [2], [3], and [4].


Special request as to time of presentations


Interested attendees

If you are interested in attending this session, please sign with your username below. This will help reviewers to decide which sessions are of high interest. Sign with four tildes (~~~~).

  1. Add your username here.

Timathom (talk) 10:44, 3 April 2014 (EDT)