2016/Affiliate training sessions

From WikiConference North America
'''Program Design, Support, & Evaluation learning rotations'''

==Learning rotations session will feature two tables for small group overviews and demonstrations of 10 different tools and resources==

{{Draft}}
   
 
==Schedule of Rotations==
{| class="wikitable"
!Time
!Table A: Program Design & Evaluation Resources
!Table B: Online Metrics and Collaborative Project Management Tools
|-
| 3:30 || What & Why of Program Evaluation || Magic Button
|-
| 3:45 || Logic Models Overview || Category Tools
|-
| 4:00 || SMART Targets Overview || Quarry
|-
| 4:15 || Grants Programs || Education Toolkit
|-
| 4:30 || Outcome Mapping || IdeaLab
|-
| 4:45 || Open Q&A || Phabricator (Tentative)
|}
   
== Program Design & Evaluation Resources ==

===What is Program Evaluation===
;Session Abstract: Each year, volunteers and program leaders of the Wikimedia movement plan and execute a great diversity of program activities. But how do we know that our program activities are likely to achieve the results we hope to see?

'''Specific learning objectives''':
* Participants learn what a program and program evaluation are and why we do different types of evaluation
* Participants learn where to find learning resources and tools for program planning and evaluation
===SMART Targets===
;Session Abstract: SMART is a framework for creating targets (or "measures of success", as they are called in PEG, or "objectives", as they are called in APG) for programmatic work. The Wikimedia Foundation grant teams use it to help grantees turn generic objectives or measures of success into focused statements that specify the action and the expected outcomes. This session will focus on clarifying terminology (such as what an "output" is versus an "outcome") and on reviewing the SMART framework used by the Wikimedia Foundation grant teams.

'''Specific learning objective''':
* Participants understand SMART terminology and how to create SMART targets
===Logic Models===
;Session Abstract: Logic models are a valuable tool for planning our program activities and for developing appropriate evaluation strategies. Based on the distinction between the ‘outputs’ and the ‘outcomes’ of our work, they can be used to think carefully through the links between what we are doing and what we want to change by it. In this rotation session we will briefly recap the concepts of outputs, outcomes, and impact, practice setting up a simple logic model, and discuss how this tool can be used and taught at the participants’ home affiliations.

'''Specific learning objectives''':
* Participants learn basic logic model terms such as outputs, outcomes, and impact
* Participants get first ideas for how to teach and use logic models in their planning and evaluation work
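The inputs → activities → outputs → outcomes → impact chain described above can be sketched as a tiny data structure. This is an illustrative Python sketch for a hypothetical edit-a-thon; every entry and number is invented for the example.

```python
# Illustrative sketch only: a logic model for a hypothetical edit-a-thon,
# using the chain of terms from this session. Every entry is invented.
logic_model = {
    "inputs": ["venue", "trainers", "laptops"],                     # resources invested
    "activities": ["3-hour edit-a-thon with new-editor training"],
    "outputs": ["20 newcomers trained", "35 articles improved"],    # direct, countable results
    "outcomes": ["newcomers keep editing in the following months"], # changes in behavior
    "impact": ["broader, more diverse content coverage"],           # long-term change
}

def chain(model):
    """Join the first entry of each stage so the causal logic reads left to right."""
    stages = ["inputs", "activities", "outputs", "outcomes", "impact"]
    return " -> ".join(model[stage][0] for stage in stages)

print(chain(logic_model))
```

Reading the chain aloud, stage by stage, is a quick check that each step plausibly leads to the next — the same exercise the session proposes doing on paper.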
===Outcome Mapping===
;Session Abstract: Outcome Mapping is an evaluation approach, developed in 2001, that places people at the center and looks at changes in behavior to define outcomes. Outcomes are defined as “changes in behavior, relationships, activities, or actions of the individuals, groups, and organizations with whom a program works directly” as well as indirectly with “those potential partners with whom the program anticipates opportunities for influence.” (Earl, Carden, & Smutylo, 2001, p. 1)

In the Wikimedia world of programs, outcome mapping strategies can help us gather data on the contributions our programs make toward changes in our partners around the world that build greater engagement in open and free knowledge. Outcome mapping can also help us evaluate both the intended and the unintended results of our innovative Wikimedia programs. In this way, it can move us beyond direct program outcomes to deeper environmental outcomes, measuring a program’s contribution to complex change processes.

Why use outcome mapping?
* Because it allows for qualitative outcomes and stories of our projects and programs, and helps to better surface our shared impact.
* Because it provides a useful framework for measuring the intermediate and longer-term qualitative outcomes that influence higher-level systems change in the socio-political environment.
* Because our projects seek deeper changes and impact on the world than may be directly linked to our own direct project environments.

'''Specific learning objectives''':
* Participants will learn two core outcome mapping strategies to help identify the social influences related to their programs.
* Participants will learn next steps for learning more about outcome mapping in their evaluation work.
==Online Metrics and Collaborative Project Management Tools==
   
 
===Global Metrics Magic Button===
;Session Abstract: Participants will learn how to use the new [[m:Grants:Learning patterns/How to use the Global Metrics Magic Button|Global Metrics magic button]], which collects four global metrics using one report with four inputs.

'''Specific learning objective''':
* Participants learn how to use the "magic button"
===Category Tools: What to use?===
;Session Abstract: This session will provide a quick rundown of the existing category assessment tools, and help solve project-specific requests about how to use these tools:
* [https://meta.wikimedia.org/wiki/BaGLAMa BaGLAMa 2]
* [https://meta.wikimedia.org/wiki/GLAMorgan GLAMorgan]
* [https://meta.wikimedia.org/wiki/TreeViews TreeViews]
* [https://petscan.wmflabs.org/ PetScan] (successor to CatScan)
* Massviews

'''Specific learning objective''':
* Identify when and how to use the different tools available for assessing content on the projects
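Several of these tools can be driven entirely by URL parameters. As a non-authoritative sketch, the snippet below assembles a PetScan query URL in Python; the parameter names (`language`, `project`, `categories`, `depth`, `format`, `doit`) are assumptions taken from PetScan's shareable query URLs, so verify them against the live form before relying on them.

```python
from urllib.parse import urlencode

# Sketch only: assemble a shareable PetScan query URL for pages in a category.
# Parameter names are assumptions based on PetScan's copyable URLs; verify
# against the form at https://petscan.wmflabs.org/ before use.
params = {
    "language": "en",                  # wiki language code
    "project": "wikipedia",            # wiki family
    "categories": "Women scientists",  # category to scan (hypothetical example)
    "depth": 2,                        # subcategory levels to include
    "format": "json",                  # machine-readable output
    "doit": 1,                         # run the query immediately
}
url = "https://petscan.wmflabs.org/?" + urlencode(params)
print(url)
```

Sharing a URL like this lets other organizers reproduce the same category scan with one click, which is handy when answering the project-specific requests mentioned above.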
===Quarry===
;Session Abstract: Participants will learn what kinds of questions they can answer with Quarry, a web tool for running SQL queries against the public Wikimedia database replicas.

'''Specific learning objective''':
* Participants will understand the variety of information available
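To give a feel for what Quarry can answer, here is the kind of SQL you might paste into it, held in a Python string for this sketch. The table and column names (`page`, `categorylinks`) follow the standard MediaWiki schema on the public replicas; the category name is a hypothetical example.

```python
# Illustrative only: a query of the kind you might paste into Quarry against
# the enwiki replica. Table/column names follow the standard MediaWiki schema;
# the category is a hypothetical example.
query = """
SELECT page_title
FROM page
JOIN categorylinks ON cl_from = page_id
WHERE cl_to = 'Living_people'   -- category title: underscores, no 'Category:' prefix
  AND page_namespace = 0        -- namespace 0 = articles
LIMIT 10;
"""
print(query.strip())
```

Queries like this are how program leaders pull lists of pages, contributors, or category members without needing direct database access.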
===Education Toolkit===
;Session Abstract:

'''Specific learning objective''':
*
   
===IdeaLab Overview===
;Session Abstract: Starting a new project or proposal from scratch is no easy task. IdeaLab is a space on Meta to draft your idea and get feedback on it before moving to the next step of implementation. In this session, you’ll learn how to use IdeaLab and get an overview of the grants offered by the Wikimedia Foundation.

'''Specific learning objectives''':
* Understand how to start an idea and leave feedback in IdeaLab.
* Learn what funding opportunities the Wikimedia Foundation offers through its grant programs, and how they relate to IdeaLab.
* Learn the different avenues by which you can move your ideas into implementation, whether they require funding or not.
   
===Phabricator (Tentative)===
;Session Abstract: How can people let developers know about the tech issues they're having and the features they'd like to suggest? How can you set priorities for complex projects for your team? ''Through Phabricator'' is the answer. Putting things in front of the people who can actually do something about them is easier than you think.

'''Specific learning objectives''':
* Participants understand what [https://www.mediawiki.org/wiki/Special:MyLanguage/Phabricator Phabricator] is and how to file a basic task or to find one.
 
   
   

Revision as of 00:10, 5 October 2016


==Sign ups==
# [[User:Rosiestep|Rosiestep]] ([[User talk:Rosiestep|talk]]) 19:15, 3 October 2016 (EDT)