User:JMatazzoni (WMF)/drafts

The third most popular wish of the 2017 Community Wishlist Survey asked for a “next generation tool” to quickly and easily provide the metrics that event organizers need to help measure their projects and demonstrate the impact of their work. This page tracks the development of that tool, which we’ve named Event Metrics.

Planned improvements


Event Metrics is based on—and will replace—the existing Grant Metrics tool. Below is a list of the main new features we plan to add (unlisted are the many interface and back-end upgrades that will make the system friendlier and more capable generally). These improvements will give users many new types of data, in formats they didn't have before, and enable whole new types of users to define and track events. (As always, these plans are subject to change in the event we encounter unforeseen difficulties—last updated Dec. 21, 2018; check Updates for developments.)

New downloadable reports (CSV and Wikitext)
  • Event Summary provides totals for pages created and improved, files uploaded, Wikidata items created and improved, and other contributions, as well as impact metrics like pageviews for these. More info.
  • Pages Created lists all new articles along with their creators, accumulated pageviews, etc. More info.
  • Pages Improved lists all pages that were edited along with their editors, avg. pageviews per day, etc. More info.
Filtering improvements
  • Category filter upgrades: The Category filter lets organizers restrict metrics to articles in particular categories. We're working on improvements that will make Category filtering more flexible and useful:
    • Talk-page Category filtering: To make this filter work with the specialized talk-page categories organizers often use, we are adding the ability to include talk-page categories in searches; the system will behave as though the talk-page categories were applied to the associated main pages (only main-page changes will be counted). Read more.
    • Category filtering of uploaded files: To make the filter useful for tracking images, videos and other types of uploaded files, we’re also adding the ability—on Commons only—to filter files by categories. Read more.
  • Independent Category or Participant filtering: Currently, all events must include a Participants list. This can be a problem for people who organize events that don’t include formal sign-up. So we’re eliminating that requirement and giving you the flexibility of filtering by categories, participants or a combination of both (you must pick at least one of these options). Read more.

Status and schedule


Here are highlights of the new features planned, presented more or less in order of development (as of 19 Feb. 2019). Links shown are to the main Phabricator tasks for each feature.

New downloadable reports

Event filtering improvements

These are only the main deliverables; for those interested, here is an overview of the full task list.  

Follow this project—and tell us what you think


Project updates


The new Event Metrics is LIVE (in beta)—please try it (March 31, 2019)


The Community Tech team has launched the big upgrade to Event Metrics I’ve been talking about on these pages. With this release, event organizers get many new metrics and powerful new features for measuring event participation, contributions and impact.

Please try it; all you need is a wiki login. If you have events already in the program, even from when it was called Grant Metrics, they'll still be there. You need only click “Update data” to get new metrics and reports for these existing events.

The sooner people give Event Metrics a try the better. Right now, CommTech is still on active assignment to address Event Metrics problems. We’re listening and ready. It’s too late to add new features (though we want to hear your ideas), but we can make adjustments to things that are confusing and can fix things that aren’t working. (We’ll always fix real “bugs” in the program, of course, but right now the definition of what counts as a problem is broader than it will be once we’ve moved on in a few weeks and shifted our focus to the next Wishlist wish.)

Consider the program in beta right now; we tested it but are sure to have missed things. So please have a look and tell us what's broken, what's confusing and what you love! Here’s the talk page where people will be reporting issues from now on.

New designs for data display and filtering interface (March 26, 2019)

 
Summary data: the Event summary page presents its many new metrics in an easy-to-read format.

I’m writing to share designs for the new interface the team is working on for one of the main Event Metrics pages—and in particular for the new filtering interface that will be part of the page.

New ‘Summary’ data display


All of the mockups here show the redesigned Event Summary page. This is arguably the main page of the tool, where you can see top-line metrics about your event, download reports and add filtering. The image at left shows how the page presents all the new metrics the system produces. For easier comprehension, the numbers are organized into sections: “Contributions,” “Impact” and “Participation.” (Don't worry if you don’t understand all the metrics here; in practice, the little gray “info” icons next to each figure will provide definitions.)

 
'Filters required': a red error box explains what the user must do to produce metrics.

New filtering interface helps you get set up


One of the main differences between Event Metrics and the former Grant Metrics tool is that Event Metrics doesn’t require users to filter contributions by a list of participants' usernames ("Participant" filtering). Instead, organizers can choose between filtering by Participants or by the new Categories filter—or they can combine the two to focus their event even more narrowly. Filtering is required, partly for performance reasons—we don't have the capacity to report on everything that happened on the wikis during your event. And the different filtering types work differently in some ways (e.g., Participant filtering applies to all wikis, while Categories are specific to individual wikis).

These two facts point to a challenge we had in designing the system: with the new filtering capabilities comes a certain amount of complexity. To use the system effectively, users will need to understand a bit about how it works. To help you with that, the new filtering interface provides context-sensitive help. Here are a couple of examples of how it works.

 
'Event partially configured': the user has added filtering for some but not all wikis.

“Filters required”

The mockup at right shows the Event Summary page as it will appear just after a user has created an event but before applying any filters. The red box in the image announces that the user needs to add filtering and explains the requirements. This information is reinforced by messages and red icons on the filtering panel headers.

“Event partially configured”

In the mockup at left, the user has partially configured the event. The system shows metrics for some wikis but not all (I’ve cut off the page for space reasons). The yellow box tells the user which wikis lack filtering and explains what the user needs to do to fully configure the event and get metrics for all wikis.

What do you think?


Please have a look at the mockups. Do you like the new data display? Does the context-sensitive help seem effective and clear? Do you have questions? Let us know what you think; we're listening.


Revised project plan and priorities (Feb. 20, 2019)


As the team has worked on and learned more about Event Metrics, we’ve had to revise the schedule and feature list somewhat to cut scope for what has turned out to be a challenging project. The list below shows the revised list of main deliverables we're working toward and the rough order of feature releases, which you can also find in the Status and schedule section of this page.

New downloadable reports

Event filtering improvements

As you can see, the first push is to deliver new reports and lots of new data—particularly in the area of “impact” metrics, which measure the size of the audience that event contributions garner.

Next, we’ll make the Category filter significantly more flexible and useful. In particular, we’ll make it independent of Participant filtering, which is currently required for all events. That will make Event Metrics useful for events that don’t involve a formal sign-up process. We’ll also add the ability to use Category filtering for files on Commons—in order to better serve image drives and similar events (Category filtering currently works only for pages). Finally, the ability to filter by talk-page categories will help WikiProjects and others who classify pages and files by putting categories on talk pages. (The system will behave as though the talk-page categories were applied to the associated main pages—counting only the main-page changes.)
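For the curious, the talk-page behavior rests on a MediaWiki convention: talk namespaces have odd numbers, one higher than their corresponding subject namespace. A rough sketch of the mapping (illustrative only, with made-up function names—not the actual Event Metrics implementation):

```python
def subject_page(ns, title):
    """Map a page to its subject (main) page.
    In MediaWiki, odd-numbered namespaces are talk namespaces;
    the subject page lives in the preceding even namespace."""
    return (ns - 1, title) if ns % 2 == 1 else (ns, title)

def pages_to_count(categorized_pages):
    """categorized_pages: set of (namespace, title) pairs that carry the
    event's category, possibly on their talk pages. Return the subject
    pages whose changes should be counted."""
    return {subject_page(ns, title) for ns, title in categorized_pages}
```

So a category placed on Talk:Foo (namespace 1) would cause changes to Foo (namespace 0) to be counted, which is the behavior described above.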

Some of you will notice that a feature we'd been hoping to build, the Worklist filter, is no longer on our to-do list. The reason for this, as I alluded to above, is that Community Tech has a limited amount of time we can spend on each project, and integrating Worklist with the existing Event Metrics reports and filters would be a complex job. It would also require creation of a completely new user interface for event setup, so as to clarify a series of confusing interactions among filters, wikis and contribution types. The Category filter, meanwhile, exists already, and with the improvements listed above we hope it will work for many events to achieve some of the same goals.

Please let us know what you think about that, and share any other questions or comments about this plan. What could we do to make it better? Which features are you most excited about? Are there any you would skip? We’re listening!

Details on metrics for our first new reports (Feb. 12, 2019)


The team is making good progress toward creating the two new Event Metrics reports we’re planning for our first release. Our plans are firm enough now that I’ve published a Help page detailing all of the metrics we expect these reports to include: Event Metrics/Definition of metrics.  

As this page makes clear, the new reports provide many numbers that weren’t available in the Event Metrics predecessor, Grant Metrics. In particular, so-called “impact” metrics—like “Views to pages created” or “Avg. daily views to files uploaded”—will now let event organizers and partners gauge the size of the audiences their contributions are garnering.

The Definition of metrics page entries spell out how each metric is calculated and exactly what it does and doesn’t include. I know organizers will be interested in these specifics, so I hope you’ll take a few minutes to look these definitions over and offer your thoughts. What are you excited to see? What is unclear or doesn’t seem right? (I’ve set up a section on the project talk page for your comments on this.) We’re listening!

Oct. 26, 2018: Metrics changed, delayed or dropped


Last month I proposed a list of specific figures we were planning to include in a series of new Event Metrics reports. As we’ve started working on the first two of those reports—Event Summary and Pages Created—we’ve learned more about what we can and can’t get from our databases at scale. In response, our plans have evolved and adapted. In this post, I lay out the changes we think make sense, so that you can respond and offer your advice. (To get the specifics and find out what is in those two reports under construction, check the Event Summary and Pages Created tickets.)

Metrics changed or limited


The following are all metrics we plan to include, but each has changed in some way compared to what I’d originally imagined.

  • ‘Pageviews to files uploaded’ This is likely going to change to become “Avg. daily views to files uploaded”—which is to say, a per-day figure instead of a cumulative one. The reason: while we can easily figure out what pages a given file has been added to, there is no easy way to know the date when the image was added to a page. Without that date, it’s impossible to accurately figure out the total pageviews accrued. So here’s the workaround we’re proposing: 1) determine the pages the file is on at present, 2) look at the past 30 days of traffic to those pages (in order to smooth out normal daily fluctuations), 3) divide by 30 to express that total as a daily average (learn more about this here). Does that sound like a useful approach? How important for your work is this metric?
  • ‘Namespace’ Grant Metrics (the program we’re transforming into Event Metrics) currently tracks only the Main namespace for things like pages created, edits made, etc. (It looks at other namespaces when necessary for specific metrics, like “Files uploaded.”) We’d thought about giving organizers filters to let them selectively track a few other namespaces. But that flexibility adds too much complexity to the project (for a first version, anyway). Also, people should know that each namespace we search essentially doubles the system load and wait times, and we're already pretty concerned about both of those. On the other hand, the Draft space is pretty heavily used by new users—particularly on English Wikipedia, where newbies aren't allowed to create articles in Main. So, as an experiment, we're planning to include Main + Draft namespaces (for the wikis that have Draft). If wait times soar, we may have to back that out again. If the experiment goes well, though, we can consider adding another space. (A side note: for searches that use the Category filter—coming soon!—we’ll apply Talk-page categories to Main namespace pages. This will make the Category filter work for WikiProject members and others who, by convention, apply their special-purpose categories to Talk instead of Main pages. Read more on this.) Does this experiment sound valuable? Are there other namespaces you would want to include? If so, please tell us why that namespace is very important for your work.
  • ‘Still exists?’ & ‘New page survival rate’ On the new “Pages created” report, the “Still exists?” metric tells you whether a given article has been deleted. On the “Event summary” report, the “New page survival rate” is an aggregate figure that tells what percentage of created pages remain. We can get this information from the Archive log—but there’s a catch. The log doesn’t record categories. What that means is that if you use the soon-to-be-released Category filter to define your event, you won’t get these survival metrics. A better but more complex way to get this information exists: it requires us to store a record of all pages created during the event, which would provide a basis for comparison going forward. That’s a big change to how our system works, and it won’t come soon. So in the meanwhile, on the theory that data about deleted pages is important for organizers, we’re planning to offer these metrics in the partial form just described. You can read more about this in the ticket. Does that interim solution sound reasonable? How important to you is it to know which pages were deleted?
  • ‘% women (estimated)’ Organizers have asked for a way to track participation by women, who are underrepresented in wiki work. But there’s no reliable way to derive users’ gender from their wiki contributions. So this metric—planned for the Event Summary report—is unique in being something that organizers will have to enter themselves, based on their own headcount or survey. I asked organizers about this last week during a presentation at WikiConference North America, and their response was that yes, we should include the metric. But they said it shouldn’t ask just about women vs. men. So the new plan is to give organizers the option to record “# of Women”, “# of Men” and “# of Other” for any event.  If organizers fill in those fields, they’ll see the numbers reflected back as percentages in both event and program reports. Here’s the ticket where this is described. Your thoughts?
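The per-day workaround described under ‘Pageviews to files uploaded’ above can be sketched roughly like this (a minimal illustration with a made-up data structure, not the actual Event Metrics code):

```python
def avg_daily_views(daily_views_per_page, days=30):
    """Approximate 'Avg. daily views to files uploaded' for one file.

    daily_views_per_page: one list of daily view counts per page the
    file currently appears on. We sum the last `days` days of traffic
    to those pages (smoothing out daily fluctuations), then divide by
    `days` to express the total as a daily average.
    """
    total = sum(sum(views[-days:]) for views in daily_views_per_page)
    return total / days

# A file placed on two pages, each averaging 10 views/day:
pages = [[10] * 30, [10] * 35]  # the second page has 35 days of history
print(avg_daily_views(pages))   # 600 views over 30 days -> 20.0 per day
```

The point of the sketch is just the arithmetic: because we can't know when the file was added to each page, a recent-window average stands in for a true cumulative total.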

Metrics dropped or delayed


The following metrics were proposed but have proved more challenging than expected. At present, we have no plans to develop any of them in the near term. Your input could change those plans. But know that resources for this project are limited: if we take on any of these complex metrics, other features will have to drop out.  

  • ‘Words added’ This metric would provide a very handy measurement of the amount of work done during an event. Organizers I spoke with at WikiConference North America said this is something they would like to have, even if the figure were inaccurate (as it probably would be). But calculating words added at scale across multiple scripts and hundreds of languages is a very big challenge. Recognizing word boundaries is actually a non-trivial engineering problem: it might be fairly obvious in English, but it is definitely not in other languages, where spaces (and dots, dashes, etc.) are not the real indicators of word separation. On top of that, we’d need, for every given page, to look at the content and analyze (“parse”) it, which would add a potentially significant load to our already heavy querying methods. We’ve created a ticket to investigate the issue; please add your suggestions if you have ideas.
  • 'Description' This was meant to be a short description of a file or article—taken from the first sentence or from Wikidata, etc. The idea was that it would make reports like “Images uploaded” or “Pages created” more understandable to bosses, partners and others who might read these lists of sometimes obscure file or article names. It could be done, but it would require us to parse the wikitext of pages, which is hard and resource-intensive. Let us know if this seems like an important omission.
  • 'Wikidata claims added' A “claim” is a statement + an identifier—basically a discrete fact in Wikidata. We will report on Wikidata items created—entirely new entities. But, once again, this data is not something we can simply look up. Tracking the claims within items would require the system, again, to parse the wikitext itself, which is resource intensive at scale. Reactions?
  • Bytes, edits, words ‘changed subsequently’ These three figures were meant to track the work that happens to an article “subsequently”—that is, after the conclusion of the event. They were proposed as a way to index community interest in a given article, as measured by people continuing to work on it. But these figures are subject to the same limitation described above under “‘Still exists?’ & ‘New page survival rate’”: to do them properly, we’d need to make a record of all pages created during the event, to use as a basis for comparison going forward. We want to do that, and these figures may become possible in the future. If these figures would be important to your work, please explain why.

Sept. 29, 2018: Which new features should we release first?


We’ve started writing up tasks for the program of improvements to Event Tool I proposed last week. This leads to an important question: in what order should we build and release these new features? I’m writing today to lay out my thinking on that—and to see if you agree: share comments here.

Broadly speaking, the project involves three types of features:

  1. Six new downloadable reports
  2. A series of new filters that let you customize those reports
  3. Expanded and redesigned onscreen data displays—plus some other features if we can get to them

Three stages

 
This wireframe mockup for the 'Event Summary' data screen shows a greatly expanded set of event metrics, along with a new layout that organizes data into sections for easier comprehension. A breakout table at bottom gives per-wiki results. At upper-right, the user downloads one of six new metrics reports. (Comment on this wireframe.)

Releasing these features in the general order listed above seems like the quickest way to bring the most value to users. This follows from the observation that the two main use cases for event metrics seem to be 1) to post them on wiki, where the participants and others can see them, and 2) to report outcomes to bosses, partners, grantors, etc., possibly in depth.

  • Stage 1, new downloadable reports: Building the downloadable reports first will help organizers with both of the use cases mentioned above. Users will get a lot of data not previously available from Grant Metrics in a choice of two formats: wikitext or spreadsheet. We can release the reports one by one as they’re ready.
  • Stage 2, filtering tools: In stage 2, we'll add the new filters and filter controls. The ability to limit metrics to a “Worklist” of articles (to improve or create) was highly requested, so it seems right to build that filter first. Then we can wade into the tricky business of providing controls that will let users turn different filters on or off, to tailor reports for different purposes or audiences. (A filter that enables organizers to limit results by categories was part of the original Grant Metrics project and is already in the works.)
  • Stage 3, onscreen data displays: In stage 3 we'll roll out onscreen displays with a new design and much more data than they currently show. These metrics screens provide organizers with a read on event outcomes that is relatively detailed yet quicker to access than downloadable reports. (The wireframe mockup at right shows all the new data we'll be adding and a new design—what do you think?) If we still have time at this stage in the process, we'll try to get to some other features that have been popular with organizers.

Follow this project—and tell us what you think


Let us know if the approach described above makes sense to you, or what you’d change.

  • To follow this project, go to the page for the Event Tools tag in our task-management tool, Phabricator, and become a “Member”.
  • To see the tasks for this project that we’re about to start working on, follow this link (which filters the Community Tech backlog board by the “Event Tools” tag). The tasks slated to move forward are listed in the column “To be estimated.”

Sept. 19, 2018: Proposing a product plan for ‘Event Tool’


A month ago I posted on the talk page to ask what type of metrics and metrics features event organizers want. After analyzing the lengthy discussion that ensued, I proposed a set of metrics features and new metrics reports for what I’ve been calling Event Tool. Organizers responded again with comments and requests, and I added some additional features (e.g., the ability to track audio and video plays and to get a gender breakdown of events/programs).

Today, I’m recommending that we move forward with the program of work described in the two metrics posts mentioned above, making these posts (here and here) essentially the blueprint for this round of Event Tool development. The goal will be to turn Grant Metrics into a simple but capable tool that will meet the metrics needs of most event organizers—a “one-stop shop,” as one organizer called it, for getting the data that many organizers now laboriously assemble from various sources.

We need your feedback

I’ve called this metrics proposal “ambitious but achievable,” and I think that’s a good description. We can’t promise to deliver every single feature described, but the posts referenced above outline the product direction in detail. Now it’s your turn again: does this look about right? Is there anything we can add that would make it work better for you? And—since it’s possible we won’t get to everything—are there any features here you think we could drop or put on our low-priority list? Please leave your comments in the talk page section provided.

Why this direction?


The Wishlist wish that kicked this project off, and which 111 people voted for, is completely focused on metrics, so new data and reporting features were a given from the start. Plus my research with organizers made it clear that metrics is one of the areas where they need significant help. Many other possibilities were considered, however.  

As part of the product-design process, I began by examining all stages of the wiki event workflow, looking for pain points and framing possible solutions. Organizers expressed support for many of those ideas—particularly automated wiki page creation, event signup, and assistance with wiki account creation. But as the team investigated these event-management ideas, a few things became clear: accommodating the wide variety of organizers’ needs would make these tools more complex than they at first appeared. Many of these features don’t produce user value until the whole system is built and can work together. And in some cases, the desired features depend on some other project. As just one example, some of the wiki account-creation improvements discussed require first that we build an event signup system. Event signup, in turn, requires us to collect email addresses, so for security reasons it can’t be built unless we move Grant Metrics from its current location on the Toolforge servers onto our production servers.

Such complexities made event management features a big job. Meanwhile, in terms of project size, the proposed metrics features are already at the very high end of what’s allowed for a Wishlist undertaking.

Which is not to say that the event-management tools explored will never be built. The WMF’s annual plan this year recognizes organizers as “fundamental implementers” of the free-knowledge movement and a “core asset.” The plan authorizes a Movement Organizer Study, scheduled for this winter, to better understand organizers’ needs and create a framework for “making strategic investments” to support them. Those future investments may well take up some of the Event Tool work discussed in these pages. (Organizers are also welcome to make a second Wishlist proposal.)

Next Steps


First, we want to hear from you. Do you agree with the proposed direction? What is missing? What is unnecessary?

If it looks like organizers agree with the plan, then I’ll begin prioritizing and specifying the particulars for these new functions. We’ll also begin work on designs, and I’ll start showing those for comment as soon as I can. Since we’ll be expanding the mission of Grant Metrics, it will probably be renamed: “Event Metrics” is the favorite at the moment.

One of the many advantages of the fact that Grant Metrics is an existing tool is that we'll be able to add features to it one by one, instead of having to wait to get a whole system up and running. So if organizers like the general plan, the team should be able to start rolling out new features in the next few months.


Reaching out again to people who voted for this Wishlist item:


Because you voted for the Wishlist wish that inspired this project, we thought you’d want to know that it’s about to move into the development phase. The post above proposes the outlines for a metrics tool aimed at event organizers.  If that sounds like you, we’re eager for your input. Please read the posts and watch this page if you have a continuing interest.

@Paucabot:, @FULBERT:,  @Rhododendrites:, @JaxieDad:, @PointsofNoReturn:, @Annamariandrea:, @Halibutt:, @Masssly:, @Voltaireloving:, @Sarahobender:, @Davidpar:, @HugoHelp:, @Vallue:, @Medol:, @Tiputini:, @Monikasj:, @Stinglehammer:, @Caorongjin:, @John Cummings:, @Fixer88:, @Spiritia:, @Epìdosis:, @LornaMCampbell:, @Mtmlan84:, @Ouvrard:, @Mauricio V. Genta:, @Anthere:, @Kvardek du:, @Jorid Martinsen (WMNO):

@African Hope:, @Astrid Carlsen (WMNO):, @Richard Nevell (WMUK):, @Alangi Derick:, @Talueses:, @Wittylama:, @VMasrour (WMF):, @TrMendenhall:, @Jmatazzoni:, @Ecritures:, @MichaelMaggs:,  @Pharos:, @Heathart:, @John Andersson (WMSE):, @Haxpett:, @Psychoslave:, @Anne-LaureM:, @Sylvain WMFr:, @Yohannvt:, @Lirazelf:, @ArtEmJJ:, @Jack who built the house:, @Chicovenancio:, @جواد:, @Xavier Dengra:, @לסטר:, @Esh77:, @Bijay chaurasia:, @Shangkuanlc:

@Joalpe:, @Ijon:, @Krishna Chaitanya Velaga:, @Discott:, @David1010:, @Liuxinyu970226:, @Jc86035:, @Megs:, @Thomas Obermair 4:, @Shizhao:, @Pbsouthwood:, @Bspf:, @Donald Trung:, @Wikinade:, @Nick Moyes:, @Zhangj1079:, @OrsolyaVirág:, @B20180:, @Exilexi:, @Rachel Helps (BYU):, @Mickey83:, @Becksguy:, @VIGNERON:, @Ozzie10aaaa:, @Superchilum:JMatazzoni (WMF) (talk) 22:16, 3 October 2018 (UTC)

@Romaine:, @Theklan:, @JenOttawa:,  @:, @Ckoerner:, @Arian:, @Kerry Raymond:, @Doc James:, @Muhraz:, @Hexatekin:, @Pamputt:, @Satdeep Gill:,  @NickK:, @Townie:, @Martin Urbanec:, @Vachovec1:, @Battleofalma:, @YjM:, @Theredproject:, @Mckensiemack:, @Hasive:, @Nattes à chat:,  @Winged Blades of Godric:, @Meredithdrum:, @Artchivist1:, @Quiddity:JMatazzoni (WMF) (talk) 22:21, 3 October 2018 (UTC)

Sept. 6, 2018: Proposed metrics features—what do you think?


A “next-generation toolset” for producing more and better event metrics is the main requirement of the Wishlist proposal that launched this effort. Toward that goal, we asked what type of metrics and metrics features event organizers want, and received a lot of good ideas. I’ve studied that input carefully and worked through the various suggestions with our engineers to judge the level of effort involved in each. We can’t do everything; some of the features proposed would push this project beyond the scope of what a Wishlist wish can take on. But based on your suggestions, this post lays out a vision for an ambitious yet achievable reporting tool based on Grant Metrics (which we may need to rename to reflect its new, broader mission).

Below you’ll find 1) descriptions of the new features we are proposing, followed by 2) a diagram comparing existing vs. proposed Grant Metrics reports and 3) a detailed breakdown of all the new data we’re adding (look for the label “NEW”).

We need your feedback

Event organizers are a diverse group with varied needs. The first release of the tool described below is designed to provide the metrics that typical event organizers need while remaining relatively simple and easy to use. It probably won’t work for everyone—will it work for you? What could we do to make it work? Please let us know: I’ve posted a series of questions on the talk page to help structure the discussion there.

Lots of new data and new ways to get it

  • Impact metrics: People asked for figures to help demonstrate the impact of their efforts, so we’ve added new reports detailing Pages Created, Pages Improved, Commons Uploads and Wikidata Contributions, with new figures for # of pageviews, # of articles in which images were placed and much more.
  • Six new downloadable reports (in two formats): We’re proposing to add six new reports that you can download either as a spreadsheet file (.csv) or wikitext (see the diagram and descriptions below). The wikitext versions provide a quick solution for posting figures to wikis. The thinking is that they should offer less data than the spreadsheet files, on the theory that no one wants to post or edit a giant wiki table (does that sound right?).
  • More informative on-screen metrics: In addition to the downloadable reports, we’re adding around a dozen new metrics to the program-level and event-level screen displays.
  • Ongoing, cumulative metrics: By their nature, impact metrics (e.g., pages views to articles created) continue to build over time. The system will enable you to track such figures indefinitely, though you'll need to trigger updates manually, as described below. (Statistics relating directly to the event—e.g., # of participants or articles created—will not change once the event period you've defined ends.)
  • New descriptive information: To make reports more understandable for managers, partners and other non-specialists, they’ll include more explanatory information, such article or image descriptions, information about the type of event and the names of event partners.
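To make the two output formats concrete, here is a minimal sketch of how report rows could be rendered as a sortable wikitext table. The column names and sample data are invented for illustration; the actual columns and formatting in Event Metrics reports may differ.

```python
# Sketch only: render report rows as a wikitext table.
# Column names and sample data are invented, not the tool's real fields.

def to_wikitext_table(headers, rows):
    """Build a sortable wikitext table from a header list and row lists."""
    lines = ['{| class="wikitable sortable"']
    lines.append("! " + " !! ".join(headers))
    for row in rows:
        lines.append("|-")
        lines.append("| " + " || ".join(str(cell) for cell in row))
    lines.append("|}")
    return "\n".join(lines)

headers = ["Page", "Creator", "Pageviews"]
rows = [["[[Example article]]", "ExampleUser", 1234]]
print(to_wikitext_table(headers, rows))
```

The output can be pasted directly onto a wiki page, which is the "quick solution for posting figures" use case described above.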

Give feedback here about the ideas in this section

New tools for customizing reports

edit

A new set of filters will enable you to output different metrics for different purposes or audiences. You’ll be able to turn each of the following options on or off.

  • Limit metrics to “Worklist” articles only: Restrict reports to a “Worklist” of articles to be improved or created that you enter by copying and pasting (as a batch, not one by one).
  • Limit to specific categories:  As an alternative or addition to the above, you'll be able to restrict metrics to articles in a particular category or set of categories—including categories you define specifically for this purpose (task already in progress).
  • Limit to participants list: Limit metrics to contributions from participants that you enter by copying and pasting a list of usernames. Grant Metrics does this by default now; we'll make it optional.
  • Limit by wiki: For events that encompass multiple wikis, this option will enable organizers to focus reports on just one.
  • Specify desired namespaces: Grant Metrics currently counts contributions to Main namespace only. Main will remain the default, but we’ll let you pick your preference.
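As an illustration of how such on/off filters might compose, here is a hedged Python sketch; the field names (page, user, wiki, namespace) are assumptions made for illustration, not the tool's actual data model.

```python
# Sketch only: apply optional report filters to a list of edit records.
# A filter left as None is "off"; any other value switches it on.
# Field names are invented for illustration.

def filter_edits(edits, worklist=None, participants=None, wiki=None, namespaces=None):
    """Keep only the edits that match every filter that is switched on."""
    result = []
    for e in edits:
        if worklist is not None and e["page"] not in worklist:
            continue
        if participants is not None and e["user"] not in participants:
            continue
        if wiki is not None and e["wiki"] != wiki:
            continue
        if namespaces is not None and e["namespace"] not in namespaces:
            continue
        result.append(e)
    return result

edits = [
    {"page": "Ada Lovelace", "user": "Alice", "wiki": "en.wikipedia", "namespace": 0},
    {"page": "Talk:Ada Lovelace", "user": "Bob", "wiki": "en.wikipedia", "namespace": 1},
]
# Restricting to mainspace (namespace 0) mirrors today's Grant Metrics default.
print(filter_edits(edits, namespaces={0}))
```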

Give feedback here about the ideas in this section.

Assumptions, limitations, ideas for future releases

edit
  • Real-time (though not automatic) metrics: Grant Metrics provides the ability to get metrics on demand—during an event, for example. The organizer must trigger the update by pressing an Update button (i.e., updating does not run on a schedule).  
  • Auto-updating of wiki tables: As described above, the system will output reports as wikitext tables for easy posting. We could add the ability for on-wiki tables to update automatically when you update your figures in the tool (see the preceding item). If this is something people want, it would not be overly difficult to build.
  • Focused on metrics we can uniquely provide: Some users suggested tracking metrics from Facebook and other social media channels used for event promotion.  While this sounds useful, the ever-changing variety of popular services, our general lack of experience interfacing with them, and the complex privacy issues involved puts this out of scope for now.
  • Track citations added: This was a popular request, and for good reason, since the presence of citations is a good indicator of how substantial and durable contributions are liable to be. But unlike the other metrics listed below, which can be gathered by examining metadata, identifying citations would require the system to parse the actual wikitext or HTML content of pages, which would be difficult and resource intensive.
  • Login required:  Only people with your login will be able to see your metrics. One organizer expressed a desire to be able to send links to pages in the system as a way of distributing reports. It makes sense, but the permissions-management issues involved put this idea out of reach for now. (It will be easy, however, to post the metrics on wiki, as described above.)
  • Useful on mobile: Grant Metrics works on mobile, but needs improvements to work well. (If mobile event metrics are important for you, please speak up.)
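On the citation-tracking point above: even a naive approach has to scan every page's full wikitext rather than its metadata, which is what makes the metric expensive at scale. A hypothetical sketch that counts `<ref>` tags (a real implementation would also need revision content for every page, plus handling for citation templates, HTML comments and similar cases):

```python
import re

# Naive sketch: count citations in a page's wikitext by counting <ref> tags.
# Matches <ref>, <ref name="...">, and self-closing <ref name="..." />,
# but not closing </ref> tags. Illustration only.
REF_TAG = re.compile(r"<ref(\s[^>/]*)?(/>|>)", re.IGNORECASE)

def count_refs(wikitext):
    return len(REF_TAG.findall(wikitext))

sample = 'Fact one.<ref>Source A</ref> Fact two.<ref name="b">Source B</ref> Reused.<ref name="b" />'
print(count_refs(sample))  # → 3 (two explicit refs plus the named reuse)
```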

Give feedback here about the ideas in this section.

Sept. 6, 2018 (continued): New data and reports in detail

edit
 
A diagram showing the hierarchical structure of Grant Metrics reporting as it is now and how the new downloadable reports will fit in. New reports are shown in blue.

The post above describes the new reporting features proposed for Grant Metrics at a high level. This post outlines in detail all the new data types and reports we're proposing. Look below for the label “NEW.”

Multiple Programs-level reports

edit

Metrics for all an organizer’s “programs” (each program contains multiple events).  

All My Programs — on-screen report

edit

Existing metrics per program

  • List of all my programs, with links to each
  • # of events in each program
  • # of participants in each program
  • # of new editors
  • # of pages created
  • # of pages improved
  • # of new editors retained after 7 days

Proposed metrics per program  (NEW)

  • # of edits made
  • # of bytes added to pages  
  • # of bytes removed from pages  
  • # of words added
  • # of Commons uploads  
  • # of Wikidata items created
  • # of Wikidata claims created (claim=statement + identifier)
  • # of views to all pages created (cumulative, since the event)
  • # of views to all pages edited (cumulative, since the event)
  • # of views to all images uploaded (cumulative, since upload)
  • # of plays to uploaded video and audio files (cumulative, since the event)
  • # of pages in which uploaded images are placed
  • # of pages created that were subsequently deleted
  • Gender breakdown of all programs [must be manually entered per event by organizer]
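The cumulative pageview figures above would presumably come from the public Wikimedia Pageviews REST API. As a sketch, this is how a per-article request URL is constructed for that documented endpoint (the article name and date range here are just examples):

```python
from urllib.parse import quote

# Sketch: build a Wikimedia Pageviews REST API per-article URL.
# Endpoint shape per the wikimedia.org REST API docs; dates are YYYYMMDD.

def pageviews_url(project, article, start, end):
    title = quote(article.replace(" ", "_"), safe="")
    return (
        "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        f"{project}/all-access/user/{title}/daily/{start}/{end}"
    )

print(pageviews_url("en.wikipedia", "Ada Lovelace", "20180901", "20180930"))
# → https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/en.wikipedia/all-access/user/Ada_Lovelace/daily/20180901/20180930
```

Summing the daily view counts returned for each page created or edited during an event would yield the cumulative totals described above.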

Give feedback here about these reports and the data in them.

Single Program-level reports

edit

Metrics for each event in a single program

Single Program — on-screen report

edit

Existing metrics per event

  • List of all events in program, with link to each
  • # of participants
  • # of new editors
  • # of editors retained after 7 days
  • # of pages created
  • # of pages improved

Proposed metrics per event  (NEW)

  • Length of the event (days / hours)
  • Event type (editathon, content drive, training session, etc.)
  • # of edits made
  • # of bytes added to pages  
  • # of bytes removed from pages  
  • # of words added
  • # of Commons uploads  
  • # of Wikidata items created
  • # of Wikidata claims created (claim=statement + identifier)
  • # of views to all pages created (cumulative, since the event)
  • # of views to all pages edited (cumulative, since the event)
  • # of views to all images uploaded (cumulative, since upload)
  • # of pages in which uploaded images are placed (cumulative, since upload)
  • # of plays to uploaded video and audio files (cumulative, since the event)
  • # of mainspace pages created that were deleted (after the event)
  • Gender breakdown of program [must be manually entered by organizer per event]

Single Program — CSV & Wikitext downloadable reports (NEW)

edit
  • Spreadsheet (.csv) and wikitext files of everything above plus
  • Extended retention figures (e.g., 1 mo, 2 mo, 6 mos, 1 yr)
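For context on these retention figures, here is one plausible reading of “retained after N days,” sketched in Python: a new editor counts as retained if they edited again more than N days after the event ended. This is an assumption for illustration; the exact definition the tool uses may differ.

```python
from datetime import datetime, timedelta

# Sketch: one possible definition of editor retention.
# An editor is "retained after N days" if any of their edits falls
# more than N days after the event's end. Illustration only.

def retained(edit_timestamps, event_end, days=7):
    cutoff = event_end + timedelta(days=days)
    return any(ts > cutoff for ts in edit_timestamps)

event_end = datetime(2018, 9, 1)
print(retained([datetime(2018, 9, 12)], event_end))  # True: edited 11 days after
print(retained([datetime(2018, 9, 3)], event_end))   # False: only edited 2 days after
```

The extended figures (1 mo, 2 mo, 6 mos, 1 yr) would simply rerun the same check with larger `days` values.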

Give feedback here about these reports and the data in them.

Single Event-level reports

edit

Metrics for activities of a single event

Single Event summary — on-screen report

edit

Existing information

  • Event name
  • Wikis involved
  • Start time/date (with timezone)
  • End time/date
  • Date/time of last data update
  • # of participants
  • # of new editors
  • # of editors retained after 7 days
  • # of pages created
  • # of pages improved
  • List of all participant usernames

Proposed information (NEW)

  • Event type (editathon, content drive, training session, etc.)
  • Event partners (GLAMs etc.)
  • Event venue
  • Short description
  • Length of the event (in days / hours)
  • # of edits made
  • # of bytes added to pages
  • # of bytes removed from pages  
  • # of words added
  • # of Commons uploads  
  • # of Wikidata items created
  • # of Wikidata claims created (claim=statement + identifier)
  • # of views to all pages created (cumulative, since the event)
  • # of views to all pages edited (cumulative, since the event)
  • # of views to all images uploaded (cumulative, since upload)
  • # of pages in which uploaded images are placed
  • # of plays to uploaded video and audio files (cumulative, since the event)
  • # of mainspace pages created that were deleted (after the event)
  • Gender breakdown of event [must be manually entered by organizer]

Single Event summary — CSV & Wikitext downloadable reports (NEW)

edit
  • Spreadsheet (.csv) and wikitext files of everything above plus
  • Extended retention figures (e.g., 1 mo, 2 mo, 6 mos, 1 yr)

Give feedback here about these reports and the data in them.

Pages Created  — downloadable reports  (NEW)

edit

CSV

  • Page name
  • Page URL
  • Short description/first sentence
  • Username of creator
  • Wiki  
  • Namespace
  • # of edits to the page during event
  • # of edits to the page subsequently
  • # of bytes added to page during event
  • # of bytes added to page subsequently
  • # of words added to page during event
  • # of words added subsequently
  • Current article class (where available)
  • # of pageviews to page (cumulative, from creation to now)
  • Avg. pageviews to page per day
  • Does page still exist?

Wikitext

  • Page name / link
  • Short description/first sentence
  • Username of creator
  • Wiki
  • # of words added to page during event
  • # of pageviews to page (cumulative, from creation)
  • Avg. pageviews to page per day

Pages Improved  — downloadable reports  (NEW)

edit

CSV

  • Page name
  • Page URL
  • Username(s) of editor(s)
  • Wiki
  • # of edits to the page during event
  • # of bytes added to page during event
  • # of bytes removed from page during event
  • Net % change in page bytes during event
  • Bytes added to page subsequently
  • # of words added to page during event
  • # of words removed from page during event
  • Net % change in words during event
  • # of words added subsequently
  • Current article class (where available)
  • Avg. pageviews to page per day
  • Does page still exist?

Wikitext

  • Page name / link
  • Short description/first sentence
  • Username(s) of editor(s)
  • Wiki
  • # of bytes added to page during event
  • # of words added to page during event
  • Avg. pageviews to page per day

Give feedback here about these reports and the data in them.

Files uploaded — downloadable reports (NEW)

edit

CSV

  • File name
  • File URL
  • Description
  • Media type (image, video, audio, text, 3D)
  • Source
  • Username of uploader
  • # of pages in which file is placed
  • # of pageviews to file (from upload to now)
  • # of plays to uploaded video and audio files (from upload to now)
  • Avg. pageviews to file/day
  • Does file still exist?

Wikitext

  • File name/link
  • Description
  • Media type (image, video, audio, text, 3D)
  • Source
  • Username of uploader
  • # of pages in which file is placed
  • # of pageviews to file (from upload to now)
  • # of plays to uploaded video and audio files (from upload to now)

Give feedback here about these reports and the data in them.

Wikidata Contributions —  CSV and Wikitext downloadable reports (NEW)

edit
  • Item label and/or Q number
  • URL
  • Description
  • Username of editor(s)
  • Was item created? (y/n)
  • # of claims added to item
  • Does item still exist?

Give feedback here about these reports and the data in them.

August 17: Event Tool metrics—tell us what you want

edit

As I noted in my last post, better and more metrics will be a focus for the Event Tool project. I’m writing now to ask event organizers for your ideas about metrics and reporting: What data do you need to show? What are you looking to accomplish? What filtering and other features are most important?

To provide some prompts for your thinking, I’ve posted a breakdown of the reports available now from the existing tools Grant Metrics and the Program and Events Dashboard, which many event organizers have used. Our operating assumption is that the Event Tool will use Grant Metrics for reporting (in fact, it may ultimately be just a series of enhancements to Grant Metrics).  So as you look at the lists linked to above, pay particular attention to Grant Metrics, especially to any key features you think it lacks.

To help organize the discussion, I’ve listed some specific question areas on the talk page; please provide answers there.

August 9, 2018: event tool feature ideas (part 1)—what do you think?

edit

As described below, the decision was made earlier to approach this initiative through a series of interconnected tools that address the needs of different types of program organizers. These new tools will complement existing tools like Grant Metrics.

The first tool under consideration—working title Event Tool (ET)—is designed to help people organize and run wiki events. Over the past few weeks I’ve been interviewing editathon organizers about their process and problems. The sections below look at the parts of their workflow I think new tools might improve. For each, I describe the process as it is now and propose some ideas for addressing the problems associated with it. Just as importantly, in each section you'll find a list of questions. If you’re an event organizer, please use the talk page to help guide our next steps. We’re listening. —JMatazzoni (WMF)

Assumptions and caveats:

  • Metrics a focus: In line with the  Wishlist proposal, easier and better data reporting about events is a core goal of this tool. I’ve described some efficiency and convenience features below as well.  But my expectation is that metrics will be the focus of initial product releases—unless event organizers tell us they have a different priority.
    • That said, this post doesn’t cover new metrics and reporting features per se; it covers the event workflow only from event creation through the day of the event.  Most of the ideas described below, however, set the stage for better metrics by providing new or improved ways for getting data into the system. Look for a post about data reporting in coming weeks.
    • Grant Metrics  will probably handle most or all data reporting for this tool. In fact, it’s not clear whether what we're presently calling the Event Tool (ET) will be a distinct tool in itself or simply a series of enhancements to Grant Metrics.
  • For organizers only:  The features described below are for organizers. With the exception of the Signup page, event participants will not see or use the ET.
  • Beyond editathons: My initial research has been with editathon organizers but we hope the functionality we create will be useful for organizers of other events as well—such as content drives. If you organize different types of wiki events, please let us know what types, whether the tools described would help you or not, and what we’d need to change to address your needs.

Step 1: Organizer creates an event in the system (and Event page on wiki)

edit
How it works now
  • Organizers manually build an Event Page on wiki. Some also enter events into the Dashboard, primarily in order to get metrics later.
Problems with current process
  • Effort duplication / manual copy and paste of data: E.g., copying participants’ names from a signup database (on EventBrite, say) to both the wiki page and the metrics system, or copying a list of articles from the Event Page to a metrics system.
  • Page creation is complex: Event pages can be difficult and time consuming to produce, which means organizers can’t delegate the process.
  • The Dashboard and Grant Metrics don’t accommodate all desired data types: E.g., organizers would like to report on all events with a certain partner or to limit reports to a list of suggested articles (the “worklist”).
Proposed solutions/feature ideas
  • Form-based data entry into a central event database: By entering event data into an Event database, we can achieve create-once-publish-many efficiencies. The Event Creator form can include new data types that organizers want but don't currently have in Grant Metrics.
  • Automated wiki page creation: Based on the event info, the ET could generate selected sections of a wiki Event page (or perhaps a complete but generic Event page).
  • Automated wiki page updating: The ET might continue to update the automatically generated sections of the Event page. Organizers would still be able to customize the automatic wiki page by adding sections not under ET control.
Questions for organizers -- Post answers here!
  • What types of information would you like to be able to report on?
  • How important a feature is automatic wiki page creation and updating to you?
    • Are you willing to accept a new and perhaps more standardized page design to get auto  page creation?
    • What parts of the Event page change the most after creation, and so would benefit most from automatic updating (e.g., the worklist)? Is updating important to you?
  • If participants’ usernames were in a database you could consult any time, would you still need to publish them on the Event page? Why?

Step 2: Participant sign-up

edit
How it works now
  • Organizers want participants to RSVP for event-planning purposes. They also need to put user data in their systems for metrics and outreach purposes. The data collected and the signup mechanisms vary considerably.
Problems with current process
  • Data has to be transferred manually: E.g., by copying and pasting from EventBrite or a Google Doc to the wiki Event page, a MailChimp mailing list, the Dashboard for metrics, etc.
  • No email opt-in: Third party sites like EventBrite provide participants no opportunity to opt in for future event notification from the organizer.
  • Third-party services may not match Wikimedia privacy policies: EventBrite, for example, has its own privacy and security policies.
Proposed solutions/feature ideas
  • Standard participant signup form: Based on event data organizers add to the system, the ET can generate a standard signup form to acquire basic user data—name, email, username. The form will not be on wiki; participants will find it by following a link from the Event page. Organizers will  be able to export user data—e.g., to give to partner organizations.
  • Customizable form-setup facility: To accommodate the different types of user information organizers want to collect and track, we could provide limited flexibility in form configuration. E.g., organizers may be able to choose from a predetermined set of questions/elements, and to specify which are required or optional.
    • The form can’t include fully-configurable free-text elements, since organizers might use them to acquire data our privacy policies don’t allow the Foundation to store.  
Questions for organizers -- Post answers here!
  • If we offer only a standard signup form (at first), would that work for you? What would a standard form need to include?
  • If we can offer a tool that lets you select from a predefined set of signup form options, what would you need it to offer (e.g., questions about gender or age, a requirement to approve a safe-space policy...)? Be selective: the simpler this tool is, the more likely it is to get built.
    • In particular, should a checkbox for email-list opt in be standard or optional?

Step 3: Wiki account creation

edit
How it works now
  • Most organizers don’t require users to create a wiki account in advance, fearing it will discourage signups. This means that newbies often need to register on the day of. But for security reasons, the wikis allow only a limited number of accounts to be created from one IP during a given timeframe. This creates problems.
Problems with current process
  • Account creation wastes time during the event: Organizers can apply for the Event Organizer right, which lets them create more than the allowed number of accounts from one IP. But the person with the Event Organizer right has to personally register the participants, which can cause significant delays at the start of events.
    • Even if everyone coming to an event had pre-registered for the wikis, on English Wikipedia and some other projects new users need Autoconfirmed status in order to create new articles.
  • Participants sometimes get blocked: This happens even when event organizers have played by all the rules.
  • Restricted choice of event leaders: The Event Organizer right can’t be delegated, so the person with that authority needs to be on hand at events personally to register users.
Proposed solutions/feature ideas
  • Promote and facilitate wiki registration during event signup: We may be able to integrate wiki registration directly into the event signup page, creating a seamless process and ensuring that fewer attendees show up on the day of the event without usernames. If we can’t embed registration, we can create a signup flow that makes registration as attractive as possible.
  • Bulk account creation tool: No matter how attractive we make advance registration, there will always be drop-in attendees on the day of the event. To help Event Creators move more quickly, we could make a tool that would let them register multiple participants at one time.
  • Post messages to newbies’ user pages: To shield newly registered participants from patrollers, we could post notices upon signup to their user pages announcing that they are part of an event and directing questions to the event organizer.
Feedback from organizers -- Post feedback here!

Step 4: Participants check-in (day of)

edit
How it works now
  • Organizers want to know and document who actually came to the event. They may need to send this info to a local chapter or partner organization, and it can be useful later when generating reports. Mechanisms for check-in vary widely: some organizers ask people to sign in on wiki; some use a paper sheet; others simply count anyone who edited certain pages during a given time as a participant.
Problems with current process
  • Manual data entry: E.g., of usernames into metrics systems.
  • Inaccurate metrics: Making assumptions about who attended can be misleading.
  • Lack of participant data: Without attendance records in a database, organizers can’t know, for example, that certain users attended multiple events.
Proposed solutions/feature ideas
  • Class Roster with check-in facility: A class Roster would let organizers check in people who show up. The Roster should also let organizers add usernames to the records of users who signed up but didn’t register for the wiki in advance.
    • To help organizers monitor activity during the event, the Roster might include links to all attendees’ Contributions and Talk pages.
Feedback from organizers -- Post feedback here!

Reaching out to people who voted for this wishlist item:

edit

This project—inspired by the Wishlist item you voted for—is in the research and design phase now. Input from event organizers is essential! Please read the post above and watch this page if you have a continuing interest.

@Joalpe:,  @Ijon:,  @Krishna Chaitanya Velaga:, @Discott:, @David1010:,  @Liuxinyu970226:,  @Jc86035:,  @Megs:, @Thomas Obermair 4:, @Shizhao:, @Pbsouthwood:, @Bspf:,  @Donald Trung:, @Wikinade:, @Nick Moyes:, @Zhangj1079:, @OrsolyaVirág:,  @B20180:,  @Exilexi:,  @Rachel Helps (BYU):, @Mickey83:,  @Becksguy:, @VIGNERON:, @Ozzie10aaaa:, @Superchilum:, @Romaine:,  @Theklan:,  @JenOttawa:,  @Ckoerner:, @Arian:, @Kerry Raymond:, @Doc James:, @Muhraz:, @Hexatekin:, @Pamputt:, @Satdeep Gill:,  @NickK:, @Townie:, @Martin Urbanec:, @Vachovec1:, @Battleofalma:JMatazzoni (WMF) (talk) 18:36, 10 August 2018 (UTC)

@YjM:, @Theredproject:,  @Mckensiemack:, @Hasive:, @Nattes à chat:,  @Winged Blades of Godric:, @Meredithdrum:, @Artchivist1:, @Paucabot:, @FULBERT:,  @Rhododendrites:, @JaxieDad:, @PointsofNoReturn:, @Annamariandrea:, @Halibutt:, @Masssly:, @Voltaireloving:, @Sarahobender:, @Davidpar:, @HugoHelp:, @Vallue:, @Medol:, @Tiputini:,  @Monikasj:, @Stinglehammer:, @Caorongjin:, @John Cummings:, @Fixer88:JMatazzoni (WMF) (talk) 18:51, 10 August 2018 (UTC)

@Spiritia:, @Epìdosis:, @LornaMCampbell:, @Mtmlan84:, @Ouvrard:, @Mauricio V. Genta:, @Anthere:, @Kvardek du:, @Jorid Martinsen (WMNO):, @Uncommon fritillary:, @African Hope:, @Astrid Carlsen (WMNO):, @Richard Nevell (WMUK):, @Alangi Derick:, @Talueses:, @Wittylama:, @VMasrour (WMF):, @TrMendenhall:, @Ecritures:, @MichaelMaggs:,  @Pharos:, @Heathart:, @John Andersson (WMSE):, @Haxpett:, @Psychoslave:, @Anne-LaureM:, @Sylvain WMFr:, @Yohannvt:, @Lirazelf:, @ArtEmJJ:, @Jack who built the house:JMatazzoni (WMF) (talk) 19:01, 10 August 2018 (UTC)

@Chicovenancio:, @جواد:, @Xavier Dengra:, @לסטר:, @Esh77:, @Bijay chaurasia:  @Shangkuanlc:JMatazzoni (WMF) (talk) 19:41, 10 August 2018 (UTC)

@Jmatazzoni:JMatazzoni (WMF) (talk) 16:08, 13 August 2018 (UTC)

July 17, 2018: Are you an editathon organizer? Let’s talk about better tools!

edit

I’m Joe, a product manager at the Wikimedia Foundation, and I just inherited this project from Trevor (thanks for your help, T!). As a first step I’m reaching out to people who organize editathons so I can learn more about your workflow and needs. I’d like to better understand what works for you and what doesn’t about your current tools, and what ideas you have for tools that could make you more effective. I want to make sure I speak with organizers from countries other than the USA, though I need to conduct interviews in English.

If you’re an experienced organizer who could benefit from better technology for metrics, signups, event promotion or other parts of the editathon process, I’d like to set up a time when we might chat via video conference (Google Hangouts). To respond, please:

  • Send me a contact email where I can reach you
  • Please tell me briefly how many and what type of editathons you’ve worked on and
  • Let me know what time zone you’re in (I’m in California time).

You can reach me on my talk page or here: jmatazzoni[at]wikimedia.org. I'm researching this right now, so please don't wait to get in touch. If you have ideas but don’t have time to talk, please leave them on the project talk page. I’m looking forward to learning more about your work! —JMatazzoni (WMF) (talk) 17:52, 17 July 2018 (UTC)

April 25, 2018

edit

Sorry for the delay — a lot of progress has been made and I am looking forward to providing everyone with more information. A lot of this is duplicative of what I presented at Wikimedia Conference in Berlin last weekend, where I connected with an incredibly helpful handful of program and event organizers. The slides can be found here. The input and experiences were valuable and informative to my work on this project.

Survey

We wrapped the survey (see below) a few weeks ago. It received 31 responses from 27 different affiliates or organizations — thank you if you participated!

I'm working on getting these responses into a form appropriate to share on-wiki (per the WMF's survey policy). In the open-ended survey responses, I've manually identified and documented 83 separate descriptions of how people use the dashboard, 35 pieces of praise, 17 ideas, and 111 frustrations. Some of these uses, praise, and frustrations overlap (which is great!), but there is a long list of one-off comments. The most common frustrations concern the limitations of the data available and the lack of connection to the wiki itself. The most common praise was about the Dashboard's ease of use.

Project scope

After reviewing the responses from the survey and discussing the bandwidth and strengths of the Community Tech team we have decided on a project direction. I've renamed this page and the associated project to "Tools for program and event organizers" because we are moving forward with this Wishlist project in a different manner than simply making changes to the dashboard.

Rather, our current project direction is to build tools that complement the existing Outreach Dashboard. We believe there is room for granular, tactical tools that address different use cases than the dashboard currently serves. There is a large amount of diversity among different programs from Edit-a-thons to GLAM content donation to 'Wiki Loves' type content drives. We believe we'll be able to deliver more value by building niche tools than expanding the dashboard. Additionally, a dashboard is only one type of tool and we will likely build out other types of tools. Some of these tools will likely work more closely on wiki, which isn't something the dashboard can do. In the end, we want to have a suite of interconnected tools that are useful to a wide audience of program and event organizers.

Our primary goal is to build highly useful functionality for concrete, defined target audiences (e.g., organizers of edit-a-thons, organizers of content drives, education organizers, wikimedians-in-residence, etc.). Many of these uses are partly served by the existing dashboard, but each is distinct enough to benefit from individual attention.

Next steps

The work falls on me (Trevor) as product manager to enumerate the potential target audiences and the pros/cons of focusing on each one. I'll then identify the most common use cases for these audiences. (Alex has a very strong suspicion that the two major areas will be cohort wrangling and data reporting, with which I agree.) Once we all (WMF, and all project stakeholders here on wiki!) agree on a prioritized list, we'll start defining the specifics for each tool, including design, so that usability is a forethought of this project, not an afterthought.

As this "suite of tools" project crystallizes, we will answer a lot of the questions swirling at the moment: How are campaigns represented in the tools? How do we (if at all) take advantage of our recent work on Grant Metrics? How does this relate to the hashtag tool or the worklist tool? Etc. I'll be updating this project page as things progress and decisions are made.

It's looking like product definition will still take a few months, but I anticipate we'll be ready for development by August of this year, with something demonstrable by December. (Scope TBD 🙃)

Thanks for reading, please let me know if anything is unclear or if y'all have any questions! 🤠

March 15, 2018

edit

The survey has been submitted to WMF legal for review under the survey policy.

I met with Sati, who gave me some good advice. She has documented interviews with grantees that may be useful and will help review our findings from the survey. She also suggested we think about proactive vs. retrospective dashboard use (e.g., using the dashboard as you prepare and run an event, vs. only entering data after the event has ended).

Also, we may want to build a full roadmap, even if we can’t commit to development of it. We'll need to be clear about our scope and commitments.

March 9, 2018

edit

We have completed two in-person interviews over Google Hangouts, and will be moving to an online questionnaire for program and event organizers. Our goal is to send this out next week (week of March 12) and to publish an anonymized summary the week of March 26. This input, along with feedback inside the dashboard itself and comments in the Wishlist proposal, will help us determine which problems this project will solve. The target deadline for publishing these problems is March 29, 2018.

The questions for the online questionnaire will be:

Basic/Demographic questions:

  1. What is your username?
  2. What is your background outside the free knowledge movement?
  3. How long have you been involved with Wikimedia programs or events?
  4. Which of the following are you most actively involved in organizing or facilitating?
    • Edit-a-thons
    • Editing Workshops
    • Conference
    • Wikipedia Education Program
    • GLAM Content Donation
    • Photo Drive or Event
    • On-wiki Writing Contest
    • Wikipedian in Residence
    • Hack-a-thon
    • Other Partnerships
    • Research Project
    • Tool Building Project
    • None
    • Other (Specify)
  5. Please briefly describe this work:

Dashboard use:

  1. How do you use the Dashboard to organize and communicate before a program or event?
  2. How do you use the Dashboard to organize and facilitate during a program or event?
  3. How do you use the Dashboard to organize and perform evaluation after a program or event?

Why the dashboard:

  1. What do you think are the greatest benefits of the Dashboard?
  2. What are your greatest frustrations with the Dashboard?
  3. What can you not do because of poor functioning of the Dashboard?
  4. What other tools/techniques do you use in addition to the dashboard? For what purposes?
  5. Any other comments?

February 18, 2018

edit

We have completed two user interviews, and are awaiting a third. We will soon be moving to email-based questions to better manage our time. We will anonymize and summarize our findings here on wiki.

I hope to have identified some common problems we can address (#Problems to solve) by the end of March 2018.

We will also be speaking about the Dashboard and our work to improve it at the Wikimedia Conference 2018 in Berlin on April 20-22.

January 29, 2018

edit

We would like to conduct 3-5 interviews with program organizers to hear first-hand about the dashboard. The script for the 60-minute interviews is:

Basic/Demographic questions:

  1. What is your username?
  2. What is your background outside the movement?
  3. Which Wikimedia projects have you contributed to? What experience do you have?
  4. Have you supported work with an Affiliate?
    1. What kinds of programs do you organize?
  5. Which of these programs have you used for the Programs and Events Dashboard?
  6. How long and/or often do you use the Dashboard?
  7. Have you ever used project management software in other contexts? Are you familiar with classroom management tools in the educational context? If yes, explain:
    1. What was it? What worked well for you?
    2. Why?

Why the dashboard:

  1. Event stages
    1. How do you do communications and organizing before a program or event? Does it include the dashboard? Can you show me an example?
    2. How do you do communications and organizing during a program or event? Does it include the dashboard? Can you show me an example?
    3. How do you do communications, organizing, and evaluation after a program or event? Does it include the dashboard? Can you show me an example?
  2. Why did you choose to use the Programs and Events Dashboard for those programs but not for others?
  3. What do you think are the greatest benefits of the dashboard?
  4. What do you think are the greatest challenges?
  5. Would you recommend the dashboard to affiliates? Why or why not?
  6. Which features of the dashboard affect your ability to organize with an affiliate or campaign?
  7. What can you not do because of poor functioning of the dashboard or an alternative tool?
  8. What other tools/techniques do you use in addition to the dashboard? For what purposes?
  9. What are your favorite metrics or tracking tools for other Wikimedia programs? Why?

January 17, 2018

edit

Common workflows for Program/Event organizers:

  • Organizers can create different levels of structure: Campaign > Programs (aka Events)
  • Entry points to create Programs are not 100% intuitive; there’s room for usability improvements
  • ‘When’ and some other data fields are not available when creating an event, only when editing one; there’s room for usability improvements
  • Sometimes volunteer activity (usually from power editors) gets into the system and muddles the data. This can be addressed by having users sign up for specific articles, so that their unrelated edits are not displayed for the program
    • E.g., Apples attends a ‘Books’ edit-a-thon but earlier in the day edited articles about politics; only the edits about books should show
  • A goal of edit-a-thons that the Dashboard’s metrics don’t currently capture: attendees walk away with enough skills to be useful contributors someday
  • Attendees can provide feedback in the dashboard, but at in-person events this is done verbally
  • Participants can add articles, see the list of articles, see what others are doing, and see some data
  • At the end of the event, organizers can…
    • Download data in a CSV file (one click download)
    • The download only covers Wikipedia, Commons, and (some) Wikidata; it should cover more of the projects to be fully useful
    • There is no visual display of the data, just numbers
    • Open questions: what types of edits were made, and what was their quality?
    • Metrics from Programs are rolled up into Campaigns
    • Is there a download for Campaigns data? TBD.
  • Training modules may be buggy, include no assessment (quizzes, completion data), and are not required

January 12, 2018

edit

Based on initial conversations and reading, I've learned that a program is any type of time-bound, organized contribution effort, usually within a defined topic. Programs vary greatly in size, type, and complexity, but the most common types are in-the-classroom education, Wikipedian-in-residence, edit-a-thons, and photo/edit/data drives. The Dashboard is used during three stages of program organization — planning, event logistics, and follow-up — but excels at event-management logistics and follow-up metrics gathering. It is user friendly (it doesn’t require experience with wiki templates and markup) and is reliable for repeat organizers.

The biggest deficiency of the Dashboard seems to be that it is built as a one-size-fits-most tool, while different groups want or need it to do things specific to their individual programs. I'm not sure there’s anything significantly wrong with the existing tool, but I expect there are some odd workarounds that users have developed which we could address. We’ll likely receive many specific yet varied requests from Education program organizers, along with lots of requests for improving the Dashboard’s data/metrics reporting. Given the diversity of uses, we’ll probably hear some opposing pieces of feedback from organizers, but that’s par for the course.

Next steps for this project: Alex will walk Trevor through the most common workflows for organizers (on wiki, in the dashboard, and in whatever other tools they use) when they start a new program, run their first event, and follow up after the event/program has ended. From this I’ll create a user interview script, which I will use during calls with several real-world program organizers. I hope to learn what parts of the workflow fail them, what they want to keep about the Dashboard, and what functionality is completely absent.

December 20, 2017

edit

In an initial meeting with the Community Programs team, we discussed what this project will look like. We agreed to start with a user research approach that will help us evaluate and prioritize the use cases we need to support.

We discussed the existing uses of the dashboard — metrics gathering, event management, training, and multi-event (campaign) tracking.

We discussed that some of the risks for this project are design (Community Tech will have more visual-design resources in 2018), community ripple effects, and fatigue due to new tool adoption.

December 13, 2017

edit

The team will start investigating this project early in 2018. If you've got suggestions or questions, please write your thoughts on the talk page!

Important links

edit