Users' time on Facebook increases 48%; overall Internet time increases 5.6%
The following chart (using comScore data) shows how users' time online has shifted over the past year. While overall time spent online increased by about 6%, time on Facebook alone increased 48%. The category "Other" increased 49%. Time on WMF sites has remained relatively flat (around 12 minutes/user/month).
An open question is how the shift toward Facebook, social networking, and gaming is affecting people's choice to edit Wikipedia. This data covers worldwide internet users, who for WMF projects are mainly readers. More work is required to understand the impact this trend may be having on editors and potential editors. Howief 00:45, 22 December 2010 (UTC)
2010 Fundraiser Kicks Off -- CTR and Conversion Numbers
The 2010 Fundraiser officially kicked off today. So far, the results have been very, very strong. The fundraiser started a slow ramp last Friday. Yet even during the ramp-up period (Friday-Sunday), the Fundraiser generated over $400k in donations per day. Those interested in tracking the contributions per day may visit the Contribution Statistics page. An early thanks to all of our donors!
In addition to the total donation amounts, I'll publish click-through and conversion rates, as I know those are of great interest to web folks. I'm frequently asked "what's a typical click-through rate for a banner?" and "what's a typical conversion rate for a credit card page?" All of the fundraising data is open, so I'm hoping this will provide a reference point for answering these questions. Keep in mind that the Wikimedia Foundation fundraiser is unique in many ways (e.g., it's one of the largest online fundraisers of its kind, with a mission many people feel strongly about). Nonetheless, the click-through rates and conversion rates could be a useful reference for people doing web-related projects. Though data from the current fundraiser isn't available yet, here is some data from previous tests the fundraising team ran on October 26 and 28.
Click-through rate: The click-through rates for the banners ranged from 0.33% to 2.87% (see example). The banner that performed best was a graphical banner of Jimbo.
| Banner description | Click-through rate | Total donations (USD) | Number of donations | Average donation (USD) |
|---|---|---|---|---|
| "Admit it- without Wikipedia, you never could have finished that report." | 0.90% | $2,901.08 | 140 | $20.72 |
| "You depend on $sitename for information. Now it depends on you." | | | | |
| "Thanks for the brain massage" | 0.33% | $582.21 | 19 | $30.64 |
| "Jimmy appeal" | 2.87% | $47,433.28 | 1,537 | $30.86 |

Only one landing page (Appeal14) was used for the 3 trial banners.
Conversion rate: After clicking on a banner, the user goes to a landing page with the donation and payment information (see example). The conversion rates on these landing pages ranged from 0.89% to 1.84%, which seems fairly typical. You'd also expect these conversion rates to vary less than banner CTRs, since these users have self-selected by clicking on a banner.
| Landing page description | Conversion rate | Total donations (USD) | Number of donations | Average donation (USD) |
|---|---|---|---|---|
| Ask string and separate buttons for PayPal and credit cards; most extra links removed from donation side of screen | 1.84% | $12,462.08 | 463 | $26.92 |
| Radio buttons to select type of CC or PayPal and one donate button | 1.71% | $11,585.85 | 428 | $37.07 |
| Ask string and separate buttons with spot for email and opt-out checkbox | 1.75% | $11,835.54 | 429 | $27.59 |
| Ask string and only one donate button going to CC form | 0.89% | $7,395.17 | 224 | $33.01 |
| Ask string and only one donate button going to PP form | 1.53% | $9,610.43 | 383 | $25.09 |

Only one banner was used (the "Jimmy campaign" banner), with 10% of its traffic going to each landing page (50% total).
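To make the arithmetic behind these tables explicit, here is a small sketch using the figures published above. Landing-page visit counts were not published, so the visit figure at the end is only an implied approximation, not a reported number.

```python
# Sanity-check the averages and rates quoted in the tables above.

def average_donation(total_usd: float, n_donations: int) -> float:
    """Average donation in USD, rounded to the cent."""
    return round(total_usd / n_donations, 2)

# "Jimmy appeal" banner: $47,433.28 across 1,537 donations.
assert average_donation(47433.28, 1537) == 30.86

# Conversion rate = donations / landing-page visits. Visit counts
# weren't published, but they can be backed out approximately:
def implied_visits(n_donations: int, conversion_rate: float) -> int:
    return round(n_donations / conversion_rate)

# 463 donations at a 1.84% conversion rate implies roughly 25,000 visits.
print(implied_visits(463, 0.0184))  # 25163
```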
I'll post CTR and conversion rate data from the Fundraiser as it becomes available.
Howief 23:54, 15 November 2010 (UTC)
Rating Articles on Wikipedia -- Quality and Participation
A few weeks ago, the Public Policy Initiative launched an experimental feature called Article Feedback. This feature allows anyone to rate a Wikipedia article along four dimensions: Well-sourced, Neutral, Complete, and Readable. The ratings from this feature should give both readers and editors of Wikipedia a measure of how people perceive the quality of Wikipedia articles. The feature has been applied to a wide range of articles, including highly trafficked pages such as United States Constitution, Don't Ask, Don't Tell, and Education in the United States. It has also been applied to pages with smaller readership such as Tennessee Valley Authority, War on Poverty, and the Four Freedoms. The feature has gotten quite a bit of use so far, with over 12k ratings since it launched six to seven weeks ago.
Interesting data has come out of the experiment about how users rate articles. For example, we know that anonymous users tend to give higher ratings than registered users (users with a Wikipedia account). This is not surprising considering that registered users include Wikipedia editors, who are known to be very critical of their own work. Because different user groups appear to rate articles differently, we may want to let readers of a Wikipedia article separate out the ratings from specific groups of users. Here are some graphs that show the difference between anonymous and registered users.
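The grouping behind this anonymous-vs-registered comparison can be sketched in a few lines. This is a hypothetical illustration: the record layout and scores below are made up for the example, not actual Article Feedback data.

```python
from statistics import mean

# Hypothetical rating records; the real Article Feedback data
# has more fields (article, dimension, timestamp, ...).
ratings = [
    {"group": "anonymous", "score": 4},
    {"group": "anonymous", "score": 5},
    {"group": "registered", "score": 3},
    {"group": "registered", "score": 4},
]

def mean_rating_by_group(records):
    """Average score per user group (e.g., anonymous vs. registered)."""
    groups = {}
    for r in records:
        groups.setdefault(r["group"], []).append(r["score"])
    return {g: mean(scores) for g, scores in groups.items()}

print(mean_rating_by_group(ratings))  # {'anonymous': 4.5, 'registered': 3.5}
```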
Another set of interesting questions is around participation. Can ratings change the way people participate with Wikipedia? If so, how?
Right now, the main way for users to start contributing to Wikipedia is by editing. A user needs to click on the edit button, learn a little bit of wikitext, make their edit, and click save. As many have pointed out, this is not the easiest process. Wikitext can be daunting for a number of users. There is also plenty of anecdotal evidence from readers that it simply doesn't occur to them that they can edit a page. (Next time someone mentions something they read in a Wikipedia article, ask them "Did you know you can edit that article?" Most likely, they will say "no" or something along the lines of "I know anyone can edit Wikipedia, but I never thought of editing.")
In his recent book Cognitive Surplus, Clay Shirky writes at length about the subject of participation. One of the themes in his book is the difference between "doing nothing" and "doing something". In Shirky's view, the quality of creative work exists along a continuum: some work is exquisite, some work is awful, and most work is somewhere in between. Participation, however, has a slightly different nature. In Shirky's words, the "real gap is between doing nothing and doing something." Over the past few years, technology combined with social trends has enabled more people than at any other time in history to cross that gap.
Most Wikipedia readers, however, "do nothing" (at least as it relates to Wikipedia). This is not to say they don't derive value from it – readers certainly gain from the knowledge contained in the articles. But when it comes to contributing to the encyclopedia, they don't.
The question then becomes: how do we take the almost 400M visitors to Wikipedia from doing nothing to doing something? Through this lens, ratings are a way to get users to "do something." Rating an article is much easier than editing one. All you need to do is read the article and click on some stars (ideally after giving the ratings some thought). Ratings provide a low-barrier way for people to contribute to Wikipedia. And the numbers show that readers do want to contribute. Even with the experimental tool at the bottom of the page, many pages have received over a hundred ratings to date. If the ratings tool were placed in a more prominent area, I'm sure these articles would easily get 2-3x the number of ratings they get today. So from our early data, it looks like ratings are a good way to get users to "do something."
But how valuable is this "something"? There are a few ways these ratings can be valuable. We already discussed how ratings can serve as a yardstick of quality. But what if all articles get an average of 4 stars? It remains to be seen just how well these ratings track quality, and we'll be conducting more experiments to get a better handle on this.
Ratings can also help bridge the reader and editor communities. Wikipedia articles are in a constant state of evolution, and input from readers can help shape how an article develops. In the future, we may look into features where readers can leave comments for editors such as "It would be great if this article included a discussion of. . ." or "This article appears to be biased in favor of. . ." A well-constructed rating/feedback tool could be the place where meaningful dialog between the reader and editor communities occurs.
Lastly, the feedback tool can serve as an on-ramp to editing Wikipedia. Once a user “does something”, will they be more inclined to do more? Will they be more inclined to edit the article they just rated? This idea is something I'm personally very interested in. Our early data indicate that about 0.35% of users without an account edit an article after they’ve rated it (Note – this is a very rough approximation as some of these users may have edited at some point in the past, just not the article that they rated). That’s a really small number. But let’s put this number in context. Wikimedia projects as a whole have approximately 400M unique visitors a month. Each month, about 85K, or 0.02%, of these visitors contribute. Maybe 0.35% is actually pretty good.
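That last comparison is easy to reproduce. A quick back-of-the-envelope check, using only the approximate figures quoted above:

```python
# Back-of-the-envelope participation rates from the figures above.
monthly_uniques = 400_000_000    # ~unique visitors per month, all projects
monthly_contributors = 85_000    # ~visitors who contribute per month

baseline_rate = monthly_contributors / monthly_uniques
print(f"{baseline_rate:.2%}")    # 0.02% of visitors contribute

rate_after_rating = 0.0035       # anonymous raters who go on to edit
print(round(rate_after_rating / baseline_rate))  # 16
```

By this rough measure, a reader who has just rated an article is an order of magnitude more likely to edit than a visitor at large.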
Howief 21:36, 12 November 2010 (UTC)
Why do users create accounts on Wikipedia? What happens afterwards?
Frank has been doing some very interesting analysis on account creation on various Wikipedia projects. His analysis is part of the Account Creation Improvement Project, a project aimed at "increasing the number of people who create a user account and actually start editing." Part of this project is to remove barriers to account creation. For example, does the English Wikipedia really need to require a captcha for everyone? The other part is to increase the likelihood of a user actually editing a page once the account is created. Great goals.
Frank analyzed account creations in August and came up with some cool stats. First, he looked at the percentage of users that edited after creating an account:
About 31% of users who create accounts on the English Wikipedia end up making at least one edit within the next 10 days. This number is actually higher than I expected. From our user research, we found that users create accounts for any number of reasons: some thought they'd get more features, some wanted to track articles for reading, etc. Also, editing Wikipedia isn't exactly the most straightforward thing to do. A roughly 30% figure suggests that a significant chunk of users create an account for the express purpose of editing an article. Maybe they thought an account was required for editing?
Next, Frank looked at the velocity of the edits. The following chart shows, of the users who created an account and edited, what percentage edited on the same day they created the account:
[Chart: of users with at least one edit within the first 10 days, the percentage with at least one edit on the day of account creation]
So if you're going to edit, you're probably going to edit the same day you create your account. I'd like to try different interventions right after account creation to see what effect they have on long-term editing behavior.
Here's some data on users that edit their user page after account creation:
[Chart: of users with at least one edit within the first 10 days, the percentage who edit their user page at least once on day 1, and the percentage who edit their user page at least once within the first 10 days]
I'm not surprised by this. My bet is that many of these users don't know they even have a user page. It's not like the site tells them or guides them through a profile creation process (as Facebook and almost every other site that has user pages does). I think it would be interesting to test a user page creation form right after account creation.
Interesting stuff (old)
I come across a lot of interesting research/analysis on the Wiki(p|m)edia Community, so I thought I might use my userpage to track this stuff. There will likely be a focus on better understanding user behavior using both quantitative and qualitative data. I'll start by listing some simple links and will add some commentary at a later point:
- RfA Drought and Emergence of Wikigeneration Gulf
- Newbie Article Deletion Study
- Why Do Editors Leave Wikipedia?
- Account Creation Studies:
Umeå University in Sweden
My name is Sophie Österberg and I am the education manager at Wikimedia Sweden. I am currently working with a dedicated team at Umeå University who are about to embark upon researching ways to make Wikipedia more interactive and personalised through mobile applications. At this stage we have nothing tangible to show, but I simply wanted to let you know and keep you and your team in the loop. I am following your and your team's work with mobile applications with great interest.
In case you check this page more often: I left you a message on en wiki at http://en.wikipedia.org/wiki/User_talk:Howief#Reasons_for_editors_leaving_Wikipedia --Piotrus (talk) 07:10, 12 May 2013 (UTC)