Research:MoodBar/Email confirmation
Project Summary
This project explores a few key statistics about the usage of email as a means of notification on the English Wikipedia, with a specific focus on users going through the MoodBar email confirmation funnel.
- How many registered editors have a confirmed email?
- How many MoodBar editors/users have a confirmed email?
- How long does it take editors to confirm their email address?
Methods
Data were collected from the mw:user_table on a replicated instance of the English Wikipedia database. We focus the analysis only on users registered on the English wiki, but our scripts could be reused for other wikis where MoodBar has been deployed.[1] To exclude users who originally registered on another wiki, we compare the timestamp of local user registration with the one stored in the mw:CentralAuth database and drop accounts whose local enwiki registration timestamp is later than the timestamp recorded in CentralAuth.
When users post an item of feedback using MoodBar, they are asked to register an email address if they do not already have a confirmed one. Since we want to analyze the effect of the MoodBar funnel on the volume of email confirmations, we focus only on users registered after the deployment of phase 3 of MoodBar (see MoodBar's status document).
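The exclusion and filtering logic described above can be summarized with a minimal sketch. It assumes the registration timestamps have already been pulled from the replicated user table and from CentralAuth; all names (UserRow, is_eligible, the phase 3 cutoff value) are illustrative and are not the project's actual scripts.

```python
from typing import NamedTuple, Optional

# Lower bound of the analysis: MoodBar phase 3 deployment, in MediaWiki
# TS_MW format ("YYYYMMDDHHMMSS"), which compares correctly as a string.
# The exact value here is illustrative.
PHASE3_DEPLOYMENT = "20111214000000"

class UserRow(NamedTuple):
    user_id: int
    local_registration: str              # registration timestamp on enwiki
    central_registration: Optional[str]  # registration timestamp in CentralAuth

def is_eligible(row: UserRow) -> bool:
    """Keep accounts first registered on enwiki after MoodBar phase 3.

    If the local enwiki registration is later than the CentralAuth one,
    the account was originally created on another wiki and is excluded.
    """
    if row.local_registration < PHASE3_DEPLOYMENT:
        return False
    if row.central_registration and row.local_registration > row.central_registration:
        return False
    return True

if __name__ == "__main__":
    sample = [
        UserRow(1, "20120105120000", "20120105120000"),  # registered on enwiki: keep
        UserRow(2, "20120301083000", "20110810000000"),  # registered elsewhere first: drop
    ]
    print([u.user_id for u in sample if is_eligible(u)])  # -> [1]
```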
Definitions
We look at three categories:
- all users: everyone who registered an account within the eligibility window.
- active users: users who, in addition to registering an account, clicked at least once on the “Edit” button and hence have the MoodBar extension active.
- MoodBar users: users who, in addition to activating the extension, also used it to send at least one item of feedback. Within this group we further distinguish by MoodBar-related treatment (see the sketch after this list):
- Feedback: users who sent at least one feedback item but never received any response
- Feedback+Response: users who sent at least one feedback item and received at least one response, but did not mark any response as “Helpful”
- Feedback+Helpful: users who sent at least one feedback item, received at least one response, and marked at least one response as “Helpful”
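As a rough illustration of this category assignment, the following sketch maps three per-user flags (derived, for example, from the MoodBar feedback and response tables) to the treatment labels above; the flag and function names are our own and not part of the actual analysis code.

```python
from enum import Enum
from typing import Optional

class Treatment(Enum):
    FEEDBACK = "Feedback"
    FEEDBACK_RESPONSE = "Feedback+Response"
    FEEDBACK_HELPFUL = "Feedback+Helpful"

def moodbar_treatment(sent_feedback: bool,
                      received_response: bool,
                      marked_helpful: bool) -> Optional[Treatment]:
    """Return the MoodBar treatment group, or None for non-MoodBar users."""
    if not sent_feedback:
        return None                       # "all" or "active" user, not a MoodBar user
    if marked_helpful:
        return Treatment.FEEDBACK_HELPFUL
    if received_response:
        return Treatment.FEEDBACK_RESPONSE
    return Treatment.FEEDBACK
```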
The eligibility window for this analysis runs from December 14, 2011 to June 29, 2012, and is further subdivided into three periods, as shown in Table 1. Users are assigned to a period based on their registration timestamp.
Table 1: Eligibility periods
Period | Starting date | Ending date | Description |
---|---|---|---|
historical | December 14, 2011, 00:00:00 | May 22, 2012, 23:59:59 | MoodBar phase 3 |
treatment | May 23, 2012, 00:00:00 | June 13, 2012, 23:59:59 | Enhanced tooltip deployed.[2] |
control | June 14, 2012, 00:00:00[3] | June 29, 2012, 23:59:59 | MoodBar blackout |
For more information on the meaning of the three periods, see the experimental design document and the results of the controlled experiment.
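The assignment of users to periods described above amounts to a simple timestamp lookup. A minimal sketch follows, assuming registration timestamps in MediaWiki's TS_MW string format; per note [3], June 14, 2012 is skipped in the control boundaries. The names are illustrative.

```python
from typing import Optional

# Period boundaries from Table 1, in TS_MW format ("YYYYMMDDHHMMSS"),
# which sorts correctly as a plain string.
PERIODS = [
    ("historical", "20111214000000", "20120522235959"),
    ("treatment",  "20120523000000", "20120613235959"),
    ("control",    "20120615000000", "20120629235959"),  # June 14 excluded, see [3]
]

def period_of(registration_ts: str) -> Optional[str]:
    """Map a registration timestamp to its period, or None if ineligible."""
    for name, start, end in PERIODS:
        if start <= registration_ts <= end:
            return name
    return None
```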
Metrics
We report the following metrics (a computation sketch follows the list):
- Proportion of accounts with a confirmed email address among all users, active users (as tracked by the EditPageTracking extension), and MoodBar users, with a breakdown by treatment.
- Average lag between account registration and email confirmation.
- Percentage of MoodBar users who confirmed their email address after posting their first MoodBar feedback item.
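A hedged sketch of how these three metrics could be computed, assuming a pandas DataFrame with one row per eligible account; the column names (group, registration_ts, email_confirmed_ts, first_feedback_ts) are illustrative and not the actual schema used by the project's scripts.

```python
import pandas as pd

def summarize(users: pd.DataFrame) -> pd.DataFrame:
    """Per-group email confirmation metrics.

    Expects one row per eligible account with columns:
      group               -- "all", "active", or a MoodBar treatment label
      registration_ts     -- registration time (datetime64[ns])
      email_confirmed_ts  -- confirmation time (NaT if never confirmed)
      first_feedback_ts   -- first MoodBar feedback time (NaT for non-MoodBar users)
    """
    df = users.copy()
    df["confirmed"] = df["email_confirmed_ts"].notna()
    df["lag_days"] = (
        df["email_confirmed_ts"] - df["registration_ts"]
    ).dt.total_seconds() / 86400
    # Confirmed only after posting the first MoodBar feedback item
    df["confirmed_after_feedback"] = df["confirmed"] & (
        df["email_confirmed_ts"] > df["first_feedback_ts"]
    )
    return df.groupby("group").agg(
        group_size=("confirmed", "size"),
        prop_confirmed=("confirmed", "mean"),             # multiply by 100 for %
        mean_lag_days=("lag_days", "mean"),               # unconfirmed accounts ignored
        prop_confirmed_after_feedback=("confirmed_after_feedback", "mean"),
    )
```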
Results
Figure 1 shows that the percentage of accounts with confirmed emails is higher among MoodBar users (40%) than among active users (30%) or all registered accounts (31%). The lag between account registration and email confirmation varies across the three groups: 30 days for all users, 2.38 days for active users, and 10.5 days for MoodBar users. The higher confirmation lag of MoodBar users relative to active users might be a consequence of the opportunity to register an email address at the end of the MoodBar funnel. To test this hypothesis we focus on the sample of MoodBar users: 7,789 users belong to this category, and 3,319 of them (42%) confirmed their email after posting their first MoodBar feedback item. These results suggest that the MoodBar funnel provides an effective context for asking users to register an email address.
We now restrict the comparison to active users (the reference group) and MoodBar users, broken down by treatment type in the three periods. Table 2 reports the results. Confirmed emails are more common among MoodBar users who marked at least one response to their feedback items as “Helpful” (Feedback+Helpful: 63% and 58% in the historical and treatment periods, respectively). This suggests, albeit with some caution, that email notifications might be an effective tool for learning about responses to feedback items posted with MoodBar.[4]
Table 2: Email confirmations by period and treatment type
Period | Treatment type | Group size | Confirmed emails (no.) | Confirmed emails (%) | Mean confirmation lag (days) | Confirmation lag std (days) |
---|---|---|---|---|---|---|
historical | Active users (reference) | 479,437 | 142,775 | 29.78 | 1.70 | 14.77 |
historical | Feedback | 7,952 | 2,966 | 37.30 | 6.73 | 25.53 |
historical | Feedback+Response | 4,548 | 1,864 | 40.99 | 6.69 | 24.23 |
historical | Feedback+Helpful | 840 | 533 | 63.45 | 11.07 | 31.10 |
historical | all (subtotal) | 492,777 | 148,138 | 30.06 | 1.90 | 15.34 |
treatment | Active users (reference) | 56,512 | 16,540 | 29.27 | 0.87 | 14.55 |
treatment | Feedback | 882 | 283 | 32.09 | 1.97 | 8.70 |
treatment | Feedback+Response | 1,479 | 487 | 32.93 | 2.18 | 8.68 |
treatment | Feedback+Helpful | 131 | 76 | 58.02 | 2.60 | 7.26 |
treatment | all (subtotal) | 59,004 | 17,386 | 29.47 | 0.93 | 14.32 |
control | Active users (reference) | 36,913 | 11,347 | 30.74 | 0.74 | 3.55 |
control | all (subtotal) | 36,913 | 11,347 | 30.74 | 0.74 | 3.55 |
Total | | 588,694 | 176,871 | 30.04 | 1.73 | 14.77 |
References
- ↑ The Dutch wiki, the French Wikisource, and others all have MoodBar deployed; see here and here.
- ↑ This deployment involved an enhancement of MoodBar's UI, as proposed here, and was required in order to increase the sample size for our experiment on the effect of MoodBar on user retention.
- ↑ Users registered on this day are also excluded from the analysis, because deploying the blackout code takes several hours and during this time it is not possible to ascertain whether a user had MoodBar enabled or not.
- ↑ We lack data on how many users actually saw a response but did not mark it as helpful, so the Feedback+Response group includes both users who saw the responses and users who did not.