The translation project in the fundraiser was about translating all the materials related to the fundraiser (interface messages, appeals, emails, support texts and so on) into as many languages as possible, while still maintaining quality control over the translations. A key goal of the project was to keep the threshold for participation low, so that people could contribute even if they weren't well-versed in wikisyntax.
The goals for the project were:
- Get more translators involved, including new translators, to help complete work quickly
- Make it easy for someone to sign up and find new work
- Recruit widely
- Create a process that can scale and sustain
- Make sure translations are proofread and checked for quality before they are published
- Ensure that more than 1 person looks at a translation before it is marked as done
- Incorporate more feedback opportunities into the translation cycle
This is what we tried to do in order to meet our goals:
UI & Templates
We started by rethinking the fundraiser translation hub and workflow from the perspective of a new translator trying to quickly find and complete work.
Language-focused request template
In this project we built and used a new template system for tracking translation progress. The template serves several technical purposes, but the main design decision is that there is one template per language, in which the status of all translations for that language is tracked.
The old system used one template per translation request (appeal, thank-you page, etc.) that listed the status of that page's translation in every language. It fulfilled its purpose, but it made it cumbersome for translators to find what else needed translation in their language. The new system instead centers the template on the translator's language, showing the status of every translation request for that language. One can say that a language's template is "the hub" for that language.
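The pivot from per-request to per-language tracking amounts to a simple data-model change. Here is a hypothetical Python illustration of the idea; the actual system was built from MediaWiki templates, and all names and data below are invented:

```python
# Old system: one record per translation request, with the status of that
# request in every language. (Hypothetical data; the real system used
# MediaWiki templates, not Python.)
old_system = {
    "appeal": {"de": "ready", "fr": "proofreading"},
    "thank_you_page": {"de": "missing", "fr": "ready"},
}

def per_language_view(by_request):
    """Pivot to one record per language: a translator's 'hub' view of
    everything their language still needs."""
    by_language = {}
    for request, statuses in by_request.items():
        for language, status in statuses.items():
            by_language.setdefault(language, {})[request] = status
    return by_language

new_system = per_language_view(old_system)
print(new_system["de"])  # {'appeal': 'ready', 'thank_you_page': 'missing'}
```

The pivoted view answers the question a translator actually asks ("what does my language still need?") in one lookup, instead of requiring a scan of every request's template.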
Side-by-side source and translation
To encourage translators to do more quality checking, we wanted both the source text and the translation on the same screen. Using the "editnotice" template system, we show the original English text in a collapsible box above the edit box. The box is hidden by default, but once expanded it makes it easy to compare the translation with the original, so one doesn't have to skip back and forth between tabs.
We tried to create short, simple instructions to get new translators started, and included the instructions in the translation hub, at the top of each translation page, and in emails to new translators.
Preliminary recruiting was done by emailing translators and asking them to sign up to help translate. This brought in quite a few experienced Wikimedia translators early on.
However, we wanted a much larger base of translators, so we looked for other ways to recruit. For some languages we posted messages in the village pumps, but that wasn't very effective. We ended up building a sign-up form that recorded translators' user names, languages and email addresses, which were written into a Fundraising Translators group in SugarCRM for later contact. The translator's user name was also automatically added by a bot to the list of translators on the hub, to promote teamwork among translators. The form was linked from CentralNotice banners that we turned on sequentially for selected languages. For some languages the banner was only turned on for a few hours before we felt we had enough translators; for other languages it was on for as long as two days (see our recruitment tracking for the results of some of these campaigns). The recruitment brought in over 1,000 different translators for more than 50 different languages. After filling out the form, a translator was then directed to the translation hub to start work.
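The sign-up pipeline above can be sketched roughly as follows. This is a hypothetical Python sketch; the real flow involved a web form, SugarCRM and a bot editing the hub page, and every name here is illustrative:

```python
# Hypothetical sketch of the recruitment sign-up pipeline; the real flow
# used a web form, SugarCRM and a bot editing the hub page.

crm_group = []        # the "Fundraising Translators" group in the CRM
hub_translators = {}  # per-language translator lists shown on the hub

def sign_up(user_name, languages, email):
    """Record a sign-up in the CRM and list the user on the hub."""
    crm_group.append({"user": user_name, "languages": languages, "email": email})
    for lang in languages:
        # In the real system a bot added the user name to the hub list
        # to promote teamwork among translators.
        hub_translators.setdefault(lang, []).append(user_name)
    return "translation-hub"  # the form then redirected the user to the hub

sign_up("ExampleUser", ["sv", "nb"], "user@example.org")
print(hub_translators)  # {'sv': ['ExampleUser'], 'nb': ['ExampleUser']}
```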
Secondary recruitment happened continuously via links on the translated appeals once they were live (in this case, via donation landing pages on donatewiki). If a translation existed for the user's language, a link at the bottom of the translation asked the reader to "help improve this translation," taking them to the editable translation page. If a translation didn't exist, a link asking them to "translate this page" took them to the translation hub, which guided the user through the process of creating a new translation. This type of recruitment brought in more anonymous editors to contribute translations or translation improvements, which then went through the normal quality checking process before being published.
Engaging with participants
Because we captured sign-ups in a CRM, we were able to email translators based on how much work was needed for each language. (E.g., if everything was translated for a language, we wouldn't email the translators for that language.) New sign-ups also received a welcome email with instructions on how to get started. We ran email campaigns at least once a week, and saw an increase in translations each time emails went out. Unfortunately, we found our CRM's email system to be somewhat cumbersome, so in the future we might consider a different email system that still allows dividing the group by language, tracking the campaigns, etc., but without some of the bugs we encountered.
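The per-language targeting described above boils down to a filter over the status data. A hypothetical Python sketch (statuses and names are invented for illustration):

```python
# Hypothetical sketch of per-language email targeting: only languages
# with incomplete work receive a campaign.

def languages_to_email(status_by_language):
    """Return the languages that still have untranslated or
    unproofread pages."""
    return [
        lang
        for lang, statuses in status_by_language.items()
        if any(status != "ready" for status in statuses.values())
    ]

state = {
    "de": {"appeal": "ready", "faq": "ready"},           # fully done: skip
    "hi": {"appeal": "proofreading", "faq": "missing"},  # email these translators
}
print(languages_to_email(state))  # ['hi']
```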
The original plan was to organize translators into groups by language, each coordinated by one to three coordinators. As it turned out, we didn't do that to the extent that we planned, because SugarCRM let us communicate directly with the translators for a given language. However, the coordinators were still instrumental to quality control: they kept close watch on the translations, looking out for vandalism and other harmful edits.
There were in essence two types of coordinators, both usually already experienced with translations on Meta: those we asked to be coordinators beforehand, who were specifically recruited for the role, and those who took on the role along the way, keeping an extra eye on the translations in their languages.
In past years the translations have used a status template system to mark a translation's progress (missing, in progress, needs proofreading, ready, etc.). This year, we wanted to put more emphasis on proofreading to ensure that all translations were quality checked by a second translator before being published. This was done by tracking the status templates for each language and watching the changes as they happened. If a translator changed the status directly from 'missing' to 'ready', we would often set the status back to 'proofreading' so that we could ensure that a second translator would take a look.
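The proofreading rule can be sketched as a small state machine. This is a hypothetical Python sketch; the real workflow lived in wiki status templates watched by coordinators, and the names below are invented:

```python
# Hypothetical sketch of the status workflow; the real process used wiki
# status templates watched by coordinators, not code.

VALID_NEXT = {
    "missing": {"in progress"},
    "in progress": {"proofreading"},
    "proofreading": {"ready"},
}

def apply_status_change(current, requested):
    """Return the status to record, enforcing the proofreading step."""
    if requested == "ready" and current != "proofreading":
        # A translator jumped straight to 'ready': set the page back to
        # 'proofreading' so a second translator takes a look.
        return "proofreading"
    if requested in VALID_NEXT.get(current, set()):
        return requested
    return current  # ignore other invalid jumps

print(apply_status_change("missing", "ready"))       # proofreading
print(apply_status_change("proofreading", "ready"))  # ready
```

The key property is that 'ready' is only reachable from 'proofreading', which is exactly the guarantee that a second pair of eyes has seen the text.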
We also stressed the importance of proofreading in the translation template itself, both in edit mode and in the instructions shown to translators, in emails, and on the main hub page.
When a language was almost completely translated, but had many pages that needed proofreading before being marked as 'ready', we would send an email to the translators asking them to proofread what was there.
We wanted to encourage readers of translations to tell us if they found errors, even if they weren't willing or able to fix these errors themselves. So, we repurposed the Article Feedback Extension into a translation rating tool on meta, to collect ratings on translation pages. Readers were asked to rate a translation for accuracy (faithfulness to the source text), writing style, and errors (typos, grammar, etc.). Here is an example of a Spanish translation with a few hundred ratings.
On the translations of the appeals, we put a link to the corresponding translation page on Meta. The link works so that:
- it does not show at all if the user came from an English-language wiki;
- if a translation has been published on donatewiki, it links to the corresponding translated page on metawiki (link reads "Improve this translation");
- if a translation has not been published, it links to the corresponding English-language page on metawiki (which has further instructions about how to translate it), but also gives an email address that the user can send a translation to (if they aren't comfortable with MediaWiki).
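The three-way link behaviour above can be sketched as a small routing function. A hypothetical Python sketch; the link labels match the text, but the URLs are placeholders:

```python
# Hypothetical sketch of the appeal-footer link logic; URLs are placeholders.

def translation_link(came_from_english_wiki, translation_published):
    """Decide which link, if any, to show below an appeal translation."""
    if came_from_english_wiki:
        return None  # no link at all for visitors from English-language wikis
    if translation_published:
        # Link to the translated page on metawiki for improvements.
        return ("Improve this translation", "https://meta.wikimedia.org/...")
    # Link to the English source page, which carries translation
    # instructions and an email address for non-MediaWiki users.
    return ("Translate this page", "https://meta.wikimedia.org/...")

print(translation_link(came_from_english_wiki=True, translation_published=True))
# None
```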
The link worked quite well: we got quite a few translation submissions via email, and also saw increased activity on Meta for several languages. A few languages saw a lot of vandalism (probably) stemming from those links, but that was the exception rather than the rule.
At the start of the project in July, this is what we expected success would look like:
- 3 major appeals and all core messages translated into all P1-3 languages (quantity)
- High-priority languages complete within 1 week of being made available for translation; Mid-priority languages complete within 2 weeks of being made available for translation (speed)
- Quality check complete and signed off on all P1 and P2 languages (38 total) (quality)
|  | 2011 | 2010 |
| --- | --- | --- |
| Translators active | 1676 (1038 IPs) | 361 (46 IPs) |
| Pre-launch translations completed |  |  |
| All translations completed | 878 | (no data) |
| All translations started | 923 | 421 |
| Average contributing translators |  |  |
| Other text translated | 10 | 8 |
|  | Alan | Brandon | Jimmy 2 | Jimmy 3 | Kaldari | Susan | GorillaWarfare | Karthik | Jimmy 4 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| P1 translations (12 lang.) | 11 | 12 | 12 | 10 | 10 | 10 | 5 | 8 | 5 |
| P2 translations (15 lang.) | 13 | 15 | 14 | 11 | 10 | 10 | 6 | 7 | 4 |
| P3 translations (18 lang.) | 18 | 17 | 18 | 15 | 15 | 15 | 6 | 8 | 5 |
| All other translations (43 total) | 11 | 13 | 29 | 18 | 11 | 8 | 2 | 4 | 5 |
| Total P1–P3 translations | 42 | 44 | 44 | 36 | 35 | 35 | 17 | 23 | 14 |
| Total all translations | 53 | 57 | 73 | 54 | 46 | 43 | 19 | 27 | 19 |
|  | Banners and LPs | Banners 2 | Thank you mail | Thank you page | FAQ | Jimmy mail | Problems donating | Recurring giving | Sue thank you |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| P1 translations (12 lang.) | 10 | 7 | 9 | 8 | 5 | 4 | 5 | 3 | 4 |
| P2 translations (15 lang.) | 14 | 8 | 13 | 11 | 5 | 5 | 5 | 3 | 2 |
| P3 translations (19 lang.) | 18 | 11 | 18 | 14 | 6 | 5 | 5 | 3 | 3 |
| All other translations (43 total) | 31 | 12 | 9 | 8 | 3 | 4 | 2 | 1 | 2 |
| Total P1–P3 translations | 42 | 26 | 40 | 33 | 16 | 14 | 15 | 9 | 9 |
| Total all translations | 73 | 38 | 49 | 41 | 19 | 18 | 17 | 10 | 11 |
| Language | Contributors | Days to complete | Quality rating (accurate / well-written / error-free) | Total quality raters | Iterations (improvements) published |
| --- | --- | --- | --- | --- | --- |
| Arabic | 42 | 1 | 3.7 / 3.6 / 3.3 | 110 | 5 |
| German | 19 | 3 | 4.0 / 3.9 / 4.0 | 56 | 3 |
| Spanish | 42 | 6 | 3.6 / 3.6 / 3.6 | 283 | 2 |
| French | 24 | 2 | 4.0 / 3.7 / 3.8 | 55 | 2 |
| Hindi | 4 | 17 | 4.3 / 4.3 / 4.0 | 3 | 2 |
| Italian | 33 | 13 | 3.8 / 3.8 / 3.7 | 277 | 7 |
| Japanese | 17 | 17 | 3.4 / 3.4 / 3.8 | 60 | 2 |
| Dutch | 34 | 14 | 3.9 / 3.7 / 3.9 | 85 | 2 |
| Portuguese (Portugal) | 42 | 6 | 3.7 / 3.6 / 3.5 | 195 | 2 |
| Portuguese (Brazil) | 9 | 9 | 3.4 / 2.8 / 3.6 | 7 | 3 |
Our quantity target was to have 3 major appeals and all core messages translated into all P1-3 languages. We fell just slightly short of this mark, missing 1 translation for the Jimmy 002 appeal, 1 translation for the Brandon appeal, and 3 translations for the Alan appeal. However, in addition to these three appeals, we had an additional six appeals for translation, and a total of 290 translations were completed for all appeals in all 55 P1-3 languages. We also completed 204 translations for 10 supporting texts in these languages. If we had focused only on 3 main appeals and 1 batch of supporting texts, we probably could have hit this target more easily, but during the course of the project we thought it was more important to translate as many appeals as possible.
Our target for speed was to have high-priority languages complete within 1 week of being made available for translation. During the course of the project, we found that breadth, quantity, and quality checking were most important to focus on, which meant that speed was often sacrificed in the process. Still, looking at the snapshot of the main Jimmy appeal for our top 10 languages, you can see that five out of the ten were translated within seven days, and only two out of the ten took longer than 14 days to be ready.
Our quality target was to have quality checks complete and signed off on all P1 and P2 languages. Because nothing was considered done until it had gone through a quality check, these numbers are the same as noted above for quantity – 494 translations were quality checked for P1-P3 languages. There were an additional 33 translations for these languages that did not get published because proofreading was not completed.
Languages that weren't specifically targeted as priority still had a total of 320 completed translations for appeals and support texts. Overall, we completed 878 translations for the 2011 fundraiser, which is more than twice as many translations as were even started in 2010.
- We don't have any hard numbers to back this up, but I feel it's not a stretch to assume that much of the activity from unregistered users came via that link.