Mirroring Wikimedia project XML dumps

This page coordinates efforts to mirror Wikimedia project XML dumps around the globe on independent servers, similar to GNU/Linux ISO mirror sites. See the list of current mirrors below.

Requirements

Space

We require 25.1 TB for the 5 most recent dumps (most desired option). This would be 3 sets of full dumps and 2 sets of partial dumps. This is based on estimates from December 2020.

Alternative options:

  • "most recent good dumps": 8 TB (July 2022 estimate). This would be one set of full dumps.
  • "last 2 good dumps": 11 TB (July 2022 estimate). This would be one set of full dumps and one set of partial dumps.
  • "All dumps and other data": ~ 75 TB and growing (as of July 2022).

Additional options:

  • "Historical archives": 1.6T now (October 2017). This consists of 2 dumps per year from 2002 through 2010. Not expected to change or grow.
  • "Other": 31 TB (Dec 2020). Pageview analytics, CirrusSearch indexes, Wikidata entity dumps and other datasets.

Bandwidth

Wikimedia provides about 4-5 MB/s via dumps.wikimedia.org for XML dumps, as of January 2023.

Current mirrors

The mirrors below are listed with the contents they carry, their location, and any notes.

  • Your.org (Illinois, United States): all public data. The media files in this mirror may be outdated, so use with care; check the last-modified dates. Media tarballs were last updated in March 2013 (as of March 2022).
  • Internet Archive (California, United States): all public data, updated semi-manually. Instructions for finding old Wikidata entity dumps (RDF and JSON) can be found at Wikidata:Database download. See "Notes for the wikimediacommons collection" below.
  • Bytemark (York, United Kingdom): last 5 good XML dumps. Last updated 2022-08 (as of 2023-03-07).
  • C3SL (Paraná, Brazil): last 5 good XML dumps.
  • BringYour (California, United States): last 5 good XML dumps.
  • Individual hoster (Colorado, United States): last 5 good XML dumps. Defunct as of March 2021.
  • Individual hoster (United States): last 5 good XML dumps.
  • PDApps (Moscow, Russia): last 4 good XML dumps; Wikidata entity dumps.
  • Academic Computer Club, Umeå University (Umeå, Sweden): last 2 good XML dumps; 'other' datasets.
  • Scatter (Saxony, Germany): RDF dumps of Wikidata, Wikimedia Commons structured data, and Categories.
  • Center for Research Computing, University of Notre Dame (Indiana, United States): Wikidata entity dumps, pageview and other stats, Picture of the Year tarballs, Kiwix openzim files, and other datasets. Access to this mirror is restricted to institutions with access to Internet2/ESnet/Geant; those with access will have high-bandwidth downloads.

Notes for the wikimediacommons collection
  • All the Commons uploads (and their description pages in XML export format) of each day since 2004, one zip file per day, one item per month. A text file listing various errors is available for each month, as well as a CSV file with metadata about every file of each day.
  • The archives are made by WikiTeam and are meant to be static; an embargo of about 6 months is observed, so that only months which have been mostly cleaned up get uploaded. Archives up to early 2013 were uploaded in August-October 2013, so they reflect the status at that time. After logging in, you can see a table with details about all items.
  • See Downloading in bulk using wget for the official HTTP download instructions (a rough wget sketch follows this list). Downloading via torrent, however, is supposed to be faster and is highly recommended (you need a client that supports webseeding in order to download from archive.org's 3 webseeds): there is one torrent per item and an (outdated) torrent file to download all torrent files at once.
  • Please join our distributed effort, download and reseed one torrent.
  • Individual images can be downloaded as well thanks to the on-the-fly unzipper, by looking for the specific filename in the specific zip file, e.g. [1] for File:Quail1.PNG.
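
As a rough illustration of the wget approach (the flags and the itemlist.txt file are assumptions based on the general archive.org bulk-download pattern, not a copy of the official instructions):

  # Hypothetical sketch: itemlist.txt holds one archive.org item identifier per line.
  # -nc skips files already present, so the command can be re-run to resume;
  # -np and -l1 keep the recursion limited to each item's download directory.
  wget -r -H -nc -np -nH --cut-dirs=1 -e robots=off -l1 \
       -i itemlist.txt -B 'https://archive.org/download/'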

BitTorrent

For an unofficial listing of torrents, see data dump torrents.

Potential mirrors

If you are a hosting organization and want to volunteer, please send email to ops-dumps@wikimedia.org with "XML dumps mirror" somewhere in the subject line.

Based on your space and bandwidth restrictions, decide how many dumps you want to mirror, and whether you want to mirror, in addition or instead, the archives (pre-2009 dumps) and/or the "other" datasets; let us know in the email. We'll need the hostname for our rsync config, the name for the IPv6 address if there is a separate name (or, if there is no IPv6 connectivity, a note to that effect), and a contact email address.

Once your information is added to our rsync config, you'll be able to pick up the desired directories and files from the appropriate rsync module (an example command follows the list):

  • dumpslastone -- last complete good dump for each wiki as well as completed files from any run that is in progress
  • dumpslasttwo -- last two complete runs etc
  • dumpslastthree -- last three complete runs etc
  • dumpslastfour -- last four complete runs etc
  • dumpslastfive -- last five complete runs etc
  • dumpmirrorsother -- 'other' datasets (as seen at [2])
  • dumpmirrorsalldumps -- all dumps but no archives and no 'other' datasets
  • dumpmirrorseverything -- absolutely everything
  • dumpmirrorseverything/archives -- just the archival dumps of historical interest
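
For example, a single manual pull of one module might look like this (the rsync server name and the local destination path are placeholders; use the host and path agreed with the dumps maintainers):

  # Illustrative only: fetch the last complete good dump for each wiki.
  # --delete removes local files that have been rotated out upstream.
  rsync -av --delete rsync://dumps.wikimedia.org/dumpslastone/ /srv/mirrors/wikimedia-dumps/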

We recommend a daily cron job for this.
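
A minimal sketch of such a cron job, assuming a dedicated "mirror" user and the placeholder host and paths from the example above:

  # /etc/cron.d/wikimedia-dumps-mirror (illustrative; adjust module, host, and paths)
  # Sync once a day at 02:30 and keep a log of each run.
  30 2 * * * mirror rsync -av --delete rsync://dumps.wikimedia.org/dumpslastfive/ /srv/mirrors/wikimedia-dumps/ >> /var/log/dumps-mirror.log 2>&1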

If you are brainstorming organizations that might be interested, see the discussion page.

See also

External links