Research:Data
There is a great deal of publicly available, openly licensed data about Wikimedia projects. This page is intended to help community members, developers, and researchers who are interested in analyzing raw data learn what data and infrastructure is available.
If you have any questions, you might find the answer in the Frequently Asked Questions about Data. If you still have questions, you can email your question to the Analytics mailing list (more information).
If you wish to browse pre-computed metrics and dashboards, see statistics.
If this publicly available data isn't sufficient, you can look at the page on private data access to see what non-public data exists and how you can gain access.
See also inspirational example uses.
Also consider searching for datasets at Zenodo, Figshare, Dimensions.ai, Google Dataset Search, Academic Torrents, DataHub (historical) or Hugging Face (see also a curated "Wikimedia Datasets" list on Hugging Face).
Quick glance
By access method

Data dumps
Dumps of all WMF projects for backup, offline use, research, etc.
- Wiki content, revisions, metadata, and page-to-page and outside links
- XML and SQL format
- Generated once or twice a month
- Large file sizes
- The dumps.wikimedia.org domain also hosts other data

APIs
The MediaWiki API provides direct, high-level access to the data contained in MediaWiki databases over the web.
- Meta info about the wiki and the logged-in user, properties of pages (revisions, content, etc.), and lists of pages based on criteria
- JSON, XML, and PHP's native serialization format
- The Wikimedia REST API provides page content in various formats.
- The Wikimedia Analytics API provides pageviews and aggregate edit stats.

Data Services (Wiki Replicas)
Data Services allows Wikimedia Cloud Services users to query a sanitized copy of the Wikimedia MediaWiki databases.

Analytics Datasets
Raw pageviews, unique device estimates, mediacounts, etc.
- Delimited, usually: Project, (Page title,) Count
- Aggregated hourly or daily
- pageviews complete – mediacounts – unique devices

Wikistats
Reports based on data dumps and server log files.
- Unique visits, page views, active editors and more
- Intermediate CSV files available
- Graphical presentation

DBpedia
DBpedia extracts structured data from Wikipedia. It allows users to run complex queries and link Wikipedia data to other data sets.
- RDF, N-Triples, SPARQL endpoint, Linked Data
- Billions of triples of info in a consistent ontology

DataHub
A collection of various Wikimedia-related datasets.
- Smaller (usually one-time) surveys/studies
- DBpedia-Live and others
- Figshare (datasets tagged 'wikipedia')

Differential privacy
A collection of differentially-private datasets, released daily, weekly, or monthly.
- Pageview data
- Editor/edit data
- CentralNotice data
- Search data
By data domain
The table below is a quick reference of data sources organized by data domain. For a more detailed overview of Wikimedia data domains and how to access data in each domain, use the links in the table or see Research:Data introduction.
Data dumps
WMF releases data dumps of Wikipedia, Wikidata, and all WMF projects on a regular basis, as well as dumps of other Wikimedia-related data such as search indices and short URL mappings.
Content
editXML/SQL dumps
- Text of current and/or all revisions of all pages, in XML format (schema)
- Metadata for current and/or all revisions of all pages, in XML format (schema)
- Most database tables as SQL files:
  - Page-to-page link lists (pagelinks, categorylinks, imagelinks, templatelinks tables)
  - Lists of pages with links outside of the project (externallinks, iwlinks, langlinks tables)
  - Media metadata (image, oldimage tables)
  - Info about each page (page, page_props, page_restrictions tables)
  - Titles of all pages in the main namespace, i.e. all articles (*-all-titles-in-ns0.gz)
  - List of all pages that are redirects and their targets (redirect table)
  - Log data, including blocks, protection, deletion, uploads (logging table)
  - Misc bits (interwiki, site_stats, user_groups tables)
- Stub-prefixed dumps for some projects, which only have header info for pages and revisions without actual content
See a more comprehensive list of what is available for download.
Other dumps
Dumps.wikimedia.org offers various other database dumps and datasets, including
- Adds/changes dumps (includes no moves or deletes, plus some other limitations) (documentation)
- Wikidata entity dumps – see Wikidata:Data access for more information
- Various analytics datasets (described below)
Download
You can download the latest dumps for the last year (dumps.wikimedia.org/enwiki/ for English Wikipedia, dumps.wikimedia.org/dewiki/ for German Wikipedia, etc.). Download mirrors offer an alternative to the download page.
Due to large file sizes, using a download tool is recommended.
There are also archives. Many older dumps can also be found at the Internet Archive.
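For example, here is a minimal Python sketch that streams a dump file to disk in chunks rather than holding it in memory; the enwiki pages-articles file name is only an example, so check the dump index page for the file you actually need.

# Sketch: stream-download a large dump file so it is never loaded into memory at once.
# The file name varies by wiki and dump date; verify it against the dump index first.
import requests

url = "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"

with requests.get(url, stream=True, timeout=60) as response:
    response.raise_for_status()
    with open("enwiki-latest-pages-articles.xml.bz2", "wb") as out:
        for chunk in response.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
            out.write(chunk)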
Data format
XML dumps are in the wrapper format described at Export format (schema). Files are compressed in gzip (.gz), bzip2/lbzip2 (.bz2), and 7-Zip (.7z) formats.
SQL dumps are provided as dumps of entire tables, using mysqldump.
Some older dumps exist in various formats.
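Because the files are large, the XML wrapper is usually read with a streaming parser. The sketch below assumes a pages-articles .bz2 dump and ignores the exact export-schema namespace version, which differs between dump runs.

# Sketch: iterate over pages in a compressed XML dump without loading the whole file.
import bz2
import xml.etree.ElementTree as ET

def iter_pages(path):
    with bz2.open(path, "rb") as f:
        for _, elem in ET.iterparse(f):
            tag = elem.tag.rsplit("}", 1)[-1]  # drop the export-schema namespace
            if tag == "page":
                title = elem.findtext("./{*}title")
                text = elem.findtext("./{*}revision/{*}text") or ""
                yield title, text
                elem.clear()  # free memory as we go

for title, text in iter_pages("enwiki-latest-pages-articles.xml.bz2"):
    print(title, len(text))
    break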
How to and examples
See examples of importing dumps into a MySQL database, with step-by-step instructions.
Existing tools
Some tools are listed on the following pages, but these tools are mostly outdated and non-functional:
License
All text content is multi-licensed under the Creative Commons Attribution-ShareAlike 3.0 License (CC-BY-SA) and the GNU Free Documentation License (GFDL). Images and other files are available under different terms, as detailed on their description pages.
Support
- Mailing list: xmldatadumps-l
- Bug reports: Dumps Generation project in Phabricator
- Design work on Dumps 2.0 replacement: Dumps Rewrite project in Phabricator
MediaWiki API
The MediaWiki API provides direct, high-level access to the data contained in MediaWiki databases. Client programs can log in to a wiki, get data, and post changes automatically by making HTTP requests.
Content
- Meta information about the wiki and the logged-in user
- Properties of pages, including page revisions and content, external links, categories, templates, etc.
- Lists of pages that match certain criteria
- See the full list of available information
- See also additional information for Wikidata and Wikidata's SPARQL query endpoint
Endpoint
To query the database, you send an HTTP GET request to the desired endpoint (for example, https://en.wikipedia.org/w/api.php for English Wikipedia), setting the action parameter to query and defining the query details in the URL.
How to and examples
- API Tutorial
- Example: https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&format=xml&titles=Main%20Page fetches (action=query) the content (rvprop=content) of the most recent revision of Main Page (titles=Main%20Page) of English Wikipedia (https://en.wikipedia.org/w/api.php?) in XML format (format=xml). You can paste the URL in a browser to see the output. A Python version of the same query is sketched below.
- More examples
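The same query can be issued programmatically. Here is a minimal sketch using Python's requests library and JSON output instead of XML; the rvslots and formatversion parameters simply select the newer response layout.

# Sketch: fetch the wikitext of the most recent revision of "Main Page" as JSON.
import requests

params = {
    "action": "query",
    "prop": "revisions",
    "rvprop": "content",
    "rvslots": "main",
    "titles": "Main Page",
    "format": "json",
    "formatversion": "2",
}
r = requests.get("https://en.wikipedia.org/w/api.php", params=params, timeout=30)
r.raise_for_status()
page = r.json()["query"]["pages"][0]
wikitext = page["revisions"][0]["slots"]["main"]["content"]
print(wikitext[:200])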
Existing tools
To try out the API interactively on English Wikipedia, use the API Sandbox.
Access
To use the API, your application or client might need to log in.
Before you start, learn about the API etiquette.
Researchers may be granted special access rights on a case-by-case basis.
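As an illustration of the etiquette basics, the sketch below sets a descriptive User-Agent header and the maxlag parameter; the tool name and contact address are placeholders you should replace with your own.

# Sketch: identify your client and back off politely when the databases are lagged.
import requests

session = requests.Session()
session.headers["User-Agent"] = "MyResearchTool/0.1 (contact: researcher@example.org)"

params = {
    "action": "query",
    "meta": "siteinfo",
    "maxlag": 5,        # ask the API to refuse the request if replication lag exceeds 5 s
    "format": "json",
}
r = session.get("https://en.wikipedia.org/w/api.php", params=params, timeout=30)
print(r.json())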
License
All text content is multi-licensed under the Creative Commons Attribution-ShareAlike 3.0 License (CC-BY-SA) and the GNU Free Documentation License (GFDL).
Support
- Frequently asked questions (FAQ)
- mediawiki-api mailing list
Wiki Replicas
The Wiki Replicas (part of the WMCS Data Services; see wikitech:Portal:Data Services) host sanitized versions of the Wikimedia production MediaWiki databases.
Content
Users of various Wikimedia Cloud Services products can access the Wiki Replicas databases, which host sanitized copies of the databases of all Wikimedia projects, including Commons.
Data format
Explore the database schema of the MediaWiki software.
How to
See the Wiki Replicas page on Wikitech for how to access the Wiki Replicas.
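As a rough sketch of what a query looks like once access is set up, assuming a Toolforge or Cloud VPS environment with the standard replica.my.cnf credentials file; the host and database names follow the <wiki>.analytics.db.svc.wikimedia.cloud and <wiki>_p conventions described on the Wiki Replicas help page, so verify them there.

# Sketch: list a few article titles from the English Wikipedia replica.
import os
import pymysql

conn = pymysql.connect(
    host="enwiki.analytics.db.svc.wikimedia.cloud",          # assumed host naming
    database="enwiki_p",                                      # assumed database naming
    read_default_file=os.path.expanduser("~/replica.my.cnf"), # Toolforge-provided credentials
)
with conn.cursor() as cur:
    cur.execute("SELECT page_title FROM page WHERE page_namespace = 0 LIMIT 5")
    for (title,) in cur.fetchall():
        print(title.decode("utf-8"))  # titles are stored as binary strings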
Support
See wikitech:Help:Cloud Services introduction#Communication and support
Recent changes stream
See EventStreams to subscribe to Recent changes on all Wikimedia wikis. This broadcasts edits and other changes as they happen.
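A minimal sketch of consuming the stream over plain HTTP, assuming the stream.wikimedia.org recentchange endpoint and the standard Server-Sent Events framing, where each "data:" line carries one JSON-encoded change event.

# Sketch: print the wiki and page title of each change as it happens.
import json
import requests

url = "https://stream.wikimedia.org/v2/stream/recentchange"

with requests.get(url, stream=True, timeout=60) as response:
    for line in response.iter_lines():
        if line.startswith(b"data: "):
            change = json.loads(line[len(b"data: "):])
            print(change["wiki"], change["title"])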
Existing tools
Analytics Datasets
Analytics Datasets on dumps.wikimedia.org offers stable and continuous datasets about web request statistics (including page views, mediacounts, unique devices), page revision history, data by country, and Wikidata QRanks.
Pageview statistics
Pageview statistics are one such dataset. Each request for a page reaches one of Wikimedia's Varnish caching hosts; the project name and the title of the requested page are logged and aggregated hourly.
Files whose names start with "project" contain total hits per project per hour.
Per-country pageviews data is also available, sanitized for privacy reasons. See this announcement post (June 2023).
See the README for details on the format.
You can interactively browse the page view statistics at https://pageviews.toolforge.org. More documentation on the Pageviews Analysis tool is available.
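The same per-article counts can also be fetched from the Analytics REST API, which is what the Pageviews Analysis tool queries. A sketch, where the article name, date range, and User-Agent string are placeholders:

# Sketch: daily page views for one English Wikipedia article in January 2024.
import requests

article = "Ada_Lovelace"
url = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    f"en.wikipedia/all-access/all-agents/{article}/daily/20240101/20240131"
)
headers = {"User-Agent": "MyResearchTool/0.1 (contact: researcher@example.org)"}
r = requests.get(url, headers=headers, timeout=30)
r.raise_for_status()
for item in r.json()["items"]:
    print(item["timestamp"], item["views"])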
Clickstream data
The Wikipedia clickstream dataset contains counts of (referrer, resource) pairs extracted from the request logs of Wikipedia.
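A sketch of loading one monthly clickstream file with pandas, assuming the published TSV layout of prev, curr, type, and n columns; check the release directory on dumps.wikimedia.org for the current file names.

# Sketch: load a downloaded clickstream file and inspect traffic to one article.
import pandas as pd

df = pd.read_csv(
    "clickstream-enwiki-2024-01.tsv.gz",   # example file name, downloaded beforehand
    sep="\t",
    names=["prev", "curr", "type", "n"],
    header=None,
)
# Most common sources of traffic to a given article
print(df[df["curr"] == "Ada_Lovelace"].nlargest(10, "n"))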
Geoeditors
editThe public "Geoeditors" dataset contains information about the monthly number of active editors from a particular country on a particular Wikipedia language edition (bucketed and redacted for privacy reasons). For some earlier years, similar data is available at [1]/[2], see also Edits by project and country of origin.
Misc datasets
Additional datasets (mostly irregular or discontinued ones) are published at https://analytics.wikimedia.org/datasets/. These include Caching research data and the AS Performance Report.
WikiStats
Wikistats is an informal but widely recognized name for a set of reports which provide monthly trend information for all Wikimedia projects and wikis.
Content
Wikistats provides many dashboards that display trends about reading, contributing, and content, broken down by project, such as:
- unique visitors
- page views (overall and mobile only)
- editor activity
- article count
Data format
Data is presented as charts with the option to download the underlying data.
Support
For more details on Wikistats, see wikitech:Data Platform/Systems/Wikistats 2.
DBpedia
DBpedia.org is a community effort to extract structured information from Wikipedia and to make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link other datasets on the Web to Wikipedia data.
Content
The English version of the DBpedia knowledge base describes millions of things, and the majority of items are classified in a consistent ontology (persons, places, creative works like music albums, films and video games, organizations like companies and educational institutions, species, diseases, etc.). Localized versions of DBpedia in more than a hundred languages describe millions of things.
The data set also features:
- about 2 billion pieces of information (RDF triples)
- labels and abstracts for >10 million unique things in up to 111 different languages
- millions of links to images, links to external web pages, data links into external RDF datasets, links to Wikipedia categories, YAGO categories
- https://www.dbpedia.org/resources/ has download links for all the data sets, in different formats and languages.
Data format
- RDF/XML
- Turtle
- N-Triples
- SPARQL endpoint
Access
- https://dbpedia.org/sparql is DBpedia's SPARQL endpoint.
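A minimal sketch of querying the endpoint from Python and reading the JSON results; the example query and prefixes are illustrative.

# Sketch: run a small SPARQL query against the public DBpedia endpoint.
import requests

query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?person ?birthDate WHERE {
  ?person a dbo:Person ;
          dbo:birthPlace dbr:Berlin ;
          dbo:birthDate ?birthDate .
} LIMIT 5
"""
r = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": query, "format": "application/sparql-results+json"},
    timeout=30,
)
for row in r.json()["results"]["bindings"]:
    print(row["person"]["value"], row["birthDate"]["value"])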
License
- DBpedia data from version 3.4 on is licensed under the terms of the Creative Commons Attribution-ShareAlike 3.0 License and the GNU Free Documentation License.
Support
DataHub
The Wikimedia organization on the Open Knowledge Foundation's DataHub was established by the Wikimedia Foundation around 2013, and contains a collection of datasets about Wikipedia and other projects which mostly date from around 2013-2016.
Wikivoyage also maintains data on its own DataHub:
- Hotels/restaurants/attractions data as CSV/OSM/OBF
- Tourism guide for offline use
Differential privacy
The WMF privacy engineering team uses differential privacy to release data that would otherwise be too sensitive to publish. This data currently includes only pageview statistics; in the future, it will include statistics about editors, CentralNotice impressions and views, search, and more.
Content
- Pageview data (currently only available as daily TSVs)
Data format
Differentially-private data is currently available in static TSV form at https://analytics.wikimedia.org/published/datasets/. Work to make this data available via API is ongoing.
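A sketch of loading one of the published TSVs with pandas; the file name below is hypothetical, so browse the directory listing for the actual paths and layouts of the differentially-private releases.

# Sketch: read one published TSV directly over HTTP and preview it.
import pandas as pd

url = "https://analytics.wikimedia.org/published/datasets/EXAMPLE-daily-pageviews.tsv"  # hypothetical path
df = pd.read_csv(url, sep="\t")
print(df.head())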
License
Differentially-private data and code are available under a Creative Commons Zero license.