Talk:Data dump torrents

Translate tags

If someone could fix the translate tags and explain to me how they work, that would be great, thanks. --Thibaut120094 (talk) 15:27, 6 December 2018 (UTC)

20190120 enwiki torrent is not linked

There's no link to any 2019 English-language torrent. I hate to burden Wikimedia's servers by downloading the 20190120 enwiki dump directly, but I don't know how to get it otherwise.

Where are the checksums

Shouldn't we have checksums here, or some information on where to find them? It seems like a good idea to have a pointer to this information so people can verify that they are actually getting the download they think they are getting. --Fincle (talk) 21:36, 22 April 2019 (UTC)

Yes, I think they should be included. (Also, the "dump-torrents" project on tools.wmflabs.org claims to have sha1 and md5 checksums (example), but when I click the links I get 404 errors.) PiRSquared17 (talk) 21:51, 22 April 2019 (UTC)
All checksums are available on the page for each Wikipedia release. For the English version, see https://dumps.wikimedia.org/enwiki/. You'll find the checksums below the words "Dump complete", usually a few days after the dump is actually completed. 2803:9800:9017:4C35:C900:7DEA:81E6:4F5E 20:29, 12 March 2022 (UTC)
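If it helps anyone, here is a minimal Python sketch for verifying a download against the published sha1sums file. The file names below are illustrative (adjust them to the dump you actually fetched), and it assumes the sums file uses the usual "<digest>  <filename>" layout that the dump pages publish:

import hashlib

DUMP_FILE = "enwiki-20240101-pages-articles.xml.bz2"  # illustrative name, use your file
SUMS_FILE = "enwiki-20240101-sha1sums.txt"            # published on the same dump page

# Hash the (potentially huge) dump in chunks so memory use stays constant.
sha1 = hashlib.sha1()
with open(DUMP_FILE, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        sha1.update(chunk)

# Each line of the sums file is "<hex digest>  <filename>".
with open(SUMS_FILE) as f:
    expected = {name: digest for digest, name in (line.split() for line in f if line.strip())}

if expected.get(DUMP_FILE) == sha1.hexdigest():
    print("checksum OK")
else:
    print("checksum MISMATCH, re-download the file")

The same idea works for the md5sums file with hashlib.md5.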

Tutorial anyone?

Can anyone knowledgeable please add a tutorial on how to create torrents? I'd like to download dumps and then share them here, but I don't know how to do it properly.
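Until someone writes a proper tutorial: most torrent clients have a "create torrent" dialog, and from Python the third-party torf library (pip install torf) can do it. A rough sketch from memory of torf's documented API, so double-check against its docs; the file name and tracker are just examples:

from torf import Torrent

torrent = Torrent(
    path="enwiki-20240101-pages-articles.xml.bz2",            # the dump file to share
    trackers=["udp://tracker.opentrackr.org:1337/announce"],  # example public tracker
    comment="Wikipedia dump from https://dumps.wikimedia.org/",
)
torrent.generate()  # hashes the file piece by piece; slow for multi-GB dumps
torrent.write("enwiki-20240101-pages-articles.xml.bz2.torrent")

Then open the .torrent file in your client, point it at the already-downloaded dump so it only has to verify the data, and keep seeding.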

Blocked in UK or on Sky Broadband?

Are these torrents blocked in the UK or on Sky Broadband? The servers are completely non-responsive for me. Both nicdex.com and litika.com have the same IP address as torrage.info (90.207.238.183). Why aren't these torrent files hosted on Wikipedia itself?

Entire Wikipedia Including All History?

I would like to download a copy of all of Wikipedia, preferably all languages, complete with history and deleted pages, in a way that lets me stand up a local copy. How much storage do I need to get this up and running, and where do I get the data, please? I already have a MediaWiki server, so something like "SQL including everything plus a media tarball" would be preferred. The description on https://dumps.wikimedia.org is a bit vague and suggests that the SQL dump only contains the SQL for one day. 2A02:C7C:399B:9700:77F6:96ED:55D3:23BB 22:08, 2 February 2023 (UTC)
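On the storage question: each dump directory publishes a machine-readable dumpstatus.json listing every produced file with its size, so you can total up a job before downloading anything. A hedged Python sketch; the date in the URL is illustrative, and the "metahistorybz2dump" job key (the full-history dump) is from memory of the status format, so verify it against a live dumpstatus.json:

import json
import urllib.request

# Illustrative dump date; pick an actual one from https://dumps.wikimedia.org/enwiki/
URL = "https://dumps.wikimedia.org/enwiki/20230201/dumpstatus.json"

with urllib.request.urlopen(URL) as resp:
    status = json.load(resp)

# Assumed job key for the full-history bz2 dump; check the JSON for the real names.
job = status["jobs"]["metahistorybz2dump"]
total_bytes = sum(f.get("size", 0) for f in job["files"].values())
print(f"{job['status']}: {total_bytes / 1e12:.2f} TB compressed in {len(job['files'])} files")

Expect the uncompressed full history to be many times larger than the compressed total.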

https://dump-torrents.toolforge.org/

Clicking on it, I get

No web-service

The URL you have requested, https://dump-torrents.toolforge.org/, is not currently serviced. If you have reached this page from somewhere else...

This URI is managed by the dump-torrents tool, maintained by Legoktm. That tool might not have a web interface, or it may currently be disabled. If you're pretty sure this shouldn't be an error, you may wish to notify the tool's maintainers (above) about the error and how you ended up here.

If you maintain this tool: You have not enabled a web service for your tool, or it has stopped working because of a fatal error. Please check the error logs of your web service.

Playmobil111 (talk) 09:17, 13 February 2023 (UTC)

Torrent site full of popups

That site is full of popups; I can't find the file. AlwaystheVictor (talk) 10:52, 24 July 2024 (UTC)

I've started adding links for Academic Torrents, which should be more trustworthy. Willbeason (talk) 18:17, 3 September 2024 (UTC)

Uncompressed size

Do you think it'd be possible to list the uncompressed sizes? Especially for Wikidata, as I know it's over 1 TB and I genuinely don't know if I have the space to uncompress it lol Asdfghjohnkl (talk) 22:47, 21 December 2024 (UTC)
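Until the sizes are listed, you can measure one yourself without storing the output: stream the decompression and just count bytes. A minimal Python sketch (the file name is an example; this takes as long as a real decompression but needs essentially no disk space):

import bz2

DUMP = "wikidatawiki-20241201-pages-articles.xml.bz2"  # example name, use your file

# Decompress in chunks and count the bytes instead of writing them anywhere.
total = 0
with bz2.open(DUMP, "rb") as f:
    for chunk in iter(lambda: f.read(16 * 1024 * 1024), b""):
        total += len(chunk)

print(f"uncompressed size: {total / 1e9:.1f} GB")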
