One Wikipedia Per Person
One Wikipedia Per Person (OWPP): Given currently existing technology, and technology that we can reasonably assume to be available within the next decade, how can the WMF best achieve its goal of giving every person free access to our current best summary of all human knowledge?
This article is a summary of existing ideas (original May 2009 foundation-l thread) and a testbed for new ones. You can help by adding new ideas and adding more information (such as global penetration statistics) to existing ones.
Offline Handheld Wikipedia Reader
A small device whose only purpose is to read Wikipedia. It contains the content of all Wikipedias, automatically translated from their source languages into the single target language the device is built for, under the CC-BY-SA license. There is one device model per target language.
- Similar in size to a small cell phone
- Black and white screen
- First generation has no images
- Wind up/crank power source, similar to OLPC
- Large hard drive
Pros:
- The encyclopedia has already been written and the requisite technologies already exist; the device could be assembled and given away now.
- Doesn't require internet access, which matters given that roughly 75% of the world is not online.
- No need to wait for the free market to realize the benefits of free internet access for every person.
- Automatic translation gives speakers of even obscure languages access to Wikipedia (a rough translation may not read smoothly, but it can still be read).
- It could be very cheap: producing 6.5 billion units takes full advantage of economies of scale.
- Machine translation algorithms are published, so partnering with Google is optional rather than required.
- Targeting a device at every single individual ensures that every person gets Wikipedia.

Cons:
- Requires an entire chain of collaborators: Google (or another translation provider), hardware and software vendors, perhaps world governments.
- Money: how much will it cost, and where will it come from?
- Translation technology (especially Google's) only covers a few dozen languages, whereas Wikipedia currently exists in hundreds.
  - Note, however, that sentence-aligned source/target corpora can be any size; smaller corpora yield lower quality, but the output may still be readable enough to be useful.
- Google's system isn't open source.
  - But this doesn't appear to conflict with our principles.
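A quick back-of-envelope check on the "large hard drive" requirement. The article count and per-article size below are illustrative assumptions, not figures from this page:

```python
# Rough feasibility check: can one consumer hard drive hold the text of
# all Wikipedias translated into a single target language?
# Both constants are assumptions for illustration only.

TOTAL_ARTICLES = 13_000_000   # assumed combined article count across all Wikipedias
AVG_COMPRESSED_KB = 5         # assumed compressed text size per article, in KB

total_gb = TOTAL_ARTICLES * AVG_COMPRESSED_KB / 1_000_000  # KB -> GB (decimal)
print(f"Estimated text size: {total_gb:.0f} GB")           # Estimated text size: 65 GB

# Even with generous assumptions, text-only content (no images, matching the
# first-generation spec above) fits comfortably on a small commodity drive.
```

Under these assumptions the "large hard drive" is well within what cheap hardware already offered in 2009.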
Printed Wikipedia
Select the 40,000 most important articles, print one copy for every person in a given target language, and deliver it to them.
Pros:
- Each copy will be cheap.
- An encyclopedia affords random access to information, which helps with building general knowledge of the world.
- Can take advantage of existing textbook distribution methods.
  - Well-developed supply chains and distribution routes.
- Books don't stop working just because somebody severed SAT-3/WASC.
- Less reliance on global infrastructure.

Cons:
- In some ways bad for the environment.
- Forgoes the benefits of digital information.
- The majority of Wikipedias don't have 40,000 articles; we would still have to resort to automatic translation or hired translators (not necessarily bad).
- Locales often have existing encyclopedias that may be better than Wikipedia for a given rare language.
- Delivering books can be expensive.
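To give a sense of the physical scale of the print option, here is a sketch with assumed page counts and population figures (none of these numbers come from this page):

```python
# Back-of-envelope scale of printing 40,000 articles per person.
# All constants are assumptions for illustration only.

ARTICLES = 40_000
PAGES_PER_ARTICLE = 0.5        # assumed: two condensed articles per page
COPIES = 1_000_000_000         # assumed: one copy per person in a large language group

pages_per_copy = ARTICLES * PAGES_PER_ARTICLE   # pages in one person's encyclopedia
volumes_per_copy = pages_per_copy / 1_000       # split into thousand-page volumes
total_pages = pages_per_copy * COPIES

print(f"{pages_per_copy:,.0f} pages per copy "
      f"(~{volumes_per_copy:.0f} volumes), "
      f"{total_pages:,.0f} pages printed in total")
```

Even with aggressive condensing, each copy is a multi-volume set, which is why the distribution and delivery costs above dominate the per-page printing cost.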
Wikipedia over Broadcast TV
Pros:
- Low-quality TVs are now cheap.
- A single 5 Mbps channel can broadcast roughly 54 GB per day.
- Could use multiple channels: one for the static encyclopedia, one for live updates.
- Handheld devices are too expensive by comparison.

Cons:
- Reliant on global infrastructure.
- There simply isn't enough bandwidth: you'd have to stream all of the data continuously, or broadcast only a small subset.
- TV is a broadcast medium; most televisions have no way to send a signal back.
  - Only cable TV allows two-way transmission.
- TVs with hard drives are new in the developing world and nonexistent in the least developed countries.
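The bandwidth claims above can be checked directly. The dump size below is an assumption; the per-channel rate comes from this page:

```python
# Daily capacity of one 5 Mbps broadcast channel, and how long a full
# text dump would take to cycle on a continuous loop.
# DUMP_GB is an assumed size for a compressed all-text dump.

CHANNEL_MBPS = 5
SECONDS_PER_DAY = 86_400
DUMP_GB = 50

gb_per_day = CHANNEL_MBPS * 1_000_000 * SECONDS_PER_DAY / 8 / 1e9
days_per_cycle = DUMP_GB / gb_per_day

print(f"{gb_per_day:.0f} GB/day per channel")        # 54 GB/day per channel
print(f"{days_per_cycle:.2f} days per full cycle")   # 0.93 days per full cycle
```

So a dedicated channel could loop a text-only encyclopedia roughly once a day, but a receiver that misses a segment must wait a full cycle for it to come around again, which is the "not enough bandwidth" objection in practice.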