Community Wishlist Survey 2017/Wikidata

Wikidata
29 proposals, 231 contributors



Integrate Pattypan and Quick Statements

 
Mapping Commons Artwork template to Wikidata painting item
  • Problem: Currently, GLAM organizers face some hurdles when trying to submit simultaneously to Wikimedia Commons and Wikidata. This is particularly true for items that should uncontroversially be on Wikidata, such as paintings (see WikiProject Sum of All Paintings). Pattypan is a tool for mass upload to Commons that relies on a spreadsheet; Quick Statements is a tool for mass upload to Wikidata, whose commands can be generated from a spreadsheet. These tools share a common goal (easing mass uploads), yet do not work well together, and may require mass uploaders to do double work, using one and then the other, wasting precious time on work that could be done simultaneously.
  • Who would benefit: Massive uploaders, from Wikidata and Wikimedia Commons; institutional uploaders.
  • Proposed solution: The integration of Pattypan and Quick Statements seems like a small technical step, though it would certainly require the creation of new columns for the parameters one needs to fill in when working with Wikidata. As a pilot, I would recommend strictly limiting the integration to paintings. I understand GLAMpipe has been proposed as a solution for integrating Commons and Wikidata uploads, but I believe this integration can also be set up for Pattypan.
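As a rough illustration of what such an integration could produce, the sketch below turns rows of a Pattypan-style spreadsheet into QuickStatements (version 1) commands. The column names are invented for the example, not Pattypan's actual schema; the property and item IDs (P31 instance of, Q3305213 painting, P170 creator, P571 inception) are real.

```python
import csv
import io

# Hypothetical spreadsheet: one row per painting. Column names are
# illustrative only, not Pattypan's real template mapping.
sheet = 'title\tcreator_qid\tinception\nThe Milkmaid\tQ41264\t1658\n'

def rows_to_quickstatements(tsv_text):
    """Turn spreadsheet rows into QuickStatements (V1) commands.

    For each painting: create a new item, set an English label,
    add instance of (P31) painting (Q3305213), creator (P170),
    and inception (P571) with year precision (/9).
    """
    commands = []
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        commands.append("CREATE")
        commands.append(f'LAST\tLen\t"{row["title"]}"')
        commands.append("LAST\tP31\tQ3305213")                 # instance of: painting
        commands.append(f'LAST\tP170\t{row["creator_qid"]}')   # creator
        commands.append(f'LAST\tP571\t+{row["inception"]}-00-00T00:00:00Z/9')
    return "\n".join(commands)

print(rows_to_quickstatements(sheet))
```

The same spreadsheet could then feed Pattypan for the Commons side, which is the whole point of the proposal.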

Discussion

  • Endorse. This has been an issue since we started curating artworks on Wikidata. It is complicated by the ever-growing backlog of images of artworks on Wikimedia Commons that were uploaded with the default uploader, which includes an information template, but not an artwork or "art photo" template. Jane023 (talk) 11:11, 15 November 2017 (UTC)[reply]
    @Jane023: Hi, I wanted to let you know I've changed the {{support}} vote you added to "Endorse". The voting phase begins on November 27. Until then we don't want to confuse people into thinking votes will amount to anything. Sorry about that, and thank you for participating in the survey! MusikAnimal (WMF) (talk) 16:51, 15 November 2017 (UTC)[reply]
  • Endorse. I have been working with Quick Statements recently, and the lack of integration with the mass Commons uploading tools has turned an already laborious job into a three-or-more-step laborious job. Ederporto (talk) 16:57, 16 November 2017 (UTC)[reply]
  • Endorse, as an author of Pattypan :). This idea is close to my heart as an avid Wikidata editor. Just to let you know, one of the things that is planned is early support for Commons' Structured Data. Yarl (talk) 19:32, 17 November 2017 (UTC)[reply]

Voting

Stable data

  • Problem: It is my firm opinion that the control of incoming data, and of changes to the data stored in Wikidata, does not work as well as it should. A tiny edit on one of Wikidata's ~40 million items can cause wrong information to appear on many thousands of pages in more than a hundred other projects, even those which chose to have flagged revisions as part of their quality control. There are many cases where the threat of undetected vandalism is significantly higher than the possibility that the information actually needs to be changed (for instance: Chiapas is located in Mexico), and some information is permanently fixed (such as population numbers from a census on a specific date). Allowing such stable data to actually be stabilized, and only changed under circumstances yet to be defined, would not only increase Wikidata's reputation as a trustworthy database but also increase its usage, while lessening its vulnerability.
  • Who would benefit: Wikidata as a whole (reputation, usage), Wikidata volunteers, other projects' volunteers (less need to focus constantly on changes to Wikidata items), readers (don't get wrong information)
  • Proposed solution: Develop some kind of flagging single data on Wikidata (i.e. not the whole revision, but a specific statement)
  • More comments:
  • Phabricator tickets:

Discussion

Voting

Integrate Citoid Fully into Wikidata

  • Problem: One of the biggest challenges to trust in Wikidata among existing community members in bigger language communities, like English, is lack of trust in the sources; without full sourcing support, we can't provide the high-quality references Wikipedians are used to.
  • Who would benefit: Reusers of Wikidata, and the Wikidata community.
  • Proposed solution: Integrate Citoid into the tooling for Wikidata. This should not be too hard, because many of the source types have already been modeled in Wikidata.
  • More comments: I recognize that this may be on the development pipeline of the Wikidata team, but it appears to be competing with a number of other priorities, and most of the expertise for Citoid is at the Wikimedia Foundation.
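Citoid already exposes a REST endpoint on Wikimedia wikis (`/api/rest_v1/data/citation/{format}/{query}`). Below is a minimal sketch of how a Wikidata tool might call it and map the result onto properties. The sample response and the property mapping are illustrative assumptions for this proposal, not the Wikidata team's design; the property IDs used (P1476 title, P356 DOI, Q13442814 scholarly article) are real.

```python
from urllib.parse import quote

def citoid_url(query, fmt="mediawiki", wiki="https://en.wikipedia.org"):
    """Build the Citoid REST URL for a DOI, ISBN, URL, or free-text query."""
    return f"{wiki}/api/rest_v1/data/citation/{fmt}/{quote(query, safe='')}"

# A response shaped after Citoid's "mediawiki" format (illustrative only).
sample = [{"itemType": "journalArticle",
           "title": "An example article",
           "DOI": "10.1000/xyz123"}]

def to_wikidata_sketch(citation):
    """Map one Citoid citation onto Wikidata property IDs (a sketch):
    P1476 title, P356 DOI, and P31 via a small item-type mapping."""
    type_map = {"journalArticle": "Q13442814"}  # scholarly article
    claims = {"P31": type_map.get(citation["itemType"]),
              "P1476": citation["title"],
              "P356": citation.get("DOI")}
    return {p: v for p, v in claims.items() if v}

print(citoid_url("10.1000/xyz123"))
print(to_wikidata_sketch(sample[0]))
```

A real integration would also resolve authors and publishers to items, which is exactly the hard part PKM describes in the discussion below.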

Discussion

  • @Mvolz: You may be interested in this proposal. Whatamidoing (WMF) (talk) 18:13, 7 November 2017 (UTC)[reply]
  • I really like this idea, seems super useful —TheDJ (talkcontribs) 11:51, 8 November 2017 (UTC)[reply]
  • Yes, it will also improve number of fully written references.--Nizil Shah (talk) 12:25, 14 November 2017 (UTC)[reply]
  • This is done in part through a Zotero translator that can create commands for QuickStatements (Zotero is the engine that drives Citoid); see d:Wikidata:Zotero. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:58, 15 November 2017 (UTC)[reply]
  • Yes, please help. Adding sources is slow and tedious and a real impediment to improving sourcing on Wikidata. StarryGrandma (talk) 03:31, 16 November 2017 (UTC)[reply]
  • I use the Zotero tool, but the hard part for books is still that the author and publisher should be Q items, not strings, and we need to create both a “work” and an “edition” of that work before we can use it as a reference. A template interface that accepts an ISBN, looks for a match, and if it doesn’t find one creates both work and edition, interlinked, with tools for searching for the author and publisher in Wikidata, would be ideal. It would also need to detect when the work exists but not the specific edition indicated by the ISBN. I appreciate that this use case is hugely complicated. - PKM (talk) 04:47, 18 November 2017 (UTC)[reply]
  • Here's another approach, if what we want is to make it easy for editors to add references to books. (I don't think this is original but I have no idea where I saw it.) Add a new property "reference ISBN". It works like <reference URL> - user enters just the ISBN and qualifies with page number, section, etc. as needed. ISBN is clickable (to OCLC or some other source with book data). Companion bot follows behind - if the ISBN matches an Item, it adds the appropriate <stated in> reference. If the ISBN doesn't exist, bot looks for a match on author and title and creates an edition if found. If no match, bot creates minimal linked work and edition records (and adds them to a worklist for Books project or a new References project volunteers to review?). - PKM (talk) 19:23, 18 November 2017 (UTC)[reply]
  • This is very important for the use of Wikidata in enwp. Mike Peel (talk) 20:59, 18 November 2017 (UTC)[reply]
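PKM's companion-bot workflow can be sketched as follows. The lookup tables stand in for real Wikidata queries, and every item ID here is hypothetical; only the control flow (edition hit, work-only hit, full miss) is the point.

```python
import itertools

def normalize_isbn(raw):
    """Strip hyphens and spaces from an ISBN (no checksum validation here)."""
    return raw.replace("-", "").replace(" ", "").upper()

# Toy in-memory stand-ins for Wikidata lookups; all IDs are invented.
editions_by_isbn = {"9780140449136": "Q_edition1"}
works_by_key = {("Homer", "The Odyssey"): "Q_work1"}
_new_ids = (f"Q_new{n}" for n in itertools.count(1))  # simulates item creation

def resolve_reference_isbn(isbn, author, title):
    """Sketch of the proposed bot: given a <reference ISBN>, return the
    matching edition item for a <stated in> reference, creating a minimal
    linked work and/or edition when no match exists."""
    isbn = normalize_isbn(isbn)
    if isbn in editions_by_isbn:              # edition already exists
        return editions_by_isbn[isbn]
    work = works_by_key.get((author, title))
    if work is None:                          # create minimal work item
        work = next(_new_ids)
        works_by_key[(author, title)] = work
    edition = next(_new_ids)                  # create edition linked to work
    editions_by_isbn[isbn] = edition
    return edition
```

Newly created minimal items would, as suggested above, go on a worklist for the Books project to review.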

Voting

One click reference tool

  • Problem: Creating reference items on Wikidata is lengthy and complicated. As a result it is rarely done, and is replaced by substandard referencing (stated in: an item about a database, or reference URL: a bare URL) or no reference at all.
  • Who would benefit: Wikidata editors, Wikidata users (including users of Wikidata-powered infoboxes)
  • Proposed solution: Create a Citoid-like one-click reference tool that will create the reference item automatically (and check whether a reference item for the same source already exists).
  • More comments:
  • Phabricator tickets:

Discussion

I agree. Can I close this and we just focus on the other? Anything here you'd like to add to the other one? This will help get more votes in one place instead of having them split over two places. -- NKohli (WMF) (talk) 19:11, 21 November 2017 (UTC)[reply]
FWIW, I just read the other proposal before this one, and was ready to vote on this one although I didn't vote on the other one. I suppose it may be that the language of this proposal is clearer than the other one's regarding the benefits for editors like me. If that clarity isn't preserved through a merge (including possibly a more descriptive title), the total number of votes could end up actually reduced rather than increased. --Waldir (talk) 11:45, 3 December 2017 (UTC)[reply]
This one is a bit broader. Yes, Citoid may be the one-click reference tool, but it could be another tool as well. As mentioned above, Citoid may be unknown even to experienced editors, so they do not know its functions.--Jklamo (talk) 16:48, 6 December 2017 (UTC)[reply]

Voting

Interface to easily add pages linked through identifiers as a "reference" to a statement

  • Problem: Many items have a large number of identifiers pointing to information in other databases. Those links often serve as references for statements in Wikidata. There should be a way for a user to quickly add a specific identifier as a reference to a statement, without typing, cutting and pasting from multiple places.
  • Who would benefit: Wikidata maintainers and users
  • Proposed solution: There are several options: build it into the Wikidata interface, write a new gadget, or extend an existing gadget.
  • More comments:
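Whatever the interface, such a gadget would ultimately have to emit Wikibase reference JSON. A sketch, assuming the reference is built from stated in (P248) plus the identifier statement itself; P214 (VIAF ID) and Q54919 (VIAF) are used as the example, and the ID value is just a sample string:

```python
def item_snak(prop, qid):
    """A Wikibase snak whose value is an item (wikibase-entityid)."""
    return {"snaktype": "value", "property": prop,
            "datavalue": {"type": "wikibase-entityid",
                          "value": {"entity-type": "item",
                                    "numeric-id": int(qid[1:]), "id": qid}}}

def string_snak(prop, text):
    """A Wikibase snak with a string value (external IDs use strings)."""
    return {"snaktype": "value", "property": prop,
            "datavalue": {"type": "string", "value": text}}

def reference_from_identifier(id_prop, id_value, database_qid):
    """Build the reference block a gadget could attach to a statement:
    stated in (P248) the database item, plus the identifier itself."""
    return {"snaks": {"P248": [item_snak("P248", database_qid)],
                      id_prop: [string_snak(id_prop, id_value)]},
            "snaks-order": ["P248", id_prop]}

ref = reference_from_identifier("P214", "113230702", "Q54919")
```

The gadget would then post `ref` through the wbsetreference API module rather than making the user retype anything.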

Discussion

  • I think a bot could help with this. If for example I add <stated in> = "GNIS" to an item, the bot would automatically pull the <GNIS ID> for the item and append it to the reference. If the reference is to a GNIS ID other than the one for the item, of course the editor would need to manually add that ID. This would work for any identifier. - PKM (talk) 21:04, 15 November 2017 (UTC)[reply]
I am fine with a bot to fix incomplete references but even adding <stated in> = "GNIS" takes some cutting and pasting. Also unless I am running the bot, I like to leave a page in a state I would like to find it if someone else was editing it. --Jarekt (talk) 21:23, 15 November 2017 (UTC)[reply]
It would be nice to be able to drag-and-drop an identifier to a statement and have it added as a properly structured reference. I’d also like to be able to drag-and-drop a <described by source> or <described at URL> statement and have it “dropped” as a properly structured reference. - PKM (talk) 04:30, 18 November 2017 (UTC)[reply]

Voting

QR codes for all items

  • Problem: Current QR coding requires copying and pasting individual URLs from a primary-language Wikipedia, then downloading the generated PNG files to a hard drive for use in the artwork of the signage. If an article later gets renamed, the QR code breaks, incurring a cost to the end user to recreate the signage, which can be significant. It also impacts the trust and relationships necessary for the long-term support required.
  • Who would benefit: Affiliates making WikiTowns and QR projects; it would also enable GLAMs to use the QR codes to access Wikimedia information directly in displays.
  • Proposed solution: Create a bot to make QR codes for all items.
  • More comments: Wikidata item IDs are static and don't change. A code based on an item is able to access all associations, both internal and external, in all available languages on the subject, which expands Wikidata's role as a central data hub. This would also facilitate the creation of a WMF-based QR reader, which would mean people won't be using commercial ad-based services and would keep the user within the safe WMF environment, a plus for schools. Additionally, it can use Wikivoyage journeys and Wikisource material as part of the whole experience.
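A sketch of the batch-generation idea: URLs are built from the stable Q-ids, so article renames can't break the codes. The optional third-party `qrcode` package does the actual image generation (its `qrcode.make(...).save(...)` API is real); everything else is stdlib, and the `uselang` parameter is standard MediaWiki.

```python
def item_url(qid, lang=None):
    """Stable per-item URL; the Q-id survives article renames."""
    url = f"https://www.wikidata.org/wiki/{qid}"
    return url if lang is None else f"{url}?uselang={lang}"

try:
    import qrcode  # third-party package, if available
except ImportError:
    qrcode = None

def make_codes(qids, outdir="."):
    """Generate one PNG per item (a no-op sketch when qrcode is absent)."""
    for qid in qids:
        if qrcode is not None:
            qrcode.make(item_url(qid)).save(f"{outdir}/{qid}.png")
```

A bot run over a list of items would give a GLAM the whole batch of PNGs in one pass, instead of one copy-paste per sign.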

Discussion

There are two things here: one to make a QR code for a Wikidata item (phab:T65463); the other to be able to download a batch of these en masse. That's right, isn't it? Sam Wilson 10:16, 20 November 2017 (UTC)[reply]

Yes and no: if the QR code is created automatically by a bot and retained as part of a data item, then it becomes possible to download the codes as needed. Gnangarra (talk) 10:21, 20 November 2017 (UTC)[reply]
  • I missed the deadline (I wanted to support it) but I would like to add something. This proposal would be extremely useful for non-Latin languages where QRpedia codes are not usable. Here is a real-life example: File:QRpedia code in Odessa - Bristol Hotel - 2.jpg. This is a good code but not quite usable. What we would need is a code a) that would definitely keep its link with an item even if the item is merged, and b) that allows choosing the user's default device language (if available) or otherwise falls back to a pre-defined local language (e.g. show a page in French in France if the language of the device is not available). As far as I can see these codes would be way easier than the current ones — NickK (talk) 18:15, 11 December 2017 (UTC)[reply]

@Liuxinyu970226, David1010, Thomas Obermair 4, Bspf, Donald Trung, TheDJ, Giovanni Alfredo Garciliano Diaz, Daylen, Gikü, Walter Klosse, Theklan, Gnangarra, BugWarp, Iliev, Samwilson, and NickK:
I made a tool for this: https://tools.wmflabs.org/portal/ , see Free Knowledge Portal for further details or to discuss it. - Evad37 (talk) 00:35, 11 February 2018 (UTC)[reply]

Voting

Expand automatic edit summaries

  • Problem: When one's watchlist is set to display edits made on linked statements on Wikidata, they are always displayed in numerical code even if labels exist on the Wikidata entries. For example, this diff on enWikipedia's watchlist displays as "Created claim: Property:P4552: Q5456; Added reference to claim: Property:P4552: Q5456", whereas on Wikidata it's two diffs with two edit summaries, "Added reference to claim: mountain range (P4552): Andes (Q5456)" and "Created claim: mountain range (P4552): Andes (Q5456)".
  • Who would benefit: People who use their watchlist on a non-Wikidata project to monitor changes to the Wikidata item linked to an article they have watchlisted. On enWikipedia some templates draw information from Wikidata so making it easy to monitor the edit content may be beneficial.
  • Proposed solution: The watchlist should display the language label, if it exists, in lieu of the numerical code; in this case the summary should be "Created claim: Property:mountain range: Andes; Added reference to claim: Property:mountain range: Andes", perhaps with the "Property" omitted if it makes the summary overlong.
  • More comments: I hope I didn't send this in too late.
  • Phabricator tickets: phab:T108688; phab:T171027 may be worth paying attention to since it's a technical issue that could impact on this project.
  • Proposer: Jo-Jo Eumerus (talk, contributions) 10:31, 20 November 2017 (UTC)[reply]
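The substitution itself is straightforward once labels are available; a sketch, assuming the label map has already been fetched (in practice via the wbgetentities API with props=labels):

```python
import re

def humanize_summary(summary, labels):
    """Replace bare P/Q ids in a cross-wiki edit summary with
    'label (ID)', the way Wikidata itself renders them."""
    def sub(match):
        eid = match.group(0)
        return f"{labels[eid]} ({eid})" if eid in labels else eid
    return re.sub(r"\b[PQ]\d+\b", sub, summary)

labels = {"P4552": "mountain range", "Q5456": "Andes"}
print(humanize_summary("Created claim: Property:P4552: Q5456", labels))
# → Created claim: Property:mountain range (P4552): Andes (Q5456)
```

IDs without a label in the viewer's language fall through unchanged, which matches the proposal's "if it exists" condition.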

Discussion

Voting

Display reference in edit summary when a reference is added

  • Problem: The edit summary for this diff does display that a reference was added, but not which reference it is. References can be unreliable, spam links etc. so having them be easy to monitor is desirable.
  • Who would benefit: People who patrol Wikidata items for problematic edits, since the content of the diff is immediately displayed.
  • Proposed solution: Add the content of the reference to the edit summary; in this case it would be "Added reference (imported from:English Wikipedia) to claim: mountain range (P4552): Andes (Q5456)"
  • More comments: I hope that this isn't too late. This feature would be useful as well if it displayed in crosswiki watchlists. There may be length issues when the reference is long.
  • Phabricator tickets:

Discussion

Voting

Recognize .djvu file as wikisource index file

  • Problem: Creating texts on Wikisource is a multi-step process. The biggest and most important step is probably just getting the source file onto Commons. Then there are several more steps to get the file into Wikisource (in whichever language). Currently on Wikidata, in the P18 property for image files, the file is recognized automatically while the user types in the Commons filename. There should be a similar effect in reference urls if the file is on Commons and has been added to any Wikisource project. See e.g. these two items, that both reference articles from the same dictionary of biography that is currently on English Wikisource: d:Q38103276 and d:Q43194364. Currently only the first one has an associated article from the Wikisource file at d:Q38103904.
  • Who would benefit: Anyone contributing to Wikidata and adding references from Commons or via Commons, Wikisource.
  • Proposed solution: Make the Commons link obligatory when adding Wikisource articles to Wikidata, so that they can be "recognized" on the basis of the .djvu filename. This would dissolve the language silos that keep this information unobtainable for reference contributors.
  • More comments:
  • Phabricator tickets:

Discussion

  • @Jane023: maybe I'm missing your point, but there is something strange with your links to Wikidata; could you check them? (d:Q38103276 and d:Q43194364 are not biography entries but items about the actual people, instances of Q5, and I linked Mary H. Graves, the woman, to Mary H. Graves, the dictionary entry)
That is by design: those items reference the same djvu file used on Wikisource. Jane023 (talk) 11:38, 30 November 2017 (UTC)[reply]
I didn't ask for anything to be obligatory. I would like the links to be recognized for what they are (like sitelinks are recognized when you type them in). Jane023 (talk) 12:29, 2 December 2017 (UTC)[reply]

Voting

Have calendar converter for input dates

  • Problem: Currently the only calendar for inputting dates in Wikidata is the Gregorian calendar. There are other local calendars supported by MediaWiki (Arabic, Persian, Hebrew, Thai solar, Minguo, Japanese nengo). Some articles' dates are based on these calendars, and users have to convert them to Gregorian before entering them in Wikidata, which is difficult.
  • Who would benefit: Wikidata users, who could simply enter dates based on their local calendar
  • Proposed solution: Develop a gadget which converts these calendars to Gregorian for input into Wikidata. This does not propose changing the database's date format; it only suggests a converter gadget.
  • More comments:
  • Phabricator tickets:
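For two of the listed calendars the conversion really is trivial, since they are fixed year offsets from the Gregorian calendar; a sketch (assuming the modern, post-1941 Thai solar convention, where months align with Gregorian months):

```python
from datetime import date

# Thai solar year = CE + 543 (for dates since 1941, when the year was
# aligned to January-December); Minguo (ROC) year = CE - 1911.
def thai_solar_to_gregorian(year, month, day):
    return date(year - 543, month, day)

def minguo_to_gregorian(year, month, day):
    return date(year + 1911, month, day)

print(thai_solar_to_gregorian(2560, 11, 10))  # 2017-11-10
print(minguo_to_gregorian(106, 11, 10))       # 2017-11-10
```

The Arabic, Persian and Hebrew calendars have variable month lengths and need full arithmetic algorithms, which is exactly why a shared, well-tested gadget would help.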

Discussion

I think it would be better to allow data in different calendars to be input into the form as-is, instead of having a program to convert them. Some calendars were used differently in different eras, and people in different places also followed different rules, so conversion might cause some trouble.... C933103 (talk) 09:05, 10 November 2017 (UTC)[reply]

In my opinion, having a unique input type for the database is a good idea; different calendars create difficulties for bots and external data users, so I support a simple converter. 10:10, 10 November 2017 (UTC)
Both your views can be reconciled; it only requires that both data be stored side by side. From a traceability point of view, it is clearly better to keep the reported data as close as possible to the source it comes from. Calendars are a difficult topic; usage across documents is nowhere close to linear, synchronized, trustworthy information. On the other hand, a simple converter feeding a separate field provides a convenient way to build some interesting aggregations that are fine with such a naive approach, which has both its pros and cons. So it would be more prudent to expose dates in several flavours of accuracy, with explicit qualifiers for each. --Psychoslave (talk) 10:37, 12 November 2017 (UTC)[reply]

I think this is a great idea and could make editing simpler for users not used to the Gregorian calendar. NMaia (talk) 16:57, 10 November 2017 (UTC)[reply]

+1 on NMaia, this is definitely something that needs some love from some programmer(s).--Sannita - not just another it.wiki sysop 10:59, 13 November 2017 (UTC)[reply]

I do not like the idea of storing the dates differently based on a calendar. A point in time is a point in time, no matter which calendar it was specified in by the source. Of course a reference needs to specify the calendar in which the date was given by the source. So I support user:Yamaha5's idea of date input in other calendars, but oppose the idea of storing dates in multiple formats side by side, as is done in d:Q22687867 or d:Q165671. --Jarekt (talk) 18:44, 15 November 2017 (UTC)[reply]

If the proposal were accepted, the developers should carefully determine the earliest past date and the date furthest in the future for which the converter can convert accurately. The converter should refuse to convert any date outside the range for which the converter has been rigorously tested. This will require an understanding of history as well as computer programming. Jc3s5h (talk) 22:36, 15 November 2017 (UTC)[reply]

MediaWiki already supports this conversion (see the Non-Gregorian calendars subsection of the table); it only needs to use the MediaWiki code! 5.53.51.163 04:47, 21 November 2017 (UTC)[reply]

Voting

Create new item based on another item

  • Problem: Every time an item is created, we have to re-enter all the information by hand. We could use another item as a template when creating, cloning it and then changing the relevant information (like, for example, coordinates).
  • Who would benefit: Wikidata users who don't make extensive use of bots or tools. Data would also be more coherent this way.
  • Proposed solution: Have a clone button on each item.
  • More comments:
  • Phabricator tickets:
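A sketch of what a clone button would have to do to an item's JSON before reposting it through the wbeditentity API with new=item. The sample entity is abbreviated and hypothetical; the two real constraints it encodes are that sitelinks must be dropped (a wiki page may link to only one item) and that per-statement GUIDs must be removed so the API mints fresh ones.

```python
import copy

# Abbreviated, hypothetical entity JSON as returned by wbgetentities.
sample = {
    "id": "Q_src", "ns": 0, "title": "Q_src", "lastrevid": 123,
    "labels": {"en": {"language": "en", "value": "Example church"}},
    "sitelinks": {"enwiki": {"site": "enwiki", "title": "Example church"}},
    "claims": {"P31": [{
        "id": "Q_src$11111111-2222-3333-4444-555555555555",  # statement GUID
        "type": "statement", "rank": "normal",
        "mainsnak": {"snaktype": "value", "property": "P31",
                     "datavalue": {"type": "wikibase-entityid",
                                   "value": {"entity-type": "item",
                                             "id": "Q16970"}}}}]},
}

def clone_entity_data(entity):
    """Prepare a copy of one item's JSON for wbeditentity&new=item:
    drop page metadata and sitelinks, strip statement GUIDs, and keep
    labels and claims for the editor to adjust (e.g. coordinates)."""
    data = copy.deepcopy(entity)
    for key in ("id", "pageid", "ns", "title", "lastrevid", "modified",
                "sitelinks"):
        data.pop(key, None)
    for statements in data.get("claims", {}).values():
        for statement in statements:
            statement.pop("id", None)
    return data
```

As noted in the discussion, presenting the copy for review before saving would avoid new editors cloning a whole item by accident.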

Discussion

There is a script for this: d:User:Magnus_Manske/duplicate_item.js. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:49, 27 November 2017 (UTC)[reply]

In fact I would prefer to have template, but this can be a first solution. --Dom (talk) 19:35, 30 November 2017 (UTC)[reply]

Personally I agree with Dom. We can go with some basic templates. Copying the whole item can be a disaster with new editors. --Nahid Hossain (talk) 00:38, 3 December 2017 (UTC)[reply]

Voting

Show Interlanguage links on some pages automatically

  • Problem: Sometimes a user needs the counterpart (in another language) of a page (like the talk pages of d:Q11214943), but such pages are not notable according to d:Wikidata:Notability
  • Who would benefit: All
  • Proposed solution: Show interlanguage links on these pages automatically: talk pages (depending on the original page's item), MediaWiki pages, user pages, creator pages (depending on d:Property:P1472), and special pages
  • More comments:

Discussion

Use old-style interwiki links? C933103 (talk) 03:43, 12 November 2017 (UTC)[reply]

@C933103:No, but make them appear automatically --ديفيد عادل وهبة خليل 2 (talk) 08:45, 12 November 2017 (UTC)[reply]
So how? C933103 (talk) 20:57, 12 November 2017 (UTC)[reply]

Wouldn't it be better to change the notability standards of Wikidata then? Personally I would say that we should abolish the notability standards, as they lead to conflicts like the above, but I don't see why local links 🔗 would solve this issue. --Donald Trung (Talk 🤳🏻) (My global lock 🔒) (My global unlock 🔓) 13:14, 29 November 2017 (UTC)[reply]

Perhaps this is slightly different, but I often want to read the same search-word pages in other languages, as content differs between Swedish, German and English articles on the same topic. Which is fine, as for example historical persons are relevant in different ways depending on your national / cultural viewpoint. But if you have the language skill, reading articles in other languages often broadens your outlook on things. — The preceding unsigned comment was added by Bertiloman (talk) 09:43, 10 December 2017 (UTC)[reply]

Voting

Automatically follow the target items of statements and the property that gets used

  • Problem: Central pages on Wikidata that are linked from many places, where vandalism affects many users, don't have enough watchers. This means vandalism doesn't get reverted as fast as it would if more people had the items on their watchlist.
  • Who would benefit: Any Wikidata user, and data reusers who want Wikidata to be faster at reverting vandalism.
  • Proposed solution: If I add the statement "Earth (Q2) highest point (P610) Mount Everest (Q513)", then in addition to adding "Earth (Q2)" to the watchlist, "highest point (P610)" and "Mount Everest (Q513)" should also be added. Of course, there should be an option in preferences to deactivate this feature.
  • More comments:

Discussion

Nice idea, but this seems to me like it should be opt-in. Not everyone who uses Wikidata wants to spend time fighting vandalism - and I imagine it would be kind of strange to see that multiple pages have been added to their watchlist when someone chooses to watch just one page - especially as a default behaviour. -- numbermaniac 07:00, 9 November 2017 (UTC)[reply]

Fighting vandalism isn't the only effect of following central pages. It also means that it's possible to have conversations about them. Currently, it's hard to get a conversation between different people who use a property like "highest point" to clarify its usage, but I'm open about whether or not this is enabled by default. Ideally I would like to see an A/B test for whether it makes sense to enable or disable such a feature by default. ChristianKl (talk) 16:07, 9 November 2017 (UTC)[reply]

Voting

Semi-automatic description generation

  • Problem: Currently, many descriptions created by bots can be derived and introduced logically using P31 and other properties (e.g. "Wikimedia disambiguation page", "Wikimedia category", "scientific journal", "scientific journal article", "species of animal", "protein-coding gene in the species Homo sapiens", etc.). In many cases, once bots have created descriptions, the items escape the bots' control, so it is very troublesome to add translations in a new language to all of the existing items, or to delete descriptions in all languages when they are logically wrong.
  • Who would benefit: All editors, especially using minor languages.
  • Proposed solution: Triggered by the creation or updating of a P31 (or other utilized property) claim, the database system generates the default description according to centrally managed, manually defined patterns for each language. If the description is empty, the system shows the default description. If a pattern for the corresponding language is undefined, the system shows nothing.
  • More comments: With this new function, descriptions in Wikidata will be easily maintainable.
  • Phabricator tickets:
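The pattern lookup itself can be sketched very simply; the table entries are illustrative (Q4167410 is the real class for Wikimedia disambiguation pages, while the German wording here is only an example translation):

```python
# Centrally managed, manually defined patterns:
# (P31 target item, language code) -> default description.
PATTERNS = {
    ("Q4167410", "en"): "Wikimedia disambiguation page",
    ("Q4167410", "de"): "Wikimedia-Begriffsklärungsseite",
    ("Q13442814", "en"): "scholarly article",
}

def default_description(p31_values, lang, manual=None):
    """Return the manually set description if present, else the first
    pattern matching one of the item's instance-of values, else None
    (the 'system shows nothing' case)."""
    if manual:
        return manual
    for qid in p31_values:
        if (qid, lang) in PATTERNS:
            return PATTERNS[(qid, lang)]
    return None
```

Because the pattern is evaluated at display time, adding one translation to the central table would immediately cover every matching item, which is the maintainability win the proposal describes.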

Discussion

Voting

Create a new class of statements which are automatically generated based on a query

  • Problem: Many properties related to an item are stored in other items and are presently very hard to access for templates using Wikidata. For example:
  • taxon items have property parent taxon (P171). If I have an infobox that shows genus, family or order of a given organism I need a way to move up a chain of P171s until some rank is met.
  • people items have property place of death (P20) that stores most exact item related to place of death, which could be a house, street, hospital, neighborhood, etc. If I have an infobox that shows place of death of a person I usually need city or town where someone died. A query to look up a city of death of Pyotr Tchaikovsky is SELECT DISTINCT ?city { ?city ^(wdt:P20/wdt:P131*) wd:Q7315; wdt:P31/wdt:P279* wd:Q515 . }. It is very hard to access that information using Lua calls and it is totally not accessible through {{#statement:...}} calls. Similar issue would be for "Country of birth" or "Country of death".
  • We have several properties which have inverse constraints, for example mother (P25) / child (P40). We could retire the child (P40) property and automatically calculate it from the mother (P25) property. That would allow us to keep the information in only one place.
  • Who would benefit: users of the wikidata, infobox writers, maintainers of the wikidata
  • Proposed solution: Create infrastructure to allow read-only properties which are not directly editable but precomputed based on some SPARQL query and other properties and items. Users would see and access them in a way similar to the current properties.
  • More comments:
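The chain-following logic behind such a precomputed property can be sketched with toy data; every ID except Q515 (city) is hypothetical, and a real implementation would read from the query service, handle the P279 subclass closure, and cache the computed value rather than walk dictionaries:

```python
# Illustrative stand-ins for Wikidata statements.
place_of_death = {"Q_person": "Q_hospital"}                         # P20
located_in = {"Q_hospital": "Q_district", "Q_district": "Q_city"}   # P131
instance_of = {"Q_hospital": "Q_building", "Q_district": "Q_borough",
               "Q_city": "Q515"}                                    # P31

def city_of_death(person):
    """Precompute the derived 'city of death' value by walking the
    P131 (located in) chain upward from the P20 place of death until
    an instance of city (Q515) is found."""
    place = place_of_death.get(person)
    while place is not None:
        if instance_of.get(place) == "Q515":
            return place
        place = located_in.get(place)
    return None
```

An infobox could then read the cached result with a single property access instead of the multi-item Lua traversal described above.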

Discussion

  • This sounds like adding a reasoning layer to wikidata query service. Maybe should be a separate instance, but in principle this sounds like it could be a good idea. Or by pre-compute do you mean to re-compute these things every time something changes? That might be a lot trickier (do you update every location in a country if the country name is changed?) ArthurPSmith (talk) 20:22, 15 November 2017 (UTC)[reply]
    I do not know how often one would have to refresh such pre-computed statements, but whatever it is, I am sure it will be more efficient than the current state of infobox templates doing, in Lua, operations equivalent to SELECT DISTINCT ?city { ?city ^(wdt:P20/wdt:P131*) wd:Q7315; wdt:P31/wdt:P279* wd:Q515 . }, just to get a city of death. I was looking into doing it for the c:Module:Creator infobox I maintain, and was advised by others who had already implemented it; but given the potential of loading multiple items to get this one piece of information, I figured there has to be a better solution. --Jarekt (talk) 21:31, 15 November 2017 (UTC)[reply]
  • I don't know that we need a new class of statements--what we would need would be to be able to say "this property can be followed" in the data exposed to the API, have the API follow it, until such time as it cannot be followed or it hits some predefined limit (by an editor maybe), or some such. --Izno (talk) 03:41, 16 November 2017 (UTC)[reply]
  • @Jarekt:, the problem here is that we're unlikely to have a Wikidata Query Service cluster powerful enough for what's proposed here until late 2018, so everything involving SPARQL queries falls outside of the scope of this proposal. See also my rejection of a somewhat related proposal here - SPARQL queries can be very slow and it's completely out of question that we allow them to slow down page viewing/editing. I'm not archiving this proposal, however, to see if a limited simpler solution that doesn't involve WDQS can be viable. Max Semenik (talk) 09:01, 24 November 2017 (UTC)[reply]
Max Semenik I understand that the proposal might not be technically feasible, but if nothing else we can figure out how many people think it is a good idea. I was imagining that the values would be precomputed and stored or cached. I was also imagining that it might save time: when I asked how to access the city of birth using Lua, I was told about modules that already do the equivalent of SELECT DISTINCT ?city { ?city ^(wdt:P20/wdt:P131*) wd:Q7315; wdt:P31/wdt:P279* wd:Q515 . } in Lua. However, since that would require loading many items for a single piece of information, precomputing seemed like a more sane solution.

Voting

Stop making false claims about dates

Discussion

  • It is currently impossible for an editor to enter an accurate birth date for anyone who wasn't born where the offset from UT was 0 at the place of birth (the UK in winter, for example). It is beyond the ability of editors to fix this. The developers should allow dates in local time to be stored.
Let me give an example. I read in a reliable source that a certain person was born in Australia on July 15, 1975. The date of birth is given, but not the time of day. If the person was born at 11 PM local time, the UT birth date is July 15. But if the person were born at 1 AM local time, the UT birth date is July 14, because local time in Australia is ahead of UT. The editor can't solve this; it's on the developers. Jc3s5h (talk) 13:09, 16 November 2017 (UTC) edited for clarity 15:11 18 November 2017 (UT)[reply]
  • This is a symptom of a more fundamental issue: it's not possible to store full datetime values in Wikidata; you can only store the date, and it is assumed that the time is 0h UTC. It would be much better to allow the full datetime to be edited, so that the hour/minute/second can be specified as well as the day. Thanks. Mike Peel (talk) 20:47, 18 November 2017 (UTC)
  • For most time properties we have, like date of birth or death, you are not going to find sources with dates more precise than a day. Some obituaries might give a time of death, but it would be provided in local time, with whatever daylight saving adjustment was proper for the place and era. So a lot of dates would be hard to convert to Universal Time even if you knew the time. --Jarekt (talk) 12:56, 20 November 2017 (UTC)
  • With some rare exceptions, all dates are timezone-less. We export them as dateTime, but in fact virtually all of them are just dates, with no time. Since the base JSON model still pretends they are date-times, so does the RDF model, but maybe it's time to refine and change that. What I am absolutely opposed to, and think is the worst thing we could ever do, is start converting dates between "local timezone" and "UTC". That would lead to utter insanity: we do not have historical data on timezones, and even if we had it and somehow managed to make all the data conversions work (which we, with 99.9999% probability, can't), it would turn all our data into utter junk, because nobody ever cared what the date was in Greenwich, UK at the time a certain event happened - unless that event happened in Greenwich, UK. What everybody cares about is what the date was in the place where the event happened, and that's the only thing we should ever deal with. There should absolutely be no such thing as a "UT birth date". What we have now is that we deal with dates right, but record them wrong - we pretend they are date-times in UTC, when they are just dates, without time. That is something we may want to change - and it probably requires wider discussion in the community. It's certainly not a developer question until we decide to remove the pretense of having a "time" part in dates - at which point, yes, the developers should take note.
However, this may make date-times and dates (if we ever have proper date-times) incomparable - which they effectively are anyway, absent historic timezone data going back to the beginning of the universe - but that may be inconvenient for practical reasons. Or we decide to give up on times altogether (given that we haven't used them for years now and are still fine) and eliminate times from the data model.
In any case, I support starting a proper discussion on this. Laboramus (talk) 21:05, 28 November 2017 (UTC)
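For readers unfamiliar with the data model under discussion: a Wikidata time value, as exported in JSON, looks roughly like the sketch below (the specific date is illustrative). The `precision` field is what makes the stored `T00:00:00Z` part a placeholder for ordinary day-precision dates - the "pretending dates are date-times" problem described above:

```python
# Approximate shape of a Wikidata time value in the JSON export.
value = {
    "time": "+1975-07-15T00:00:00Z",
    "timezone": 0,       # always 0 in practice
    "before": 0,
    "after": 0,
    "precision": 11,     # 9 = year, 10 = month, 11 = day, 12 = hour, ...
    "calendarmodel": "http://www.wikidata.org/entity/Q1985727",  # proleptic Gregorian
}

# At precision 11 only the date part is significant; the time part
# carries no information despite being serialized.
date_part, time_part = value["time"].lstrip("+").split("T")
print(date_part)  # 1975-07-15
print(time_part)  # 00:00:00Z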

Voting

Expand QuickStatements to allow wider variety of statements

  • Problem: QuickStatements is a vital part of a lot of work on Wikidata. It has a broad range of capabilities; however, from time to time you run into types of statements that cannot be added by QuickStatements. For example:
  • setting and changing ranks for statements
  • specifying calendar for a date statement
  • specifying precision or globe for a location statement
  • etc.
  • Who would benefit: Wikidata users who are using QuickStatements and would like to be able to do more with it.
  • Proposed solution: Expand capabilities of the tool.
  • More comments:
  • Phabricator tickets:
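For context, a QuickStatements (V1) batch is tab-separated plain text, one statement per line. The lines below are illustrative (Q4115189 is the Wikidata sandbox item) and sketch what the documented syntax can express - a plain item-valued claim, a date with a day-precision suffix, and a coordinate:

```
Q4115189	P31	Q5
Q4115189	P569	+1975-07-15T00:00:00Z/11
Q4115189	P625	@52.516666/13.383333
```

There is no comparable syntax for setting a statement's rank, for choosing the calendar model of a date, or for the globe and precision of a coordinate, which is the gap this proposal asks to close.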

Discussion

If I remember correctly, it's also impossible to add "no value" claims via QuickStatements, for example, South Pole: country: no value. Kaldari (talk) 18:44, 21 November 2017 (UTC)

Voting

Further functionality for "statement" parser function

  • Problem: When a page on another Wikimedia project wants to use data from Wikidata, bespoke Lua scripts are needed for anything more complicated than getting the main object of a property. This has led to a proliferation of Lua scripts that do the same task in different Wikimedia projects.
  • Who would benefit: Wikimedia projects that want to use Wikidata, particularly small Wikipedias
  • Proposed solution: Extend the current {{#statements:}} parser function to accommodate qualifiers and sources. This will cover most standard use cases of Wikidata in infoboxes.
  • Phabricator tickets:
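As a point of reference, the parser function today takes a property and an optional source item. The wikitext below is illustrative (P69 is "educated at", Q42 is Douglas Adams); the first line is existing syntax, while the second is one purely hypothetical shape the proposed qualifier support could take, not an existing feature:

```
{{#statements:P69|from=Q42}}
{{#statements:P69|from=Q42|qualifier=P582}}  <!-- hypothetical -->
```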

Discussion

  • I would love to not only support this, but actually work on it. Unfortunately this issue's description is extremely vague. What exactly do you mean when you write "accommodate qualifiers and sources"? Should the parser function also output qualifiers, references, or both? Should it accept them as filters? How do you expect this to look in wikitext? --Thiemo Kreuz (WMDE) (talk) 16:42, 28 November 2017 (UTC)
  •   Comment I doubt you can squeeze more out of {{#statements}}, but I think we should have more Lua libraries shared among all the projects allowing access to Wikidata statements. For example, my c:Module:Wikidata date is used on Commons for formatting date statements in any language. It would be good to share such modules across other projects, so more people can improve them. --Jarekt (talk) 14:06, 4 December 2017 (UTC)

Voting