Community Wishlist Survey 2019/Admins and patrollers/Smarter undo function

Smarter undo function

  • Problem: When reverting vandalism or undoing problematic past edits, the undo function frequently refuses to work on edits that the software should be able to recognize and revert cleanly.
  • Who would benefit: People reverting vandalism or undoing old, previously missed vandalism.
  • Proposed solution: I'm not very technical, but it seems there should be a way to build a smarter undo function that recognizes when a block of text added by an edit has no interleaved text from later edits that would conflict with removing it, and removes it without otherwise touching the rest of the article. Currently, the undo function often fails even when all subsequent changes are to unrelated parts of the article (a sketch of such a check follows this list).
  • More comments:
  • Phabricator tickets:
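
A minimal sketch, in Python, of the kind of check the proposal describes (MediaWiki itself is written in PHP and its real undo logic is more involved; can_undo and the three-revision interface here are hypothetical): an edit is undoable when the region it changed does not overlap any region changed by later edits.

    import difflib

    def can_undo(parent: str, edited: str, current: str) -> bool:
        """Return True if the edit (parent -> edited) only touched
        regions that later edits (edited -> current) left alone."""
        # Line ranges in `edited` that the target edit changed.
        touched = [
            (j1, j2)
            for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(
                None, parent.splitlines(), edited.splitlines()).get_opcodes()
            if tag != "equal"
        ]
        # Line ranges in `edited` that later edits changed.
        changed_later = [
            (i1, i2)
            for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(
                None, edited.splitlines(), current.splitlines()).get_opcodes()
            if tag != "equal"
        ]
        # Safe to undo only if no touched region overlaps a later change.
        return not any(a1 < b2 and b1 < a2
                       for a1, a2 in touched
                       for b1, b2 in changed_later)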

Discussion

I already mentioned and linked that subsequent diff in my comment. The point is that the system could get a little smarter in such cases. Regards, HaeB (talk) 09:22, 9 November 2018 (UTC)

Would Community Wishlist Survey 2019/Bots and gadgets/Machine readable diffs help here? If MediaWiki can generate diffs in a format it can easily understand, then maybe it becomes easier to make a finer grained undo that looks into each paragraph. MER-C (talk) 19:25, 15 November 2018 (UTC)
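As an illustration of what a machine-readable diff could enable (a sketch only; this is not MediaWiki's actual diff format): a diff emitted as structured data instead of rendered HTML lets a tool decide, per paragraph, whether a change can be reverted on its own.

    import difflib
    import json

    def paragraph_diff(old: str, new: str) -> list:
        """Diff two revisions at paragraph granularity and return a
        machine-readable list of operations instead of rendered HTML."""
        a = old.split("\n\n")
        b = new.split("\n\n")
        ops = []
        for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, a, b).get_opcodes():
            if tag != "equal":
                ops.append({"op": tag, "old": a[i1:i2], "new": b[j1:j2]})
        return ops

    print(json.dumps(
        paragraph_diff("Lead section.\n\nSecond paragraph.",
                       "Lead section.\n\nSecond paragraph, expanded."),
        indent=2))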

Agree with MER-C. The async wiki grant proposal I linked to from the machine-readable diffs discussion uses Git, which by almost all accounts does about as good a job as can reasonably be done of merging changes. If MediaWiki could do this natively, we'd all manually merge fewer edits, and all editing would be a bit faster and less frustrating. HLHJ (talk) 04:10, 22 November 2018 (UTC)
Git is actually no better than MediaWiki here: it merges two changes only if they did not touch the same or neighbouring lines. That works pretty well for programming, where lines tend to be the fundamental unit of meaning, but poorly for wikitext. So one approach is making the diff algorithm non-line-based - probably do some kind of sentence segmentation and then use sentences as the units of diffing. I think this is what HaeB was implying as well. --Tgr (talk) 04:52, 25 November 2018 (UTC)
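
A sketch of that idea (the regex segmentation is deliberately naive; a production version would need real sentence boundary detection and wikitext awareness): use sentences rather than lines as the unit of comparison, so that edits to different sentences of the same paragraph no longer collide.

    import difflib
    import re

    def sentences(text: str) -> list:
        # Naive segmentation: split after '.', '!' or '?' followed by whitespace.
        return re.split(r"(?<=[.!?])\s+", text.strip())

    def sentence_diff(old: str, new: str):
        """Yield (tag, old_sentences, new_sentences), with sentences
        rather than lines as the unit of comparison."""
        a, b = sentences(old), sentences(new)
        for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, a, b).get_opcodes():
            yield tag, a[i1:i2], b[j1:j2]

    # Two edits to different sentences of one paragraph no longer conflict:
    old = "Foo is a metal. It melts at low heat. It is rare."
    new = "Foo is a light metal. It melts at low heat. It is rare."
    for tag, a, b in sentence_diff(old, new):
        print(tag, a, b)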

Voting