Shortcut: WD:PATROL
Wikidata:Patrol
This page is a work in progress, not an article or policy, and may be incomplete and/or unreliable.
Please offer suggestions on the talk page.
Patrolling is the term used for individual users reviewing edits to check for inappropriate changes. It is performed in order to quickly undo undesirable edits, such as spam and vandalism.
After reviewing a change, people can mark changes as having been "patrolled", which allows other people to skip that change and review another change. This allows people to coordinate their patrolling activity, such that all edits get checked at least once, but with less wasted effort (multiple people checking the same edit). Patrolling is usually available to all (auto-)confirmed users.
Edits of (auto-)confirmed users, including rollbackers and administrators, are automatically marked as patrolled. In the past, and on some other wikis, autopatrollers and patrollers are separate user groups; on the English Wikipedia, these groups apply only to new pages.
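For users who have the patrol right, marking an edit as patrolled is also available through the MediaWiki API (action=patrol). Below is a minimal sketch in Python using the requests library; the revision ID is a placeholder, and an already logged-in session with patrol rights is assumed.

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # assumes you are already logged in and have patrol rights

# Fetch a patrol token, then mark a single reviewed revision as patrolled.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "type": "patrol", "format": "json",
}).json()["query"]["tokens"]["patroltoken"]

result = session.post(API, data={
    "action": "patrol",
    "revid": 1234567,   # placeholder: revision ID of the edit you just reviewed
    "token": token,
    "format": "json",
}).json()
print(result)
```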
What to do
Patrolling primarily consists of a four-step process:
- Identify "bad" or "needy" edits
- Patrollers efficiently identify problematic edits. A bad edit is an edit that for one reason or another may need to be entirely removed. A needy edit requires maintenance or improvement in some manner.
- Mark, remove or improve the edit
- Needy edits should be improved immediately. Bad edits should be removed. All other edits should be marked as patrolled.
- Warn the editor
- In cases of deliberate vandalism or an evident lack of knowledge of Wikidata procedure, offending editors should be warned on their talk pages. While this is an optional step, it should be a regular part of a patroller's duties, as it minimizes conflict, educates new editors and alerts administrators to repeat offenders.
- Check the user's other contributions
- You will often find more edits with similar problems. As a patroller, you may want to fix those as well.
"Bad" edits
Look for newcomer tests, but do not bite the newcomers. Revert their experiments and leave a message on their user talk page.
Look out for vandalism, and revert it. It is often worthwhile to check the page history after reverting to make sure you have removed all the vandalism. Also, check the user contributions of the vandal – you will often find more malicious edits. If the vandal will not stop, list them at Wikidata:Administrators' noticeboard. Ensure that the user has been warned thoroughly before posting a notice on Wikidata:Administrators' noticeboard, and has had time to read the warnings and still ignores them. If a user has not been sufficiently warned, or has only vandalized a couple of times, an administrator may archive the notice without action.
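If you prefer to skim a suspected vandal's other edits with a script, the API's list=usercontribs module returns them; a rough sketch, with a placeholder username:

```python
import requests

API = "https://www.wikidata.org/w/api.php"

# List the most recent contributions of a suspected vandal (username is a placeholder).
resp = requests.get(API, params={
    "action": "query",
    "list": "usercontribs",
    "ucuser": "ExampleVandal",
    "uclimit": 50,
    "ucprop": "ids|title|timestamp|comment",
    "format": "json",
}).json()

for contrib in resp["query"]["usercontribs"]:
    print(contrib["timestamp"], contrib["title"], contrib["comment"])
```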
Look for non-notable items. If an item does not meet the notability policy, report it to Wikidata:Requests for deletions. Check the editing history: sometimes non-notable items are left over as a result of an incomplete merge; in this case, complete the merge.
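If you want to complete such a merge with a script rather than through the merge gadget, the Wikibase API provides action=wbmergeitems; a hedged sketch with placeholder item IDs, assuming an authenticated session with the necessary rights:

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # assumes an authenticated session with edit rights

# Fetch a CSRF token, then merge the leftover duplicate into the item to keep.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

result = session.post(API, data={
    "action": "wbmergeitems",
    "fromid": "Q99999999",   # placeholder: duplicate left over from an incomplete merge
    "toid": "Q11111111",     # placeholder: the item that should be kept
    "summary": "Completing merge of duplicate item",
    "token": token,
    "format": "json",
}).json()
print(result)
```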
Monitoring
The old-school way is to load recent changes, hide patrolled edits and check the (diff) links. In the diff view, users see, depending on their rights, links to undo, revert and patrol the edit.
The gadget Mark as patrolled adds the patrol link directly to Special:RecentChanges, making the click on (diff) unnecessary in obvious cases.
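The same unpatrolled feed can also be queried directly through the API with list=recentchanges and rcshow=!patrolled (this filter itself requires patrol rights); a minimal sketch:

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # filtering on the patrolled flag requires patrol rights

# List recent changes that nobody has marked as patrolled yet, excluding bot edits.
resp = session.get(API, params={
    "action": "query",
    "list": "recentchanges",
    "rcshow": "!patrolled|!bot",
    "rcprop": "title|ids|user|comment|timestamp",
    "rclimit": 25,
    "format": "json",
}).json()

for change in resp["query"]["recentchanges"]:
    diff = f"https://www.wikidata.org/w/index.php?diff={change['revid']}&oldid={change['old_revid']}"
    print(change["title"], change["user"], diff)
```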
Monitoring changed terms
Many users contribute helpful labels, descriptions and aliases in various languages, but a few abuse these terms to add insults, spam or intentionally false information to items. Patrolling changes in a foreign language is a challenge, but one can help by patrolling terms in one's own language.
PLtools Wikidata Recent Changes provides a view of unpatrolled recent changes filtered by language. Click login to authorize the tool to patrol and undo changes in your name. Then select the edit type "terms" and enter the code of the language you are interested in. Each line shows a changed term in the chosen language. Check whether the new term is appropriate for the item, patrol good edits, change terms that need improvement (and patrol them afterwards), and undo vandalism and test edits.
Make sure you are familiar with the guidelines for labels, descriptions and aliases. If the user added a "bad" term, check their other changes and, if necessary, revert them. For descriptions that do not follow the guidelines, you can notify the user by adding {{subst:uw-description1}} to their talk page.
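If you leave such notices regularly, the ordinary action=edit module can append the template to the user's talk page for you; a sketch with a placeholder username, assuming an authenticated session:

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # assumes an authenticated session

# Fetch a CSRF token, then append the description warning template to the user's talk page.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

result = session.post(API, data={
    "action": "edit",
    "title": "User talk:ExampleUser",   # placeholder username
    "appendtext": "\n\n{{subst:uw-description1}} ~~~~",
    "summary": "Notice about description guidelines",
    "token": token,
    "format": "json",
}).json()
print(result)
```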
Monitoring suspicious changes
Certain patterns in changes are associated with vandalism and problematic edits. Some of these patterns are recognized by software filters, and such changes are tagged:
- #possible vandalism
- Edits that were flagged by an anti-vandalism filter. Common swear words, nonsense descriptions and implausible values for claims trigger these filters.
- #possible test edit
- Edits that add very short or otherwise non-descriptive terms.
- #Emoji
- While there are legitimate reasons to include emoji in terms, they are commonly added by vandals and in newcomer tests.
- #new editor changing statement
- When the value of a statement has been changed, make sure it is correct; in addition, the sources associated with the statement must actually support the changed claim. If multiple values are found in different sources, they should usually be added as separate statements.
- #new editor removing statement
- Often statements should not be removed but instead set to deprecated rank, and sometimes editors remove correct information that should be restored.
All of these filters have false positives; check edits carefully to determine whether the change is good or bad.
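These tags can also be used as filters in the API via list=recentchanges and the rctag parameter; a short sketch (the exact tag name shown here is an assumption based on the labels above, so check Special:Tags for the names actually in use):

```python
import requests

API = "https://www.wikidata.org/w/api.php"

# List recent changes carrying a specific filter tag.
# The tag name is assumed to match the label above; see Special:Tags for exact names.
resp = requests.get(API, params={
    "action": "query",
    "list": "recentchanges",
    "rctag": "possible vandalism",
    "rcprop": "title|ids|user|comment|tags",
    "rclimit": 25,
    "format": "json",
}).json()

for change in resp["query"]["recentchanges"]:
    print(change["title"], change["user"], change["tags"])
```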