User Details
- User Since
- Oct 7 2014, 7:13 AM (527 w, 2 d)
- Availability
- Available
- IRC Nick
- siebrand
- LDAP User
- Siebrand
- MediaWiki User
- Siebrand
May 5 2024
A PR was created successfully from twn to GitLab.
May 4 2024
The project is now active on twn. Please keep the task open until the first successful exports. These should happen next Monday.
May 3 2024
I'm interested in helping this move forward.
Using the following command to run grunt:
I was able to set up a development environment with the help of https://www.mediawiki.org/wiki/Cli and some guidance from @Addshore.
Going to give this a try.
Aug 6 2020
Can I dream of a somewhat more API-based, real-time integration?
AFAIK Django supports gettext, and translatewiki.net supports gettext, so I don't really expect issues. Regular i18n concerns will always apply. See https://mediawiki.org/wiki/Localisation for some MediaWiki-specific hints; most of them are applicable to i18n of any code base.
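For illustration only, a minimal sketch (not from this task) of how strings are marked for gettext-based translation in Django, which is the message format translatewiki.net can consume. The model and field names here are made up:

```
# Hypothetical example; the Ticket model and its fields are placeholders.
from django.db import models
from django.utils.translation import gettext_lazy as _


class Ticket(models.Model):
    # The verbose_name strings are extracted into .po files by
    # "django-admin makemessages" and can then be translated.
    subject = models.CharField(_("subject"), max_length=200)

    class Meta:
        verbose_name = _("ticket")
        verbose_name_plural = _("tickets")
```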
Jul 13 2020
Right. I was a little rusty, and the management console and account names had changed since I last used it. But I regained access, removed the DNS zone for pywikibot.org, and updated the nameservers to point to ns[012].wikimedia.org. This can take up to a day to trickle through. If DNSSEC should be configured, please let me know the digits.
@Legoktm: I'm on it. I apparently don't have access to the PC Extreme control panel of Wikimedia Nederland, where I should be able to change the SOA. I've forwarded your mail to Kirsten Jansen (k.jansen), who works at wikimedia.nl. I'll keep you updated.
Dec 29 2019
Would it be worth closing this issue as resolved, now that Brazilian Portuguese is available in Phab? For any remaining issues, one or more separate tasks can be opened with details not directly related to adding generic pt-br support.
Appears to have been done with https://github.com/Nikerabbit/vatkain/commit/805611526c15475e2bb441da969d3a6b3d9b40af
Dec 28 2019
As the registered extension author, I approve this request.
Nov 22 2019
Script to convert the data downloaded by download_T235995.sh to CSV files with 500 items each, including some cleanup of descriptions.
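The attached script itself isn't reproduced here. As a rough illustration only, a batching step like the one described might look like the following Python sketch, assuming each record has already been reduced to an (id, description) pair; the file naming and field names are assumptions, not the actual script:

```
# Hypothetical sketch: write records to CSV files of 500 rows each,
# with minimal cleanup of the description field.
import csv
import re


def write_batches(records, prefix="batch", batch_size=500):
    """records: iterable of (item_id, description) tuples."""
    batch, index = [], 0
    for item_id, description in records:
        # Basic cleanup: collapse runs of whitespace and trim the ends.
        description = re.sub(r"\s+", " ", description).strip()
        batch.append((item_id, description))
        if len(batch) == batch_size:
            _flush(batch, f"{prefix}_{index:04d}.csv")
            batch, index = [], index + 1
    if batch:
        _flush(batch, f"{prefix}_{index:04d}.csv")


def _flush(rows, filename):
    with open(filename, "w", newline="", encoding="utf-8") as handle:
        writer = csv.writer(handle)
        writer.writerow(["id", "description"])
        writer.writerows(rows)
```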
Working on this. The first step is to download the data. Attached is a script that scrapes a generic OAI-PMH endpoint and saves every batch in a uniquely named file. The basis of the script comes from https://wiki.lyrasis.org/display/DSPACE/OAI+XML+cache+warmup by Ivan Masar, with improvements by me.
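The attachment is not included in this export. Purely as an illustration of the approach (and not the actual script, which was shell-based), a generic OAI-PMH harvest loop with a placeholder endpoint could look like:

```
# Hypothetical sketch: fetch ListRecords batches from an OAI-PMH endpoint and
# save each response to a uniquely named file, following resumptionTokens.
# ENDPOINT and PREFIX are placeholders.
import re
import requests

ENDPOINT = "https://example.org/oai"
PREFIX = "oai_dc"


def harvest(endpoint=ENDPOINT, metadata_prefix=PREFIX):
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    batch = 0
    while True:
        response = requests.get(endpoint, params=params, timeout=60)
        response.raise_for_status()
        # Save each batch in its own file.
        with open(f"oai_batch_{batch:05d}.xml", "w", encoding="utf-8") as out:
            out.write(response.text)
        batch += 1
        # Continue with the resumptionToken until the endpoint stops issuing one.
        match = re.search(
            r"<resumptionToken[^>]*>([^<]+)</resumptionToken>", response.text
        )
        if not match:
            break
        params = {"verb": "ListRecords", "resumptionToken": match.group(1)}


if __name__ == "__main__":
    harvest()
```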
May 21 2019
I've retracted the patch to the language data on GitHub. There is too much debate about the autonyms, and this is not the right place for it. All autonyms proposed here should first be fully sourced and approved by @Nikerabbit or @Amire80 before I create a patch.
May 19 2019
Just a reminder: I discussed the missing autonyms with @Nikerabbit, and we agreed that we shouldn't add incomplete language data. Please ensure there are verifiable/sourced autonyms for all language codes you would like added. At the moment, they are missing for rmc, rml, rmn, rmo.
May 18 2019
Submitted https://github.com/wikimedia/language-data/pull/53 containing source data that will be used in ULS.
Submitted https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/cldr/+/511048/ to complete adding the language codes mentioned in this ticket to the CLDR extension.