T21986 [DO NOT USE] Wikis waiting to be renamed (tracking) [superseded by #Wiki-Setup (Rename)]
"3. Change the "traditional" MediaWiki interwiki prefix (not so important because Wikidata has made that mostly obsolete)"
That's wrong: we very frequently need interwiki links within articles, in many discussions, and in translation requests, when we cannot link to an article on the local wiki.
It is not just for linking a local article to other versions (not really translations) of the same topic. Note that many articles may be split into subtopics in one wiki but not in another, so interwiki links from Wikidata will not match exactly (and there's not always an exact matching anchor in the target article in another language). Wikidata only works for 1:1 relations between topics, not for 1:N or N:1. The organization of articles does not follow the same schemes between wikis that have different conventions, even if there's a common base of topics that are easily identifiable by a name.
Project pages on each wiki rarely connect well across wikis because each wikiproject has different teams working on them. Still we need to cross-reference "similar" or "related" project pages across wikis, and cite many external links to other wikis.
As well, user pages (and their talk pages) are not found in Wikidata, but users want to link several accounts that are not always unified with SUL across wikis. We need interwiki prefixes to specify the target wiki, or to inform other people where to make contact on a specific wiki.
But in fact adding support for new interwiki prefixes (or aliasing them so that a new one is preferred) requires no big database maintenance: it's very basic configuration (simpler than changing the domain name on Wikimedia DNS servers, which also requires updating certificates for HTTPS). It can be the first thing to do.
Then there are some essential tools to check or reconfigure (e.g. the Universal Language Selector, and the BCP 47 code to use for HTML conformance, which is emitted in HTML lang="" attributes), and some modules to update (list of supported languages, directionality).
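For illustration, the conventional BCP 47 casing that such a code should carry into lang="" attributes (lowercase language, Title-case script, UPPERCASE region) can be sketched as below; this is a simplification, not the full RFC 5646 validation:

```python
def format_bcp47(tag: str) -> str:
    """Apply conventional BCP 47 casing to a language tag.

    Simplified sketch: lowercase the primary language subtag,
    Title-case 4-letter script subtags, UPPERCASE 2-letter region
    subtags. Grandfathered tags and extensions need more care.
    """
    parts = tag.replace("_", "-").split("-")
    out = [parts[0].lower()]
    for p in parts[1:]:
        if len(p) == 4 and p.isalpha():
            out.append(p.title())   # script subtag, e.g. "Latn"
        elif len(p) == 2 and p.isalpha():
            out.append(p.upper())   # region subtag, e.g. "CH"
        else:
            out.append(p.lower())
    return "-".join(out)
```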
For MediaWiki itself, there's the need to alias its linguistic resources: MediaWiki translators on translatewiki.net must be informed, existing translations for Wikimedia must be temporarily blocked, then renamed or copied there under the new code, and existing translators should still be able to translate for the new code.
There are several PHP config files to update, and all other wikis should be informed with a technical notice. On each wiki there will be work to check templates or Lua modules; as the transition will take time (months, but most probably years), this requires keeping aliases for the older code so that they continue working. Some wikis will require admins to check codes used within page names or in protected pages, so that pages can be renamed.
Renaming the internal database is largely invisible, except to those who download snapshots to create mirrors, if they expect a specific filename matching the wiki code.
In summary, the simplest things to update are:
- adding an interwiki prefix and aliasing the former one, then checking that "#language:" correctly resolves both codes.
- making sure that Wikidata will accept the new code.
- updating the certificates for HTTPS so they support two domains (the former one and the new one), and installing them.
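The certificate step above boils down to one check: every hostname (old and new) must be covered by some subjectAltName entry on the certificate. A pure-logic sketch of that check, with example hostnames and the common single-label wildcard form (a real check would read the SAN list off the deployed certificate):

```python
def san_matches(pattern: str, hostname: str) -> bool:
    """True if a subjectAltName entry covers the hostname.

    Handles the common single-label wildcard ("*.wikipedia.org"),
    which matches exactly one leading DNS label.
    """
    if pattern.startswith("*."):
        parts = hostname.split(".", 1)
        return len(parts) == 2 and parts[1].lower() == pattern[2:].lower()
    return pattern.lower() == hostname.lower()

def cert_covers(san_list: list[str], hostnames: list[str]) -> bool:
    """Every required hostname must match at least one SAN entry."""
    return all(any(san_matches(p, h) for p in san_list) for h in hostnames)
```

Note that when both domains are subdomains of the same zone, an existing wildcard certificate may already cover them, and only the DNS and proxy steps remain.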
Only then can we:
- add support for the new subdomain and alias (CNAME) the former one.
- update the map of virtual hosts on the web servers, and check the security issues which may exist when the same site is available under two equivalent domain names simultaneously.
- reconfigure the front proxies, and make sure they correctly resolve domain names to an appropriate web server and that its virtual hosts will also accept connections with the new domain. These two steps may require putting a domain offline to run a connectivity test, making sure that the new certificates are also delivered.
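As a toy model of the invariant those two steps must establish (the hostnames and backend name below are invented for illustration): before either domain is exposed, both the old and the new hostname must be present in the proxy's host map and routed to the same virtual host.

```python
# Hypothetical front-proxy host map; names are examples only.
VHOSTS = {
    "als.wikipedia.org": "backend-gswwiki",
    "gsw.wikipedia.org": "backend-gswwiki",
}

def same_backend(old: str, new: str, vhosts: dict = VHOSTS) -> bool:
    """Both domains must be known to the proxy and share one backend."""
    return old in vhosts and new in vhosts and vhosts[old] == vhosts[new]
```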
Then update Wikidata with a bot (Wikidata should already contain only BCP 47 codes, not old non-conforming codes, but it uses the non-conforming codes explicitly for its lists of Wikipedia or Wiktionary pages).
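What such a bot has to do to each item is a mechanical remapping of sitelink site ids. A sketch of just that transformation, on a dict shaped like the sitelinks structure Wikidata's API returns ("alswiki"/"gswwiki" and the title are example values; a real bot would read and write through the API, e.g. with pywikibot):

```python
def remap_sitelinks(sitelinks: dict, old_id: str, new_id: str) -> dict:
    """Return a copy of an item's sitelinks with one site id renamed.

    `sitelinks` mimics the API shape, e.g.
    {"alswiki": {"site": "alswiki", "title": "Brot"}}.
    Links to other wikis are passed through unchanged.
    """
    out = {}
    for site, link in sitelinks.items():
        if site == old_id:
            out[new_id] = {**link, "site": new_id}
        else:
            out[site] = link
    return out
```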
Ideally, before aliasing the old code, the new code should first be created as an alias, and the aliasing reversed later: this will help with updating and testing various utilities, templates and modules when they generate links. Some bots may incorrectly revert changes in pages/templates/modules by detecting errors if we don't use the canonical names: if we first create the new domain and alias the old one, these bots may start doing massive jobs, interfering with the other tests to perform first. It will be easier to detect these bots, block them, and inform their maintainers that the new code is valid and still being tested. When we reverse the aliasing, there could be tons of reports in logs about usage of non-canonical codes: some tuning may be necessary to filter these false alarms. Cleanup of wikis is the last thing to do, and should first start manually.