Wikidata:Contact the development team/Archive/2020/09

This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.

Fix typo request

I noticed that on Dagbani (Q32238), in the language column, it is written “dagbanli” instead of “Dagbanli”. Is it possible to change this? Thanks in advance!!! --151.49.93.113 03:34, 29 August 2020 (UTC)

Hi 151.49.93.113. We're currently looking into it. -Mohammed Sadat (WMDE) (talk) 11:18, 2 September 2020 (UTC)

Connecting newly created articles to existing objects or creating new objects - an additional step when creating articles, categories, etc.

Hello, the problem of connecting newly created articles to existing objects, or of creating new objects for unconnected pages (when, by whom, how to avoid duplicates among the currently 93 million objects, ...) for hundreds of newly created articles per day across the different language versions, has been discussed again and again for years without a solution. @ArthurPSmith: suggested a possible solution at Wikidata:Requests for permissions/Bot/RegularBot 2: an additional step after saving a newly created article etc. that presents the user with a list of possibly matching Wikidata objects (e.g. a list of persons with the same name; this could use a similar algorithm to the duplicate check / suggestion list in PetScan, duplicity example), or the option to create a new object if none matches. From my point of view, one current problem is that many creators of articles, categories, navigational items, templates, disambiguations, lists, commonscats, etc. are either not aware of the existence of Wikidata, or forget to connect a newly created article etc. to an already existing object or to create a new one if none exists yet (which leads to a lot of duplicates that have to be merged manually if this connection or creation is done by a bot instead of by hand). Thanks a lot! --M2k~dewiki (talk)
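For illustration, a rough sketch of what such a matching step could build on: plain Python against the public wbsearchentities API. This is not the PetScan duplicate check itself, just the same idea; the function name candidate_items, the User-Agent string and the example title are made up for this sketch.

# Rough sketch, assuming Python 3 with the requests library.
import requests

def candidate_items(title, language="de", limit=10):
    """Ask the Wikidata search API for items whose label or alias matches a title."""
    response = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbsearchentities",
            "search": title,
            "language": language,
            "type": "item",
            "limit": limit,
            "format": "json",
        },
        headers={"User-Agent": "matching-sketch/0.1 (example)"},
    )
    response.raise_for_status()
    return [(hit["id"], hit.get("label", ""), hit.get("description", ""))
            for hit in response.json().get("search", [])]

# The creator of a new article "Johann Müller" could be shown these candidates
# before a new, possibly duplicate item is created.
for qid, label, description in candidate_items("Johann Müller"):
    print(qid, label, description)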

Also, it would be helpful to have one page per WD WikiProject that is automatically filled with items from that field which have neither P31 nor P279 statements (using WP categories for the mapping where a sitelink exists). --SCIdude (talk) 14:42, 31 August 2020 (UTC)
This would be a job for a bot operated by the community. I actually do have something similar in use for a single project. While I would not want to operate such a bot by myself, I would be willing to share my code if someone is interested (Python/pywikibot). —MisterSynergy (talk) 17:26, 31 August 2020 (UTC)
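For what it's worth, a minimal sketch of the kind of query such a bot could start from (Python with the requests library against the public query service; the WikiProject/category mapping mentioned above is left out here):

# Minimal sketch, assuming Python 3 with the requests library: items that have
# a German Wikipedia sitelink but neither P31 nor P279 statements.
import requests

QUERY = """
SELECT ?item ?article WHERE {
  ?article schema:about ?item ;
           schema:isPartOf <https://de.wikipedia.org/> .
  FILTER NOT EXISTS { ?item wdt:P31 [] }
  FILTER NOT EXISTS { ?item wdt:P279 [] }
}
LIMIT 100
"""

response = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "p31-gap-sketch/0.1 (example)"},
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["item"]["value"], row["article"]["value"])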
Also, User:Pi bot operated by @Mike Peel: does a great job of connecting articles about people to existing objects, or creating new ones where none exist (currently only for the English Wikipedia; until June 2020 it also did this for the German Wikipedia for about a year - thanks a lot to Mike! In my opinion this should be reactivated for the German Wikipedia as well). Of course, the algorithm could be improved, for example by also considering various IDs (like GND, VIAF, LCCN, IMDb, ...). The algorithm is described here: User_talk:Mike_Peel/Archive_2#Matching_existing_wikidata_objects_with_unconnected_articles. I have also created these two pages for the German-language Wikipedia: User:M2k~dewiki/Tools/Create Objects and User:M2k~dewiki/Tools/Enrich Objects. The first one might help to find unconnected articles, categories, templates, ... from de-WP and connect them to existing objects or create new Wikidata objects. The second one might help to enrich existing objects with IDs (using HarvestTemplates for GND, VIAF, LCCN/LCAuth, IMDb, ...) or other properties (e.g. using PetScan based on categories). Parts of the functionality of these two pages could be implemented in specialized bots. --M2k~dewiki (talk) 17:58, 31 August 2020 (UTC)
@M2k~dewiki: It could even be a step earlier - in at least some language wikis, when you do a search and there is no match, in addition to the red link to create a new article it lists matching Wikidata items if any - see here for example in the Indonesian Wikipedia. Could there be some way to encourage selecting one of those matching items as part of the create-a-new-article process, rather than just clicking on the red link? It would obviously require a significant UI change there, but then you have that link right at the start of creation, and could possibly pull in info (like IDs) from the related Wikidata item immediately as part of writing the article, if that was wanted. ArthurPSmith (talk) 14:44, 31 August 2020 (UTC)
Also, if someone uses the "translation function" to create a translated article in another language version, the new translated article could be connected automatically to the object of the original article. At the moment, after a version import (following a translation), the link to the Wikidata object often gets lost and the article has to be reconnected manually a second time. --M2k~dewiki (talk) 14:50, 31 August 2020 (UTC)
@M2k~dewiki, MisterSynergy, ArthurPSmith: The only way this is going to work in the long term is if Wikipedia editors *want* to connect the articles to Wikidata; otherwise you're just asking them to take extra steps for reasons they don't understand, and they won't want to do that extra work. I think the way to do that is to embed Wikidata in articles, and to do so in a useful way. We're getting there, but we're not there yet. Thanks. Mike Peel (talk) 21:30, 3 September 2020 (UTC)
@Mike Peel: Also see Wikidata:Client editing prototype / Wikidata Bridge; regarding embedding data in the german wikipedia also see de:Wikipedia:Umfragen/Normdaten aus Wikidata (de:Wikipedia Diskussion:Umfragen/Normdaten aus Wikidata) and de:Wikipedia:Meinungsbilder/Nutzung von Daten aus Wikidata im ANR. --M2k~dewiki (talk)

Disruptive edits by the iOS app

I wasn't aware that the iOS app produces significant amounts of edits. An IP seems to be making a lot of bad description edits: a user who doesn't understand Wikidata's policies is changing Wikidata's descriptions. There seem to be multiple issues here.

1) The agreement with Enwiki is that the app should use Enwiki short descriptions and not Wikidata descriptions. It's unclear why the iOS app accesses English Wikidata descriptions. If the iOS app doesn't follow the agreement that set the policy for the Android app, that has the potential to create conflict.

2) Apps shouldn't make anonymous edits to Wikidata.

3) When it comes to the Android app we talked 1-2 months ago about making it clear to the user that Wikidata descriptions are lower-case. If the iOS app wants to edit our descriptions it should follow along.

I don't know who's responsible for the iOS app. Can you forward the message? ChristianKl20:47, 2 September 2020 (UTC)

Hi Christian, thanks for reporting these issues, I will indeed get in touch with the iOS app team so they can have a look at it. Lea Lacroix (WMDE) (talk) 09:07, 3 September 2020 (UTC)
Edit: @Johan (WMF): Can you help with this? Thanks a lot! Lea Lacroix (WMDE) (talk) 09:58, 3 September 2020 (UTC)
Hi ChristianKl. Let me see if I can give some answers. First, to explain the logic behind editing in the apps, as we see it: at the core, the apps are typically just another interface to Wikipedia in particular and the Wikimedia wikis in general, and follow the same principles. Typically, there is little difference between the mobile web and the app in how you can edit the wikis, for example. I would consider this like any other unregistered editor, be it from desktop, mobile web or an app. Is there a particular reason why the app should be treated differently from e.g. mobile web here?
What we don't do is invite people to edit (in the Android app's feed) unless they have a registered account, because we don't have the tools to handle that in a sensible way and it would overwhelm Wikidata quality control. That is handled according to a separate logic, as it encourages newcomers to make many edits, whereas the apps in general don't. I understand that what you linked to looks a lot like what you'd expect from a feed, but this is someone who has decided to go on an editing spree, as one could do from any interface.
The iOS app always prioritises the local English Wikipedia description. It does, however, show the Wikidata description when no local description exists and what is shown is editable.
I will point out to the iOS developers that the Wikidata description editing is very visible in the app, and even though iOS doesn't have the feed (the context we talked about a while ago), a hint could still be useful here. Thanks for mentioning it. /Johan (WMF) (talk) 17:58, 3 September 2020 (UTC)
See phab:T261977 for the iOS ticket. /Johan (WMF) (talk) 18:57, 3 September 2020 (UTC)
@Johan (WMF): the problem Christian was specifically referring to was the fact that the iOS app auto-capitalized descriptions against Help:Description#Capitalization. --Prahlad (tell me all about it / private venue) (Please {{ping}} me) 19:30, 4 September 2020 (UTC)
Reminds me of phab:T131013 (2016) --- Jura 05:20, 5 September 2020 (UTC)

zxx

Hello. Could you please enable the language code 'zxx' (tag name 'no linguistic content') for lexemes? I would like to use it for LaTeX words/symbols under the category zxx-x-Q5310. See discussion here.--MathTexLearner (talk) 07:35, 6 September 2020 (UTC)

Thanks for your request. I continued the discussion here as I first need to understand why we need a code that means "no linguistic content" on a project that is specifically about linguistic content. I'd like to have input from the community about it. Lea Lacroix (WMDE) (talk) 09:20, 7 September 2020 (UTC)

Conversion to external-id

For Jewish Encyclopedia ID (Russian) (P1438) there is clear consensus for conversion; the only impediment was constraint violations, which I have now cleaned up. Thank you in advance, --Epìdosis 11:33, 26 August 2020 (UTC)

Noted! Are there any other properties on the way to get a consensus, so we could run a conversion batch with 5 of them or more? Lea Lacroix (WMDE) (talk) 15:29, 26 August 2020 (UTC)
@Lea Lacroix (WMDE): I think it will be the only one for a while :( --Epìdosis 12:48, 30 August 2020 (UTC)
Alright! We'll try to make it happen as soon as possible. Lea Lacroix (WMDE) (talk) 09:55, 7 September 2020 (UTC)

June 31st

Hello, I found out by making a typo that the UI allows saving erroneous dates like 2020-02-31, 2020-06-31, etc. Shouldn't there be a check that prevents such errors? Ayack (talk) 11:59, 7 September 2020 (UTC)

As Christian mentions, the UI allows such dates because there can be a need for them (for example February 30 (Q37096) or February 31st being used on fictional tombstones). One of the principles of Wikidata is to allow exceptions and not to block them a priori. Maybe a constraint could solve the issue by informing the editor that the date they just entered is likely wrong? Lea Lacroix (WMDE) (talk) 14:15, 7 September 2020 (UTC)
It makes sense, but, yes, such a constraint could be useful. Thanks. Ayack (talk) 12:50, 8 September 2020 (UTC)
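As an aside, such a sanity check is cheap to do. A tiny sketch in Python, assuming the proleptic Gregorian calendar and assuming it would only warn rather than block (since deliberate exceptions like February 30 remain valid input); the helper name is made up for this example:

# Tiny sketch, assuming Python 3: flag dates that do not exist in the
# Gregorian calendar so the editor can be warned about a likely typo.
from datetime import date

def is_plausible_date(year, month, day):
    try:
        date(year, month, day)
        return True
    except ValueError:
        return False

print(is_plausible_date(2020, 6, 31))  # False -> probably a typo
print(is_plausible_date(2020, 2, 29))  # True  -> valid leap day

Real Wikidata dates (BCE years, reduced precision, other calendar models) would of course need more than this.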

sr-el

sr-el doesn't seem to work when added to a user page, so Serbian in Latin script is not working on items. Eurohunter (talk) 14:51, 3 September 2020 (UTC)

  Info The Serbian community members might want to rename this and sr-ec to use sr-Latn and sr-Cyrl in the future, though I've lost track of which Phabricator task this is related to. --Liuxinyu970226 (talk) 13:23, 4 September 2020 (UTC)
Hi Eurohunter, do you mean that when you add sr-el to your babel template it doesn't work? Or that it doesn't appear under your languages in the termbox? -Mohammed Sadat (WMDE) (talk) 08:08, 7 September 2020 (UTC)
@Mohammed Sadat (WMDE): It doesn't appear under the languages when you want to add a label or description to any Wikidata item. Eurohunter (talk) 16:11, 7 September 2020 (UTC)
Eurohunter, Liuxinyu970226! We're going to look into it: phab:T262269. -Mohammed Sadat (WMDE) (talk) 12:56, 8 September 2020 (UTC)
@Mohammed Sadat (WMDE): Thanks. I can add that some languages also have no Polish translations in the so-called termbox. Eurohunter (talk) 14:27, 8 September 2020 (UTC)
Can you please list those languages so that we can take a look? -Mohammed Sadat (WMDE) (talk) 07:23, 9 September 2020 (UTC)

add language code "lij-mc" for monolingual texts phab:T254968 (September 8)

Could we move ahead with the above? It has been almost three months. For more samples, see https://w.wiki/bSd .

@Amire80, Mbch331, Lea Lacroix (WMDE): --- Jura 12:30, 8 September 2020 (UTC)

Thanks for the ping. I approved it in Phab. Sorry I missed the examples earlier. --Amir E. Aharoni {{🌎🌍🌏}} talk 12:35, 8 September 2020 (UTC)
Now there's approval I can make the patches. Mbch331 (talk) 15:49, 8 September 2020 (UTC)
And the patches have been submitted. Mbch331 (talk) 17:56, 8 September 2020 (UTC)
Thanks Amir and Mbch331! We will review the patches as soon as possible. Lea Lacroix (WMDE) (talk) 07:49, 9 September 2020 (UTC)

Due to a recent issue, I would like to make the development team aware of a situation that has existed since the very first day of Wikidata: when somebody fails to create an entity (because of a label/alias or sitelink conflict, AbuseFilter, a block, etc.), a QID is lost forever. Moreover, there is no way to find the user doing this, and even if the user is found and blocked, QIDs will still be skipped whenever the user tries to create an entity. Prior to phab:T213817 the issue was more severe, as anyone could launch a DoS attack without any easy way of being detected. However, T213817 only mitigates the issue, it does not completely resolve it (it can still cause an unmanageably higher database load).--GZWDer (talk) 18:10, 9 September 2020 (UTC)

Thanks for reporting. I'll talk about it with the team and we will certainly add more details on Phabricator. Lea Lacroix (WMDE) (talk) 13:24, 10 September 2020 (UTC)

What’s your experience with reporting bugs or feature requests to the Wikidata development team?

Hello all,

Currently, various channels exist to report bugs or make feature requests related to the Wikidata software. You can, for example, leave a message here or create a ticket on our task tracking system Phabricator with the Wikidata tag. You can also ask questions on the various social channels run by the Wikidata community, like the Facebook group, Twitter, or the Telegram group.

We identified some issues with the current process from our perspective, and we would like to hear about your own experience, whether or not you already submitted a bug report or a feature request. After collecting all of this feedback, we will propose a reviewed and improved process for you to interact with the Wikidata development team.

You can find more information about the project here (current status, problems we identified, timeline). You are very welcome to give us feedback, either using this anonymous form, or answering the questions publicly on this talk page. This feedback loop will run until September 30th.

If you prefer giving feedback in person, we can also offer you a live call to talk about your experience! This call will take place on September 15th at 18:00 UTC on Jitsi.

Thanks for your attention, Lea Lacroix (WMDE) & -Mohammed Sadat (WMDE) (talk) 13:22, 10 September 2020 (UTC)

Query entities

Hello. I have seen that there are some properties that make use of SPARQL queries (Wikidata SPARQL query equivalent (P3921) and kinship equivalent in SPARQL at Wikidata (P4316)), however this seems very nonintuitive. Are there any plans to have more intuitive ways of storing and displaying queries in Wikidata?--MathTexLearner (talk) 23:16, 4 September 2020 (UTC)

Thanks for your question. Yes, there are plans to store and display Wikidata queries in new ways, although they are not concrete yet. Most of the plans are related to generating and maintaining lists from Wikidata on Wikipedia (automated list generation). Lea Lacroix (WMDE) (talk) 11:32, 14 September 2020 (UTC)
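For background, the query-equivalent statements mentioned above are ordinary string values, so they can at least be read programmatically. A small sketch, assuming Python 3 with the requests library and using the standard wbgetclaims API; the helper name sparql_equivalents and the User-Agent string are made up for this example:

# Small sketch: read the string value(s) of Wikidata SPARQL query equivalent
# (P3921) from an entity via the wbgetclaims API.
import requests

def sparql_equivalents(entity_id):
    response = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbgetclaims",
            "entity": entity_id,
            "property": "P3921",
            "format": "json",
        },
        headers={"User-Agent": "query-equivalent-sketch/0.1 (example)"},
    )
    response.raise_for_status()
    claims = response.json().get("claims", {}).get("P3921", [])
    return [claim["mainsnak"]["datavalue"]["value"]
            for claim in claims
            if claim["mainsnak"]["snaktype"] == "value"]

# Usage: sparql_equivalents("Q...") returns the stored SPARQL snippets, which
# also illustrates why a plain string property is a rather unintuitive place
# to keep queries.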

Apostrophe bug

Hello, a little bug I've just found: I searched for "François Lemaçon d'Ormoy". Since there was no result, I clicked on the link below to create it, but it gave me a wrong label: "François Lemaçon d&#39;Ormoy". Ayack (talk) 18:26, 17 September 2020 (UTC)

Thanks! I think that message is configured at MediaWiki:Search-nonefound, not directly related to Wikibase, so it’ll need to be fixed by the community if I’m not mistaken. (I’m not sure if it’s possible to fix it, but there might be some wikitext tricks I’m not aware of.) --Lucas Werkmeister (WMDE) (talk) 09:02, 18 September 2020 (UTC)
@Lucas Werkmeister (WMDE): The issue is not with the message. See [1] for example. It is with the proposed label in Special:NewItem. Ayack (talk) 09:43, 18 September 2020 (UTC)
@Ayack: I still think the issue is with the message. The message links to https://www.wikidata.org/w/index.php?title=Special:NewItem&label=toto%20d%26%2339%3BOrmoy, which produces the label toto d&#39;Ormoy; I think it should instead link to https://www.wikidata.org/w/index.php?title=Special:NewItem&label=toto%20d%27Ormoy, which sets the label to toto d'Ormoy as it should be. (CC Matěj Suchánek, who created/edited the message on wikidata.org.) --Lucas Werkmeister (WMDE) (talk) 10:10, 18 September 2020 (UTC)
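(For reference, decoding the two query strings shows the difference; a quick standalone illustration in plain Python, not part of MediaWiki itself:)

# The first encoding carries the HTML entity &#39; into the label,
# the second carries a real apostrophe.
from urllib.parse import unquote

print(unquote("toto%20d%26%2339%3BOrmoy"))  # toto d&#39;Ormoy  (wrong label)
print(unquote("toto%20d%27Ormoy"))          # toto d'Ormoy      (expected)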
I thought we had fixed it a long time ago. I'm not sure if these changes of mine were supposed to fix it: [2]. Do you, Lucas, happen to see how the message should be changed? --Matěj Suchánek (talk) 10:41, 18 September 2020 (UTC)
Very strange, on my sandbox that wikitext seems to do the right thing. Maybe the search page already escapes the search string before it’s passed into the message? --Lucas Werkmeister (WMDE) (talk) 13:18, 18 September 2020 (UTC)
Ah, yes – the text is already wikitext-escaped:
// If we have no results and have not already displayed an error message
if ( $num === 0 && !$hasSearchErrors ) {
   $out->wrapWikiMsg( "<p class=\"mw-search-nonefound\">\n$1</p>", [
      $hasOtherResults ? 'search-nonefound-thiswiki' : 'search-nonefound',
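      // wfEscapeWikiText() escapes the apostrophe as &#39;, and that escaped
      // term is what ends up in the Special:NewItem link of the message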
      wfEscapeWikiText( $term )
   ] );
}
I don’t know of a way to undo this in wikitext (though I assume you could selectively fix individual pairs using something like {{#replace:$1|&#39;|'}}); maybe we should add a second, unescaped message argument to the search-nonefound message, so that the Wikidata version of the message could use $1 when showing the search string but $2 when building the link. --Lucas Werkmeister (WMDE) (talk) 13:31, 18 September 2020 (UTC)

Deploy Citoid integration?

I noticed that on test.wikidata.org you can generate automatic references, but this is not available here. Is there any time estimate for when this will be deployed, or are there unresolved issues? Best --Pyfisch (talk) 12:10, 17 September 2020 (UTC)

@Pyfisch: Thanks for your question. Indeed, the new Citoid features are not yet ready to be deployed on Wikidata.org. Unfortunately, we cannot give you an estimate for now, as it depends on available time and resources. We will try to push this forward this year. Lea Lacroix (WMDE) (talk) 10:59, 21 September 2020 (UTC)

"suppress redirect" when moving pages not recognized as sitelink deletion by Wikidata, contrary to delete page (August 25)

Can we fix this? See Topic:Vsnby0dfotcfi23y, Wikidata:Request_a_query/Archive/2018/10#Identifying_interwiki_links_that_no_longer_exist, phab:T201371 (closed), phab: ?.. --- Jura 07:17, 25 August 2020 (UTC)

Hi Jura, is phab:T261275 a correct representation of the issue? Are we doing the right thing when "suppress redirect" is not used when moving to an unsupported namespace? Mohammed Sadat (WMDE) (talk) 09:30, 26 August 2020 (UTC)
  • Yes, thank you. I think it (phab:T261275) is different from the question it was merged into (phab:T231151). Maybe one solves the other, but I'm not entirely convinced.
What do you think @Wurgl, MisterSynergy, Mike Peel:? Jura 06:51, 29 August 2020 (UTC)

2020-09 URL datatype

Please take a look at Wikidata:Property proposal/Wikimedia Chat channel. Visite fortuitement prolongée (talk) 20:24, 21 September 2020 (UTC)

Display bug

At university (Q3918), clicking on the image for the sign language video leads to an error, since it tries to fetch a ".ogv.jpg" file that doesn't exist. I assume this may be the case with all sign language videos, or more widely. Could it be fixed? {{u|Sdkb}}talk 19:04, 20 September 2020 (UTC)

Hi @Sdkb:, thanks for reporting. I just tried (with Chromium on Ubuntu), and it seems to work for me: when I click on the picture or the name of the picture, I'm correctly redirected to the file on Commons, and when I click on "Play media", the file is played directly on the page.
What browser are you using? Could you try with a different one? Lea Lacroix (WMDE) (talk) 08:09, 21 September 2020 (UTC)
@Lea Lacroix (WMDE): I'm on the latest update of Chrome. I just tried on my mobile device, and it comes up okay, but then when I check the "desktop site" box, I get the same problem where it goes to https://www.wikidata.org/wiki/Q3918#/media/File:220px--Csc-universitat-spreadthesign.ogv.jpg. {{u|Sdkb}}talk 15:58, 21 September 2020 (UTC)
@Sdkb: I have two user accounts, and I was able to reproduce the error you described using one of them but not the other. I made a test edit to university (Q3918) and the video works fine now for both users, regardless of the browser I used. It probably was a caching issue, but do you mind trying again to see if it still persists for you? -Mohammed Sadat (WMDE) (talk) 16:39, 22 September 2020 (UTC)
@Mohammed Sadat (WMDE): Yep, works for me. {{u|Sdkb}}talk 16:48, 22 September 2020 (UTC)

Watch list on mobile device

Hello, has anyone noticed that Special:Watchlist, viewed on a mobile screen, shows items only by their Q-ID and not by their label? This makes the page user-unfriendly on smartphones. Bouzinac💬✒️💛 09:16, 23 September 2020 (UTC)

I guess this must be because the mobile watchlist is a separate special page which doesn’t run any of the hooks that Wikibase uses to format entity links using labels :/ it looks like T198807 and T233845 are two other issues caused by this technical split, and the epic T109277 would presumably resolve it, but feel free to create a dedicated task for this too. --Lucas Werkmeister (WMDE) (talk) 12:08, 23 September 2020 (UTC)
OK, phabricator >> T263633 Bouzinac💬✒️💛 12:43, 23 September 2020 (UTC)

flavor=dump duplication (August 12)

The other day I read that this is used for query server updates. Sample: https://www.wikidata.org/wiki/Special:EntityData/Q702232.ttl?flavor=dump

It includes some duplication not present on the query server: rdfs:label is repeated as skos:prefLabel and schema:name.

Maybe that part should be skipped. --- Jura 07:09, 12 August 2020 (UTC)
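For illustration, a minimal sketch (assuming Python 3 with the rdflib library installed) that fetches the flavor=dump output linked above and counts how often each label literal of the item appears across rdfs:label, skos:prefLabel and schema:name:

# Every label shown with a count of 3 is emitted once per predicate,
# i.e. the duplication described above.
from collections import Counter

import rdflib
from rdflib.namespace import RDFS, SKOS

SCHEMA = rdflib.Namespace("http://schema.org/")
ITEM = rdflib.URIRef("http://www.wikidata.org/entity/Q702232")

graph = rdflib.Graph()
graph.parse(
    "https://www.wikidata.org/wiki/Special:EntityData/Q702232.ttl?flavor=dump",
    format="turtle",
)

counts = Counter()
for predicate in (RDFS.label, SKOS.prefLabel, SCHEMA.name):
    for literal in graph.objects(ITEM, predicate):
        counts[(str(literal), literal.language)] += 1

for (label, lang), n in counts.most_common(10):
    print(n, lang, label)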

Updated the above link to "ttl" (from rdf). --- Jura 15:10, 17 August 2020 (UTC)

@Lydia Pintscher (WMDE): what do you think? --- Jura 09:08, 19 August 2020 (UTC)

This format is used in other places than the Query Service, therefore changing it would be a breaking change. We don't see a strong reason to touch anything at this point. Lea Lacroix (WMDE) (talk) 10:24, 19 August 2020 (UTC)
There are two ways of solving this redundancy: either use a different function to update the query server, or remove the duplication from this one.
Given that it's triggered by every edit, even a single additional element is likely to have a large impact. @Lydia Pintscher (WMDE): fyi --- Jura 10:38, 19 August 2020 (UTC)
@Jura1: The dump is processed via a munge step, which removes skos:prefLabel and schema:name.--GZWDer (talk) 02:26, 25 August 2020 (UTC)
I doubt this applies to the query server. In any case, even a single bit that is added and later removed is a waste if it is done for every single edit. --- Jura 07:07, 25 August 2020 (UTC)
This is part of the documented differences between the query service and the Wikibase RDF model. As for why this data is duplicated in the RDF output, I have no idea why that decision was made (the closest I found is phab:T95316), but I suppose it was for better discoverability and/or compatibility with existing RDF consumers that predated this RDF model. But since this RDF output is generated rather than stored, it is not really a waste on the servers (except for the storage required for the RDF dump files themselves). DCausse (WMF) (talk) 15:10, 28 September 2020 (UTC)
@DCausse (WMF): The question is primarily whether it's a waste when that function is used for the updates after every edit. For more general dumps, presumably people want to make their own selection. --- Jura 09:32, 30 September 2020 (UTC)
The query service will fetch the data from the URL you pasted, but the duplicated labels will be removed right before inserting the data into the graph database. This process (called munge) is very fast and should be negligible; in other words, if Wikidata stopped duplicating these labels, I doubt we would notice the difference on the query service side. Semi-related: there have been some discussions about doing this munge process in Wikidata itself by adding a new dedicated flavor for the query service, which is, I think, close to what you suggest, as it would lead to the same outcome (no more duplicated labels in the RDF output used by the query service). DCausse (WMF) (talk) 10:10, 30 September 2020 (UTC)

Update bug with ListeriaBot

Wikidata:WikiProject sum of all paintings/Top creators by number of collections urgently needs to be updated, because many new items have been created since the last time that was done (11 August 2020). But ListeriaBot simply won't do it! No matter how many times I launch the manual update procedure, the bot gets stuck after a few seconds. I've reported it on its talk page and also left a message for its creator, Magnus Manske, but so far to no avail. Can somebody fix that? Thanks! --Edelseider (talk) 11:44, 21 September 2020 (UTC)

Same problem with Wikidata:Lists/corona_virus_deaths#Detailed_list. --Jklamo (talk) 12:43, 21 September 2020 (UTC)
I just left a second message on Magnus Manske's talk page. He is around and active, let us hope that he will react sooner rather than never. --Edelseider (talk) 12:48, 21 September 2020 (UTC)
@Magnus Manske: is just ignoring us, for whatever reason. God bless him, I hope somebody else will help us out. --Edelseider (talk) 15:15, 21 September 2020 (UTC)
  • I think the plan is or was to replace Listeria with a WMDE tool. Accordingly, Magnus probably doesn't want to spend more time maintaining it. Maybe WMDE can give you an update on the replacement. --- Jura 19:02, 23 September 2020 (UTC)
We do not have a plan to replace Listeria with a WMDE tool. We have a similar concept, automated list generation, that may partially overlap with Listeria, although it will probably be focused on generating lists for the content of Wikipedia articles, while Listeria is mostly used for maintenance lists at the moment. Anyway, this project has not been started yet. I cannot speak for Magnus, but I'm pretty certain he would not give up on maintaining one of his most used tools without announcing it first. Let's give him a bit more time to reply and evaluate the issue. Lea Lacroix (WMDE) (talk) 09:59, 25 September 2020 (UTC)