Wikidata:Bot requests/Archive/2017/2

This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.

Adding French census data to Wikidata

Hi, (Info to @Zolo:, @VIGNERON:, @Oliv0:, @Snipre:)
I am looking for a contributor to load the French census data onto Wikidata with a bot, from Excel tables that I would provide. These tables contain population data defined by the following properties: INSEE municipality code (P374), population (P1082) and uncertainty, but also qualifiers characterizing these data: point in time (P585), determination method (P459) and criterion used (P1013), plus two sources (that of the data themselves (INSEE) and that of the census calendar).

In France, the census has indeed been based (since 2006) on an annual collection of information, successively covering all communal territories over a period of five years. Municipalities with fewer than 10,000 inhabitants carry out a census survey covering the entire population, with one in five of these communes surveyed each year. Municipalities with 10,000 inhabitants or more carry out a sample survey of addresses representing 8% of their dwellings each year. Each year there are therefore three types of population values:

  • real populations (Q39825)
  • populations estimated by interpolation (Q187631) or extrapolation (Q744069)
  • populations estimated by sampling (Q3490295)

It is therefore necessary to load these qualifiers into Wikidata in order to use the data correctly; loading only the population figures would be insufficient.

There will be one set of data per year since 2006 (i.e. 2006, 2007, 2008, etc., up to 2013). The data for 2007 are here: https://www.dropbox.com/s/i6vs5ug64ls4upt/WD%20Communes-Populations%202007.xls?dl=0
Is there a volunteer? Roland45 (talk) 13:04, 2 December 2016 (UTC)
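(For illustration: a sketch of how the loaded data could be queried afterwards. It assumes the property usage described in this request; the INSEE code "45234" is only an example value.)

SELECT ?commune ?population ?date ?method WHERE {
  ?commune wdt:P374 "45234" .                 # INSEE municipality code (example value)
  ?commune p:P1082 ?statement .               # full population statements, with qualifiers
  ?statement ps:P1082 ?population .
  ?statement pq:P585 ?date .                  # point in time
  OPTIONAL { ?statement pq:P459 ?method . }   # determination method, when present
} ORDER BY ?date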

Are the data under a CC0 license? --ValterVB (talk) 13:16, 2 December 2016 (UTC)
It is published by INSEE under "Open Licence": see former discussion, also here and here. Oliv0 (talk) 13:22, 2 December 2016 (UTC)
You mean this Open licence? I read on the wiki page that "Information released under the Open License may be re-used with attribution, such as a URL or other identification of the producer". In my opinion it isn't compatible with CC0 and personally I never add data with a license different from CC0. --ValterVB (talk) 13:36, 2 December 2016 (UTC)
@Oliv0, Roland45: Just read this to understand that the "Open Licence" is compatible with CC-BY and not with CC0. The author of the "Open Licence" itself defines this compatibility, so it is not possible to import the full dataset from INSEE in an automatic way. Snipre (talk) 17:04, 2 December 2016 (UTC)
Please read the links I gave: compatibility is clear, no need to discuss and hinder upload again. Oliv0 (talk) 17:08, 2 December 2016 (UTC)
@Oliv0: None of the links you gave is an official answer or comment from the author of the Open Licence, so unless one of the contributors you mentioned in your links takes responsibility for his comments and is ready to go to court to defend his position, this is just words in the wind. The link I provided states clearly that the author of the Open Licence, Etalab, a French commission, defines its licence as compatible with CC-BY. The text is not confusing on this point: "According to the Etalab mission, the Licence ouverte / Open licence 'fits into an international context, being compatible with the standards of the Open Data licences developed abroad, notably that of the British government (Open Government Licence) as well as the other international standards (Open Database Commons-BY, Creative Commons-BY 2.0)'." No mention of the US government licence or of CC0. So please provide an official comment if you want to keep defending your position. Snipre (talk) 17:18, 2 December 2016 (UTC)
Please read the arguments in the links I gave, "si tu veux" (if you want) to see unnecessary problems: you will see that the unanimous analysis made by the contributors was indeed what I said, and that possible compatibility with CC-BY is quite a different topic. Oliv0 (talk) 17:22, 2 December 2016 (UTC)
What is the authority of your commenters? People who mostly contribute under pseudonyms do not weigh much against an official comment (see the description of the open licence by Etalab on its own blog here, where once again a very clear link between the open licence and CC-BY is mentioned, and nothing concerning CC0). After that, one can play on the question of the completeness of the data, of creative choice or not, ... but that is clearly playing at the limits of the licences. In the end, it is up to whoever imports the data to take responsibility. Snipre (talk) 17:53, 2 December 2016 (UTC)
@Snipre: You mention: "it is not possible to import in an automatic way the full dataset from INSEE." But that is not the case here. The Excel table that I propose is a reconstituted table that you will not find anywhere on the INSEE website. Each data point has its own URL (such as this one). If I loaded only one data point (one line of the table), would it not be eligible either? By that reasoning, most of the data in Wikidata would not be eligible, and especially all the population data which are already online. Roland45 (talk) 17:38, 2 December 2016 (UTC)
To any bot operator who is interested in importing the data from INSEE under the Open Licence: the author of the Open Licence defines in its blog (see, in French, this page) the compatibility of the Open Licence with CC-BY and does not mention CC0. So importing data under the Open Licence into WD carries a high risk of not respecting the terms of the Open Licence. Until someone provides an official comment from Etalab, the author of the Open Licence, defining the compatibility of the Open Licence with the CC0 licence, your responsibility is engaged. Snipre (talk) 17:53, 2 December 2016 (UTC)
In my respectful view, all the discussions (in French and in English) linked to this topic say the same thing: it's OK to import. So I don't understand why there are still heated debates… Tubezlob (🙋) 18:05, 2 December 2016 (UTC)
The 2013 data have apparently already been loaded (but of course without the qualifiers mentioned above) with the following entries: imported from Wikimedia project (P143) French Wikipedia (Q8447) or stated in (P248) INSEE (Q156616), without any additional precision. I do not know whether these imports were done manually or en masse, but is that better? Shouldn't they be deleted? Roland45 (talk) 18:10, 2 December 2016 (UTC)
@Roland45, ValterVB, Oliv0, Tubezlob, Snipre: "your responsibility is engaged" is true, but the same applies to every addition to Wikidata, including, but not limited to, every import from Wikimedia projects (which happens daily). And as stated in the CC0 legal code, "A Work made available under CC0 may be protected by copyright and related or neighboring rights". @Roland45: do you know QuickStatements? I can explain to you how it works so that you can do the import yourself ;) Cdlt, VIGNERON (talk) 19:06, 4 December 2016 (UTC)
@Roland45, ValterVB, Oliv0, VIGNERON, Snipre: BnF authorities (Q20666306) is under Open License (Q3238028), and a bot (KrBot) imported a lot of data about persons directly from data.bnf.fr. So it seems to be OK, no? Tubezlob (🙋) 14:17, 22 December 2016 (UTC)
@Tubezlob: it is « OK » to me. Cdlt, VIGNERON (talk) 14:29, 22 December 2016 (UTC)
@Tubezlob: Not for me. I think that the "Open licence", like the "Open Government Licence", falls under the "Creative Commons Attribution (CC-BY) licence", so it is incompatible with the "CC0 license". In cases of uncertainty like this I prefer not to intervene. --ValterVB (talk) 09:18, 23 December 2016 (UTC)
To explain better: I need a lawyer/licence expert who can confirm or deny whether attribution must also be maintained by those who reuse the Wikidata data. In the first case we can't use the data; in the second case we (probably) can. --ValterVB (talk) 09:25, 23 December 2016 (UTC)
@Tubezlob: People are doing crazy things; that doesn't prove that they are doing good things. Snipre (talk) 14:56, 28 December 2016 (UTC)
@VIGNERON: Please explain to me where, in the following line, which comes from the body that authored the open licence, you read any compatibility between the open licence and CC0: "A licence (the open licence) that fits into an international context, being compatible with the standards of the Open Data licences developed abroad, notably that of the British government (Open Government Licence) as well as the other international standards (ODC-BY, CC-BY 2.0)." This sentence comes from the official site of Etalab, which wrote the open licence for the government. In short, it speaks of CC-BY, but not of CC0. Where does one buy the special glasses needed to read CC0 on that page? Because I would take a pair.
Database rights law is more complex than the law on isolated objects, but one thing is certain: a single, isolated data point is not under copyright; on the other hand, the entirety of a dataset extracted in a systematic way falls under the European database rights directive (see the Foundation's comment on the subject here, and in particular the sentence "Extracting and using an insubstantial portion does not infringe, but the Directive also prohibits the 'repeated and systematic extraction' of 'insubstantial parts of the contents of the database'").
The moral: as long as data are extracted in an uncoordinated and non-systematic way (roughly, several contributors working independently and with small quantities of data), we stay outside the directive; but as soon as a bot is brought out, we change level and fall under the directive, which grants rights to the owner of the database. That is why the French government asked Etalab to provide a licence for the data of the French state, in order to ease the use of data that are protected by the European directive. This licence makes it possible to do away with the directive and with a formal authorization, but under the conditions of the open licence, which defines itself as CC-BY compatible. Those are the facts, coming from identifiable bodies competent in their domains: the European Union, the Foundation's legal team, and the French state via Etalab. From there on, everyone does as they please, but one cannot claim that it is correct to transfer data from the open licence to CC0, because 1) that appears nowhere in the official documents, and 2) it deliberately ignores the relation drawn by the author of the licence between the open licence and the CC-BY licence. And it is on this last point that I come back to responsibility, to this deliberate disregard of the intent behind the open licence. Snipre (talk) 14:56, 28 December 2016 (UTC)
@Snipre: you are focusing on theory and on the texts; I am speaking rather of practice and of the spirit of the texts. Since Wikidata's creation and every day since, imports have been made from sources that may not be formally compatible, starting with the automatic imports from Wikipedia, without anyone finding anything to object to. As for the legal side: regarding copyright (but copyright applies to works: is a data point a work?), attribution is always inalienable in France (and in most countries of the world), so the difference between CC0 and CC-BY is almost nonexistent legally. Moreover, the references fulfil a role that seems to me sufficient from the point of view of attribution. As for the sui generis rights specific to databases, there again the problem arises for all the data that are nevertheless imported into Wikidata daily, and in the present case it does not seem to me that we are taking a substantial part of the dataset. In short, we could quibble for a long time, but the import is under way and I see no problem; in any case, if an authorized person asks for removal, it will be easy to delete the data concerned. Cdlt, VIGNERON (talk) 18:11, 28 December 2016 (UTC)
Mention of compatibility with one licence does not exclude compatibility with another one; compatibility with CC0 has been discussed and proved many times in the links given above, no need to discuss it again. Oliv0 (talk) 06:29, 30 December 2016 (UTC)

Why is importing CC-BY data a problem? As long as you indicate the source of the statements in Wikidata, the attribution clause of the CC-BY license should be satisfied, right? − Pintoch (talk) 08:51, 20 January 2017 (UTC)

When releasing data into Wikidata a user specifies that they have the right to release the data into the public domain (CC0). Reusers of Wikidata are supposed to be able to use parts of Wikidata without carrying along all the sources. ChristianKl (talk) 20:23, 16 February 2017 (UTC)

Take care of disambiguation items

Points to cover

Somehow it should be possible to create a bot that handles disambiguation items entirely. I am not sure what all the needed functions are, but I started a list on the right side. Please add more. Eventually a Wikibase function might even do that.
--- Jura 13:36, 18 April 2016 (UTC)

Empty disambiguation: probably @Pasleim: can create User:Pasleim/Items for deletion/Disambiguation. Rules: item without sitelinks, with a P31 that has only one value: Wikimedia disambiguation page (Q4167410). For the other points my bot already does something (for my bot a disambiguation is an item whose P31 has only one value: Wikimedia disambiguation page (Q4167410)). Descriptions: I use the descriptions used in autoEdit. Labels: I add the same label for all the Latin-script languages, but only if all the sitelink titles, once the disambiguation marker is removed, are the same. With these two operations I detect a lot of duplicates: same label + description. For now the list is very long (maybe >10K items), but it isn't possible to merge them automatically: too many errors. Another thing to do is to normalize the descriptions; there are a lot of items with non-standard descriptions. --ValterVB (talk) 18:02, 18 April 2016 (UTC)
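(A sketch of the deletion-candidate rule above as a query, assuming "empty" means: no sitelinks, and a P31 whose single value is Wikimedia disambiguation page (Q4167410).)

SELECT ?item WHERE {
  ?item wdt:P31 wd:Q4167410 .                           # instance of: Wikimedia disambiguation page
  FILTER NOT EXISTS {                                   # P31 has no other value
    ?item wdt:P31 ?other . FILTER(?other != wd:Q4167410)
  }
  FILTER NOT EXISTS { ?sitelink schema:about ?item . }  # and there is no sitelink left
} LIMIT 100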
  • Personally, I'm not that much worried about duplicate disambiguation items. Mixes between content and disambiguations are much more problematic. It seems they keep appearing through problems with page moves. BTW, I added static numbers to the points.
    --- Jura 10:06, 19 April 2016 (UTC)
    You will always have duplicate disambiguation items, since svwiki has duplicate disambiguation pages. Some of these duplicates exist because they cover different topics, and some exist because the pages would otherwise become too long. A third category are the bot-generated duplicates. They should be treated as temporary, until a carbon-based user has merged them.
    And how are un-normalized descriptions a problem? -- Innocent bystander (talk) 10:58, 19 April 2016 (UTC)
About "un-normalized descriptions": ex I have a disambiguation item with label "XXXX" and description "Wikipedia disambiguation", if I create a new item with label "XXXX" and description "Wikimedia disambiguation" I don't see that already exist an disambiguation item "XXXX", if the description is "normalized" I see immediately the the disambiguation already exist so I can merge it. --ValterVB (talk) 11:10, 19 April 2016 (UTC)
For some fields, this proved quite efficient. If there are several items that can't be merged, at some point there will be something like "Wikimedia disambiguation page (2)", etc.
--- Jura 12:10, 19 April 2016 (UTC)

Lazy start for point (4): 47 links to add instance of (P31)=Wikimedia disambiguation page (Q4167410) to items without statements in categories of sitelinks on Category:Disambiguation pages (Q1982926): en, simple, da, ja, ca, nl, el, hr, sr, tr, eu, hu, no, eo, cs, lv, fi, hy, et, uk, it, mk, kk, pt, zh, sh, id, az, de, ro, sq, be_x_old, ru, be, fr, sk, pl, bs, ka, nn, ba, sv, la, sl, lt, bg, es,
--- Jura 12:07, 23 April 2016 (UTC)

The biggest problem is to define which pages are disambiguation pages, given-name pages and surname pages. For example Backman (Q183341) and Backman (Q23773321): I don't see the difference between the enwiki and fiwiki links. The enwiki page is in the category "surnames" and the fiwiki page in the categories "disambiguation pages" and "list of people by surname", but the fiwiki page only contains surnames, so basically it could be in the same item as the enwiki link. --Stryn (talk) 13:10, 23 April 2016 (UTC)

I think people at Wikidata could be tempted to make editorial decisions for Wikipedia, but I don't think it's up to Wikidata to determine what Wikipedia has to consider a disambiguation page. If a language version considers a page to be a disambiguation page, then it should go on a disambiguation item. If it's an article about a city that also lists similarly named cities, it should be on an item about that city. Even if some users at Wikidata attempted to set "capital" to a disambiguation page because Wikipedia did the same, such a solution can't be sustained. The situation for given names and family names isn't much different. In the meantime, at least it's clear which items at Wikidata have what purpose.
--- Jura 14:20, 23 April 2016 (UTC)
You then have to love Category:Surname disambiguation pages (Q19121541)! -- Innocent bystander (talk) 14:35, 23 April 2016 (UTC)
IMHO: in Wikipedia, disambiguation pages are pages that list pages (or potential pages) sharing the same spelling; no assumption should be made about the meaning. If we limit the content to partial sets based on some specific criterion, we don't have a disambiguation page but a list (e.g. a list of people with the same surname, such as list of people with the family name Williams (Q6633281)). These pages must use the __DISAMBIG__ tag so that bots and humans can recognize a disambiguation page beyond doubt. In Wikidata, disambiguation items are items that connect disambiguation pages with the same spelling. --ValterVB (talk) 20:02, 23 April 2016 (UTC)

Disambiguation item without sitelink --ValterVB (talk) 21:30, 23 April 2016 (UTC)

I'd delete all of them.
--- Jura 06:13, 24 April 2016 (UTC)

Some queries for point (7):

A better way needs to be found for (7a).
--- Jura 08:07, 25 April 2016 (UTC)
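(The numbered list itself is not reproduced in this archive, so the following is only a guess at the kind of query meant here: a check for items that mix disambiguation and ordinary content via multiple P31 values.)

SELECT ?item ?otherClass WHERE {
  ?item wdt:P31 wd:Q4167410 .        # flagged as a disambiguation item
  ?item wdt:P31 ?otherClass .        # but also carrying another P31 value
  FILTER(?otherClass != wd:Q4167410)
} LIMIT 100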

I brought up the question of the empty items at Wikidata:Project_chat#Wikidata.2C_a_stable_source_for_disambiguation_items.3F.
--- Jura 09:39, 27 April 2016 (UTC)

As this is related: Wikidata:Project chat/Archive/2016/04#Deleting descriptions. Note that other languages could be checked. --Edgars2007 (talk) 10:30, 27 April 2016 (UTC)

I don't mind debating whether we should keep or redirect empty disambiguation items (if admins want to check them first…), but I think we should avoid recycling them for anything else. --- Jura 10:34, 27 April 2016 (UTC)
As it can't be avoided entirely, I added a point 10.
--- Jura 08:32, 30 April 2016 (UTC)
Points (3) and (10) are done. For point (2) I created User:Pasleim/disambiguationmerge. --Pasleim (talk) 19:22, 2 July 2016 (UTC)
Thanks, Pasleim.
--- Jura 05:02, 11 July 2016 (UTC)
  • Matěj Suchánek made User:MatSuBot/Disambig errors which covers some of 7b.
    Some things it finds:
    • Articles that are linked from disambiguation items
    • Disambiguation items that were merged with items for concepts relevant to these articles (maybe we should check disambiguation items with more than a P31 statement, or attempt to block such merges)
    • Pages in languages where the disambiguation category isn't correctly set up or recognized by the bot (some pages even have "(disambiguation)" in the page title), e.g. Q27721 (36 sitelinks) – ig:1 (disambiguation)
    • Pages in categories close to disambiguation categories. (e.g. w:Category:Set indices on ships)
    • Redirects to non-disambiguations, e.g. Q37817 (27 sitelinks) idwiki – id:Montreuil – redirects to id:Komune di departemen Pas-de-Calais (Q243036, not a disambiguation).

Seems like an iceberg. It might be easier to check these by language and, once the various problems are identified, attempt to sort some out automatically.
--- Jura 05:02, 11 July 2016 (UTC)

Note that my bot only recognizes pages with the __DISAMBIG__ magic word as disambiguations. If you want a wiki-specific approach, I can write a new script which will work only for chosen wikis. Matěj Suchánek (talk) 09:12, 12 July 2016 (UTC)
  • Step #4 should be done for now. The above list now includes links for 160+ sites.
    --- Jura 22:02, 5 August 2016 (UTC)
  • For step #3a, there is now Phab:T141845
    --- Jura 22:30, 5 August 2016 (UTC)
List of disambiguation items with conflicts on label/description --ValterVB (talk) 13:57, 6 August 2016 (UTC)
  • Added #11.
    --- Jura 02:05, 21 September 2016 (UTC)
  • Is it appropriate to add a point 12, "Mix-n-Match should not offer disambiguation items for matching to external authority files"? --Vladimir Alexiev (talk) 11:56, 21 January 2017 (UTC)
    • Sure, the list is freely editable, but the focus is mainly on how to handle these items rather than on fixing other tools. I wonder if things like Topic:Tjgt6ynwufjm65zk aren't just the tip of an iceberg with some other root problem.
      --- Jura 12:18, 21 January 2017 (UTC)

Articles from Norwegian wiki added to wrong items

A bot of the inactive user:Emaus did some wrong edits like this one, adding the sitelink no:Neoclitopa to Neoclitopa nitidipennis (Q14869209) when it should have been added to Neoclitopa (Q18115528). I collected a bunch of them by hand, but there are too many of them. We need to identify links like that and move them to the proper items. I will try to write a query to identify them, but can someone help me with moving them? --Jarekt (talk) 13:20, 15 February 2017 (UTC)

SELECT ?item ?pItem ?taxon ?parentTaxon ?sitelink WHERE {
    ?item  wdt:P171 ?pItem .          # has parent item
    ?item  wdt:P225 ?taxon .          # taxon name
    ?item  wdt:P105 ?rank .           # taxon rank
    ?pItem wdt:P225 ?parentTaxon .    # parent taxon name
    VALUES ?rank {wd:Q7432 }          # restrict rank to species only at this moment
    ?sitelink schema:about ?item .
    FILTER(STRSTARTS(STR(?sitelink), "https://no.wikipedia.org/wiki/"))
    FILTER(STRENDS(STR(?sitelink), ENCODE_FOR_URI(?parentTaxon))) # norwegian article name matches parent taxon
    #MINUS{ ?item wdt:P225 ?parentTaxon . }
} LIMIT 100
Try it!
Here is an example of a query with some of the problem sitelinks. --Jarekt (talk) 13:45, 15 February 2017 (UTC)
Any Norwegian speakers around to verify that those are bad sitelinks? --Jarekt (talk) 13:51, 15 February 2017 (UTC)

I can try to check this in a day or two. How many errors might there be? Can they be repaired manually? (I guess we must; a bot cannot sort this out?) Dan Koehl (talk) 21:04, 17 February 2017 (UTC)

Please note Wikidata_talk:WikiProject_Taxonomy#Many_bad_sitelinks_to_Norwegian_Wikipedia. --Succu (talk) 21:07, 17 February 2017 (UTC)

Lighthouses: import P625 from Commons

There are some 300 lighthouses with coordinates at Commons: https://petscan.wmflabs.org/?psid=677138

Somehow the PetScan option doesn't work for them. It would be good if these could be imported.
--- Jura 20:50, 17 January 2017 (UTC)
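(A sketch of a query for the items concerned, assuming lighthouse (Q39715) is the relevant class: it lists lighthouse items still lacking coordinate location (P625).)

SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31/wdt:P279* wd:Q39715 .                  # instance of (a subclass of) lighthouse
  FILTER NOT EXISTS { ?item wdt:P625 ?coordinates . }  # no coordinate location yet
  SERVICE wikibase:label { bd:serviceParam wikibase:language 'en' }
} LIMIT 500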

@Jura1: Both the import and the mentioned problem need attention but the latter is not obvious to me. Matěj Suchánek (talk) 15:09, 22 January 2017 (UTC)
@Jura1: I can do it using pywikibot--Mikey641 (talk) 16:41, 22 January 2017 (UTC)
@Jura1: OK, so I'm probably going to do it tomorrow, because since this morning I have been transferring coordinates from hewiki to Wikidata; after I'm done I'll transfer from Commons --Mikey641 (talk) 18:29, 22 January 2017 (UTC)
This would be simple if their module added coordinates to the page info in categories as well (it does in files only). That's why it doesn't work in PetScan. Matěj Suchánek (talk) 13:28, 2 February 2017 (UTC)
Interesting. It might be easier to fix that then.
--- Jura 09:04, 26 February 2017 (UTC)

Cycle sport events: move claims from length (P2043) to event distance (P3157) and remove unreferenced bounds from values, if existing

I request a bot run to do the following:

  • In items which have ?item wdt:P31/wdt:P279* wd:Q13406554 (instances of subclasses of sports competition (Q13406554), so basically sports competition items), replace all length (P2043) claims by event distance (P3157) claims. Quantity amounts and units, as well as existing qualifiers and references, should be kept.
  • However, plenty of claims still have bounds as a leftover from the time when we were not able to use quantities without bounds. I therefore request removing all “±0” bounds if no reference is given in the P2043 statement (a sketch of a query for these unreferenced “±0” cases follows at the end of this request). It might be worth considering the removal of all “±[0\.]*1” bounds as well, but I am not fully sure about that (they could otherwise be repaired manually).

This bot run will affect in total around 2634 claims:

SELECT ?item ?itemLabel ?length ?upperBound ?lowerBound ?diff {
  ?item p:P2043 [ psv:P2043 ?value ] . # items that use P2043 (length)
  ?value wikibase:quantityAmount ?length .
  OPTIONAL {
    ?value wikibase:quantityUpperBound ?upperBound; wikibase:quantityLowerBound ?lowerBound .
    BIND(?upperBound - ?lowerBound AS ?diff) .
  }
  ?item wdt:P31/wdt:P279* wd:Q13406554 . # and have P31 with subclass of sport competition (Q13406554)
#  MINUS { # activate this to filter away items that are related to cycle sport
#    VALUES ?cyclingClasses { wd:Q15091377 wd:Q18131152 }
#    ?item wdt:P31/wdt:P279* ?cyclingClasses . # but not P31 with subclass of cycling race (Q15091377) or stage (Q18131152)
#  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language 'en' }
}

Try it! There is a commented part in the SPARQL query which tests the types of sports affected by this bot request (the MINUS section). In fact, in the field of sports events, length (P2043) is exclusively used by cycle sport events (defined by the types cycling race (Q15091377) and stage (Q18131152)). Our cycle sport project members were early adopters of the “quantity with units” properties, including length (P2043). I therefore already talked to the maintainers of Module:Cycling race, which heavily uses length (P2043), at Module talk:Cycling race#event distance (P3157) instead of length (P2043). They support a move to the event-specific event distance (P3157) and have already modified their module to support both properties. Via {{ExternalUse}} on Property talk:P2043 we also identified a frwikinews module which needs to be moved, but to my knowledge this is not a complicated task.

In general, event distance (P3157) has some advantages over length (P2043) for events. First of all, racing sports events are not physical objects which have a property “length” as a physical dimension. What one wants to express in these cases is the distance along a path which the event participants have or had to cover during the competition. Secondly, in sports events one often uses rather unphysical distance units such as lap (Q26484625), whose use is better reflected by the event distance property. It is therefore useful to gather all event distance information in one property.

Ping involved users @Molarus, Jérémy-Günther-Heinz Jähnick. Feel free to ping more editors, if necessary.

Thanks, —MisterSynergy (talk) 07:39, 20 February 2017 (UTC)
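(As announced in the second bullet above, a sketch of a query for the unreferenced “±0” bounds; it assumes “±0” means that upper and lower bound both equal the amount itself.)

SELECT ?item ?length WHERE {
  ?item wdt:P31/wdt:P279* wd:Q13406554 .                       # sports competition items, as above
  ?item p:P2043 ?statement .
  ?statement psv:P2043 ?value .
  ?value wikibase:quantityAmount ?length ;
         wikibase:quantityUpperBound ?upper ;
         wikibase:quantityLowerBound ?lower .
  FILTER(?upper = ?length && ?lower = ?length)                 # “±0” bounds
  FILTER NOT EXISTS { ?statement prov:wasDerivedFrom ?ref . }  # no reference given
} LIMIT 100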

I heard that @Zolo might be interested as well, due to P2043 use in this context in fr:Module:Infobox/Descriptif course cycliste. This is unfortunately not registered by the {{ExternalUse}} template on Property talk:P2043. —MisterSynergy (talk) 11:53, 20 February 2017 (UTC)
I have searched the templates of most wikis for "P2043" and found it:Modulo:Ciclismo. I have not found a cycling template that reads P2043 data via Module:Wikidata (but I did find a railway template on enWP that uses P2043). We have to edit those modules as soon as possible after moving the data to the new property. I hope the Lua modules are coded well and don't break.
Wikinews n:fr:Module:Cycling race and the wikis that use our Module:Cycling race will need a new version of this module. I can do this, except for esWiki, because I can't edit there. --Molarus 12:51, 20 February 2017 (UTC)