Wikidata:Bot requests/Archive/2015/09

Adding about 9000 instance of (P31)

To all items in this query a bot should add instance of (P31)=sports season of a sports club (Q1539532) and sport (P641)=association football (Q2736) (there are too many to do with Autolist). Thanks in advance--Dr Zimbu (talk) 12:07, 13 September 2015 (UTC)
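For reference, a minimal pywikibot sketch of what the request amounts to (the item ID is a hypothetical member of the query's result set, not part of the original request):

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def add_item_claim(item, prop, target_qid):
    # Build a statement whose value is another item and attach it.
    claim = pywikibot.Claim(repo, prop)
    claim.setTarget(pywikibot.ItemPage(repo, target_qid))
    item.addClaim(claim)

item = pywikibot.ItemPage(repo, "Q123456")  # hypothetical season item from the query
add_item_claim(item, "P31", "Q1539532")     # sports season of a sports club
add_item_claim(item, "P641", "Q2736")       # association football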

  Doing… - with Autolist :) --Stryn (talk) 14:45, 13 September 2015 (UTC)
...and   Done. --Stryn (talk) 16:02, 13 September 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 16:26, 15 September 2015 (UTC)

Interwiki bot

There are many templates on en.wikipedia (list) which still have old interwiki links inside them. Please remove them by bot; some of them haven't been added to Wikidata yet. Yamaha5 (talk) 05:01, 9 September 2015 (UTC)
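For reference, a hedged sketch of such a cleanup using pywikibot's textlib (the template title is a hypothetical example; a real run would iterate over the list above):

import pywikibot
from pywikibot import textlib

site = pywikibot.Site("en", "wikipedia")

def strip_interwikis(title):
    # Remove old-style interwiki language links, now handled by Wikidata.
    page = pywikibot.Page(site, title)
    new_text = textlib.removeLanguageLinks(page.text, site=site)
    if new_text != page.text:
        page.text = new_text
        page.save(summary="Removing old interwiki links (migrated to Wikidata)")

strip_interwikis("Template:Example")  # hypothetical template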

@Ladsgroup: ^ Multichill (talk) 17:10, 9 September 2015 (UTC)
Dear Reza, bless you. You see, wherever there is talk of an interwiki bot, my name shines. I will do the clean-up. Amir (talk) 18:01, 9 September 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 11:08, 16 September 2015 (UTC)

Moving coordinates to headquarters

In the case of company items instance of (P31)=company (Q783794) or business (Q4830453), the coordinates coordinate location (P625) relate to the headquarters headquarters location (P159), so if listed separately (WDQ (CLAIM[31:783794] OR CLAIM[31:4830453]) AND CLAIM[625] AND CLAIM[159]), they need to be moved to a qualifier of headquarters location (P159). Can any bot help me with this (approx. 200 items)? --Jklamo (talk) 20:30, 13 September 2015 (UTC)
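For reference, a minimal pywikibot sketch of the move (the simple first-claim handling and the item ID are illustrative assumptions, not the script actually used):

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def move_coordinates(item_id):
    item = pywikibot.ItemPage(repo, item_id)
    item.get()
    if "P625" not in item.claims or "P159" not in item.claims:
        return
    coord_claim = item.claims["P625"][0]
    hq_claim = item.claims["P159"][0]
    if "P625" not in hq_claim.qualifiers:  # skip if the qualifier already exists
        qualifier = pywikibot.Claim(repo, "P625", is_qualifier=True)
        qualifier.setTarget(coord_claim.getTarget())
        hq_claim.addQualifier(qualifier)
    # Drop the now-redundant standalone coordinate claim.
    item.removeClaims([coord_claim], summary="coordinates moved to P159 qualifier")

move_coordinates("Q123456")  # hypothetical company item from the WDQ result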

  Done --Pasleim (talk) 11:17, 16 September 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 11:17, 16 September 2015 (UTC)

Auto-populate all the Chinese administrative subdivision items (request withdrawn)

There are an awful lot (possibly millions) of items that have a "China administrative division code" property and value but no label or description. The names of these places are obtainable from http://www.stats.gov.cn and could be retrieved by a program. For example, Q13685144 has a code of 50 01 17 114 211. Using that code you can drill down, starting from this page, to here to get the Chinese name of the item (福林村委会) and its parent (清平镇). I don't actually have a use for this information, but the Completer/Finisher in me thinks it would be better to have items that are labelled than just left blank. --Heron (talk) 21:18, 18 September 2015 (UTC)

That item and the place it's located in both already have Chinese labels so I'm not sure what could be taken from that site that we don't have already. - Nikki (talk) 10:46, 19 September 2015 (UTC)
I'm sorry, Nikki. I've just worked out that "No label defined (Q13685144)" means no label defined in English. By changing the language to 中文(繁體) I get to see the Chinese label which, as you said, is present in that item. Please ignore my request. --Heron (talk) 19:28, 19 September 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 19:57, 19 September 2015 (UTC)

add ukwiki category by years

Add ukwiki categories by years.

e.g.:

@VaeVictis: to do this yourself with QuickStatements, you would need to generate a list like this on a spreadsheet:

Category:1333	Sukwiki	"Категорія:1333"

and fill in "First column are articles from" with "enwiki". It doesn't matter if the ukwiki category actually exists or is already defined. If some other sitelink for ukwiki is defined, it would be overwritten. --- Jura 05:41, 24 September 2015 (UTC)

Something did not work. I do not know much about this, and I don't know English very well. Can it be done automatically by bots? --VaeVictis (talk) 11:25, 25 September 2015 (UTC)
  Done --Pasleim (talk) 12:48, 25 September 2015 (UTC)
Thanks --VaeVictis (talk) 14:16, 25 September 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 21:35, 29 September 2015 (UTC)

Find and merge duplicates (specific case)

Hi. Yesterday EmausBot created several hundred items for categories for BC years, connecting ro: and bs: pages. But most of them are duplicates of other, already existing items. It is necessary to find the existing items and merge the newly created items into the old ones.

Pattern for search:

New items contain the newly created ro: and bs: category pages (see above).
Old items contain (some of) the following pages:
  • war:Kaarangay:"year" UC
  • sh:Kategorija:"year". pne.
  • vi:Thể loại:"year" TCN
  • uz:Turkum:Mil. av. "year"

--XXN, 20:55, 28 September 2015 (UTC)
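For reference, a minimal sketch of the merge step with pywikibot, assuming the duplicate pair has already been found by matching the sitelink patterns above (both the page title and the item IDs are hypothetical):

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()
warwiki = pywikibot.Site("war", "wikipedia")

new_item = pywikibot.ItemPage(repo, "Q20000001")        # hypothetical new duplicate
old_page = pywikibot.Page(warwiki, "Kaarangay:455 UC")  # hypothetical old sitelink
old_item = pywikibot.ItemPage.fromPage(old_page)        # resolves to the existing item
new_item.mergeInto(old_item, summary="merging duplicate BC year category items")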

done by User:Ivan A. Krestinin --Pasleim (talk) 22:06, 29 September 2015 (UTC)

This section was archived on a request by: --Pasleim (talk) 22:06, 29 September 2015 (UTC)

Regional Indicator Symbols

Every country that has ISO 3166-1 alpha-2 code (P297) could automatically get a Regional Indicator Symbol as Unicode character (P487). It's essentially the same sequence of characters, but each character must be replaced by the corresponding RI equivalent.

A → 🇦
B → 🇧
C → 🇨
D → 🇩
E → 🇪
F → 🇫
G → 🇬
H → 🇭
I → 🇮
J → 🇯
K → 🇰
L → 🇱
M → 🇲
N → 🇳
O → 🇴
P → 🇵
Q → 🇶
R → 🇷
S → 🇸
T → 🇹
U → 🇺
V → 🇻
W → 🇼
X → 🇽
Y → 🇾
Z → 🇿

So JM should become 🇯🇲. The bot job would be:

  1. Pick up the sequence in ISO 3166-1 alpha-2 code (P297)
  2. Convert it to a Regional Indicator sequence (see the sketch below)
  3. Save it to Unicode character (P487) if it's not already there.
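A minimal Python sketch of step 2, using the fact that the Regional Indicator block starts at U+1F1E6 (🇦):

def to_regional_indicator(iso_code):
    # Map each ASCII letter A-Z onto U+1F1E6..U+1F1FF.
    return "".join(chr(0x1F1E6 + ord(c) - ord("A")) for c in iso_code.upper())

assert to_regional_indicator("JM") == "🇯🇲"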

--Shisma (talk) 17:57, 2 September 2015 (UTC)

  Support as discussed with Shisma. VIGNERON (talk) 11:04, 3 September 2015 (UTC)
You could try to use the list from TAB and then build a list for quick_statements and upload it directly.
  Oppose
  1. violates the “single character” constraint of Unicode character (P487)
  2. I think it’s too much of a stretch to say that 🇺🇸 is “the” Unicode character (or character sequence) of United States of America (Q30). As I understand it, Unicode character (P487) is meant more for items about individual characters, like A (Q9659). —Galaktos (talk) 15:54, 11 September 2015 (UTC)
  Oppose per Galaktos --Pasleim (talk) 16:27, 15 September 2015 (UTC)


This section was archived on a request by: --- Jura 16:31, 13 October 2015 (UTC)

Date of birth for recently deceased

For items appearing on Wikidata:Database reports/Recently deceased at Wikipedia, it might be worth attempting to import some data automatically (date of birth, place of birth, gender, nationality, identifiers, occupation, BUT NOT date of death) or on-demand/subject to review (this could include date of death/place of death).

@Hsarrazin:: as you worked on them quite a lot: what do you think? --- Jura 10:19, 13 September 2015 (UTC)

the problem with auto-imported data, for now, is that it is never sourced, and it is really hard to find which language the data comes from; finding the source in the language article is often tricky.
wp projects (at least fr and de) accuse wikidata of not properly sourcing the data, but I have often seen that a lot of wp articles are poorly sourced or not sourced at all :( - retrieving the source whenever possible is very important.
personally, as I work on death dates, I systematically (or nearly so) import date/place of birth, gender, nationality, identifiers and occupation manually, using User:Magnus Manske/wikidata_useful.js. I also add given name (P735) whenever I can sort it out (Hungarian names are tricky).
bot-importing those data would be risky; an on-demand retrieving tool (like User:Magnus Manske/import statements.js, but working) or a reviewing tool (like the Primary sources tool) could be more efficient, but for most recent deaths the articles are very young, often unstructured, and the data is hard to auto-retrieve, and often harder to eye-retrieve.
it would be important to have data imported from all linked languages, when they differ, so that the differing data in different languages could be reviewed (en is the most common, but not necessarily the most accurate).
  • auto-importing the date of death for recent deaths is NOT good, as error and spam rates are quite high :)
  • info could be retrieved from the infobox (when it exists), categories or plain text (as WD Useful does).
  • data to retrieve: date of birth (P569), date of death (P570), place of birth (P19)/place of death (P20), country of citizenship (P27), occupation (P106); image (P18) would be nice too :)
  • sex or gender (P21) is often tricky in many languages. I often use the picture to determine it - maybe in a reviewing tool.
  • VIAF ID (P214) would be nice too, but I have often found those wrong or outdated on various wp. I prefer to add it manually from a personal search (it's my day job), and import other IDs from VIAF with User:Tpt/viaf.js.
  • when different data exist in different wikilinks, each value should be retrieved and at least marked as imported from Wikimedia project (P143) + the language link on each auto-imported value.
  • such a tool should also try to retrieve sourcing info (when available) from each existing language link (something like Sourcerer, which doesn't work for me).
  • and whenever possible, add the date of input (like you do on the "recent death" report).
  • an on-demand (button) tool or, better, a reviewing tool would be very useful.
such a tool (similar to Harry Potter's magic wand) that would retrieve such data could also be useful on any human (Q5) item, so I would like it very much ;) --Hsarrazin (talk) 13:42, 13 September 2015 (UTC)
To some extent "Primary Sources" (indirectly) and "Suggestor" (with lots of noise) end up doing that, but a dedicated tool would indeed help.
I suppose its overall usefulness depends on how close (or how far) we are from importing all ±formatted data from Wikipedia. --- Jura 14:04, 13 September 2015 (UTC)
I don't know Suggestor, but existing tools I know mainly work from enwiki. There is a lot of work to autofetch info from other languages, I think. :) --Hsarrazin (talk) 20:58, 13 September 2015 (UTC)
There is now https://tools.wmflabs.org/pltools/recentdeaths/ to add the date of death. --- Jura 13:37, 23 September 2015 (UTC)
This section was archived on a request by: --- Jura 12:06, 4 December 2015 (UTC)

Harvest BC Geographical Names ID (P2099)

Could someone please harvest the id= parameter from the template en:Template:BCGNIS in enwiki and put the result in BC Geographical Names ID (P2099)? --Fralambert (talk) 00:21, 24 September 2015 (UTC)

Imported with HarvestTemplates all values which are not used as a reference in the article. --Pasleim (talk) 10:45, 5 November 2015 (UTC)
This section was archived on a request by: --- Jura 12:06, 4 December 2015 (UTC)

Check links between article and category items

Analyze Wikipedia and sister project pages from the main space and categories for linking through Property:P301 and Property:P910. Categories should have w:en:Template:Catmore; pages in the main space should have special sort keys (space, *, etc.). It would also be a good idea to check whether the Commons link of the category item matches the page's Property:P373 (same, absent, etc.). A report should probably be enough, to avoid false positives. --EugeneZelenko (talk) 14:13, 6 September 2015 (UTC)

It is not clear what the question is. Are you proposing to split categories and articles that share one item into separate items? Do you know about Wikinews? --Infovarius (talk) 21:26, 6 September 2015 (UTC)
I am proposing to check whether the items for articles and categories are linked to each other. --EugeneZelenko (talk) 02:13, 7 September 2015 (UTC)
Some reports can be found among the constraint violations: Wikidata:Database reports/Constraint violations/P301#Inverse and Wikidata:Database reports/Constraint violations/P910#Inverse. I suppose that categories with Template:Catmain have already been imported into Wikidata, at least once. --Infovarius (talk) 08:43, 10 September 2015 (UTC)

Harvest elevation above sea level (P2044) from Swedish Wikipedia

Could someone harvest the value for elevation above sea level (P2044) from Swedish Wikipedia (svwiki), where it is available in articles that use the template sv:Mall:Insjöfakta Sverige, in the parameter höjd? The harvest would affect approximately 55,000 items. Ainali (talk) 22:32, 10 September 2015 (UTC)
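For reference, a hedged sketch of the extraction step with pywikibot and mwparserfromhell (the article title is a hypothetical example; writing P2044 with the proper unit is left out):

import pywikibot
import mwparserfromhell

site = pywikibot.Site("sv", "wikipedia")

def extract_hojd(title):
    # Return the raw höjd parameter of Insjöfakta Sverige, if present.
    text = pywikibot.Page(site, title).text
    for tpl in mwparserfromhell.parse(text).filter_templates():
        if tpl.name.matches("Insjöfakta Sverige") and tpl.has("höjd"):
            return str(tpl.get("höjd").value).strip()
    return None

print(extract_hojd("Storsjön"))  # hypothetical lake article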

@Ainali: Take a look at Galaktos' comment about this property at WD:PC. Do we (or at least Nasko) know which level this is relative to? -- Innocent bystander (talk) 06:36, 11 September 2015 (UTC)
I checked with SMHI, and they have different tables for height, which use different relative levels. So unless Nasko can tell us which one was used, I guess this request should be put on hold. Ainali (talk) 11:06, 11 September 2015 (UTC)
I am not sure I really care about the exact level used for the moment. Thierry Caro (talk) 11:21, 11 September 2015 (UTC)
post-glacial rebound (Q1161410) is a very big issue in parts of this set of articles, so I think this really is an important question. Q18182959 is, for example, located 0 meters above sea level according to this source. If that was calculated relative to the de facto sea level, this lake would have been flooded half of the time, but it isn't. In fact, the stream that runs from this lake to the sea has some tiny waterfalls in it, and the lake has not been flooded for maybe 200 years. -- Innocent bystander (talk) 13:40, 11 September 2015 (UTC)
All the small lakes use the geodetic system RT38 and RH70 for altitude. Larger lakes use RT90 and RH00, or SWEREF99 and RH2000. Now, what is the difference between these geodetic systems and WGS84? When it comes to altitude, two factors are important. The first is the distance from the earth's center of mass (the "radius"). The second is the flattening of the poles. The northern hemisphere is actually flatter than the southern one because of an old impact.
RT90 uses geodetic system = Bessel 1841, semi-major axis = 6377397.155 m, flattening of the poles = 1/298.1528128
WGS84 uses geodetic system = WGS 84, semi-major axis = 6378137 m, flattening of the poles = 1/298.257223563
When comparing traditional leveling and satellite-based altitude information, the altitudes may differ by many meters.[1][2][3] (in Swedish) Other factors: different seas have different sea levels (10-20 cm). The gravity of underwater mountains attracts water, and the sea rises several meters. The moon causes tides. Isostasy is also a factor, but it needs hundreds of years to shift the land several meters. And so on...[4] (in English)
Conclusion: traditional leveling and satellite-based altitude information will give different results, especially when local geodetic systems and leveling are compared to modern satellite-based surveying techniques. I would harvest the altitudes. A simple version of the Swedish national altitude database was released under a free license this summer, so it is possible to improve the data later if needed. Nasko () 17:23, 11 September 2015 (UTC)

Fix redirects from pagemoves at Wikipedia (bug T92789)

Per Wikidata:Project chat#How_to_get_rid_of_redirects_.28in_sitelinks.29, some of the redirects are due to a bug in the software. These should be fixed: the redirect in the sitelink should be replaced by the target article.

If there is a way to identify the other page moves at Wikipedia that haven't been updated at Wikidata at the same time, that could help. --- Jura 07:17, 29 September 2015 (UTC)

As a first step and following a request, here is a query that can help for enwiki: http://quarry.wmflabs.org/query/5392
It filters on pagemove edit summaries (excluding many others). Some of the data is stale and needs purging of the page at enwiki. --- Jura 10:52, 29 September 2015 (UTC)
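For reference, a minimal sketch of the fix itself with pywikibot, once the query has produced the affected items (the item ID is hypothetical):

import pywikibot

enwiki = pywikibot.Site("en", "wikipedia")
repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def fix_redirect_sitelink(item_id):
    # If the item's enwiki sitelink is a redirect, repoint it to the target.
    item = pywikibot.ItemPage(repo, item_id)
    page = pywikibot.Page(enwiki, item.getSitelink(enwiki))
    if page.isRedirectPage():
        target = page.getRedirectTarget()
        item.setSitelink(sitelink={"site": "enwiki", "title": target.title()},
                         summary="replace redirect left over from page move")

fix_redirect_sitelink("Q123456")  # hypothetical affected item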

Footballer senior career and national team

Hi. I'm working on using Wikidata data in the French footballer infobox. I've realized that a lot of "member of sports team" (P:54) information is missing, whereas the information is present in the English infobox (and the Italian and French ones, too). I've just searched the Bot requests archive and found this one, dating from April 2013 (ping @Sven Manguard, Legoktm:). It makes exactly this point: "It seems like it'd be an easy task for a bot to read the "Infobox football biography" template over on Wikipedia for the "Senior career" and "National team" fields and then use those to populate "member of sports team" (P:54). Once we get qualifiers, the bot can then copy over the years field attached to each entry as well.".

So, do you think a bot could do (finish) the job? Thanks in advance. --H4stings (talk) 09:00, 28 September 2015 (UTC)
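For reference, a hedged sketch of the reading step with pywikibot and mwparserfromhell; enwiki's "Infobox football biography" numbers its senior-career fields clubs1/years1, clubs2/years2, and so on (resolving club names to items and writing P54 claims is left out):

import pywikibot
import mwparserfromhell

site = pywikibot.Site("en", "wikipedia")

def senior_career(title):
    # Yield (years, club) pairs from a footballer article's infobox.
    text = pywikibot.Page(site, title).text
    for tpl in mwparserfromhell.parse(text).filter_templates():
        if not tpl.name.matches("Infobox football biography"):
            continue
        n = 1
        while tpl.has(f"clubs{n}"):
            club = tpl.get(f"clubs{n}").value.strip_code().strip()
            years = str(tpl.get(f"years{n}").value).strip() if tpl.has(f"years{n}") else ""
            yield years, club  # each pair would become a P54 claim plus date qualifiers
            n += 1

for years, club in senior_career("Zinedine Zidane"):  # illustrative article
    print(years, club)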

My code for this is [5], and I think I ran it at some point. I unfortunately don't have much time to pick up any new tasks :( Legoktm (talk) 22:02, 1 October 2015 (UTC)
Ok, thanks. I hope some other football fan bot runner could adopt this cause.   --H4stings (talk) 13:21, 2 October 2015 (UTC)

Update: I've created my bot and adapted some scripts, and I am doing the job (from French, then English infoboxes). But with the delay between insertions (10 to 20 seconds), it will take months. Is there anything I can do to save time? --H4stings (talk) 08:21, 11 March 2016 (UTC)

@H4stings: What's status of this? Matěj Suchánek (talk) 13:45, 16 April 2016 (UTC)
@Matěj Suchánek: it's working: my bot has already processed nearly 50,000 football players (= 1.2 million edits). The delay between edits was simply caused by the put_throttle parameter, as explained in Wikidata:Creating_a_bot#Configuration. --H4stings (talk) 20:00, 16 April 2016 (UTC)
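For anyone hitting the same throttle, a minimal sketch of the relevant user-config.py lines, assuming pywikibot (the value shown is illustrative):

put_throttle = 2  # minimum seconds between write operations (default was 10)
maxlag = 5        # back off while server replication lag exceeds this (default)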
This section was archived on a request by: Matěj Suchánek (talk) 20:14, 16 April 2016 (UTC)