Wikidata:Bot requests/Archive/2016/10

Duration units for film

Please change these (58 results) from 1 (Q199) to minute (Q7727). It's probably something that needs to be done once in a while. This report allows monitoring it.
--- Jura 08:48, 3 October 2016 (UTC)

  Done Matěj Suchánek (talk) 16:23, 3 October 2016 (UTC)
This section was archived on a request by: Matěj Suchánek (talk) 16:23, 3 October 2016 (UTC)

Help needed
Items with Elo rating (P1087) need refreshing. Many of them are from 2015 or earlier. Data can be harvested from https://ratings.fide.com/ - Kareyac (talk) 15:35, 3 October 2016 (UTC)

This is done by Wesalius. --Edgars2007 (talk) 16:15, 3 October 2016 (UTC)
True, I am working on it. Right now I am making sure the script I am going to use won't add duplicate values; soon enough I will add all the available Elo ratings. Cheers. --Wesalius (talk) 16:18, 3 October 2016 (UTC)
This section was archived on a request by: --Edgars2007 (talk) 06:18, 12 October 2016 (UTC)

import sitelinks from olowiki

Livvi-Karelian Wikipedia (Q27102215) was created today.
--- Jura 10:04, 11 October 2016 (UTC)

This section was archived on a request by: Edgars2007 (talk) 08:19, 12 October 2016 (UTC)

Remove humans from list of chairpersons

The property chairperson (P488) is supposed to be put on the pages of organizations and link to humans. But 440 pages do it the other way around. Could someone please go through all pages that use chairperson (P488) and are instance of human (Q5) and:

  1. remove chairperson (P488) → Qxxxx
  2. add position held (P39) → chairperson (Q140686), with qualifier of (P642) → Qxxxx

Thank you! --Arctic.gnome (talk) 22:53, 12 October 2016 (UTC)
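As a minimal sketch of the two-step conversion requested above (not a real pywikibot script; the claim layout here is a simplified stand-in for actual Wikibase claim structures):

```python
# Illustrative sketch: convert a reversed chairperson claim on a human item
# into the requested "position held" form. The dict layout is a simplified
# stand-in for real Wikibase claims, used only to show the transformation.

def convert_chairperson_claim(item):
    """Replace P488 -> Qxxx with P39 -> Q140686 qualified by P642 -> Qxxx."""
    orgs = item["claims"].pop("P488", [])   # step 1: remove chairperson claims
    for org in orgs:
        item["claims"].setdefault("P39", []).append({
            "value": "Q140686",             # chairperson (the position)
            "qualifiers": {"P642": org},    # of (P642) -> the organization
        })
    return item

person = {"claims": {"P31": [{"value": "Q5"}], "P488": ["Q12345"]}}
converted = convert_chairperson_claim(person)
# converted["claims"] now carries P39 -> Q140686 with qualifier P642 -> Q12345
```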

  Done --Pasleim (talk) 19:09, 14 October 2016 (UTC)
This section was archived on a request by: --Pasleim (talk) 19:09, 14 October 2016 (UTC)

Replace a value with another

Could someone replace the value of instance of (P31) of the following items from episode (Q1983062) to television series episode (Q21191270)?

Thank you -- ★ → Airon 90 17:01, 14 October 2016 (UTC)

  Done --Pasleim (talk) 18:05, 14 October 2016 (UTC)
This section was archived on a request by: --Pasleim (talk) 18:05, 14 October 2016 (UTC)

Move location coordinates from Commons to P625

Since the recent update of c:Module:Coordinates, Commons pages that have coordinates and are linked to a Wikidata item without coordinates end up in c:Category:Pages with local coordinates and missing wikidata coordinates. We should set up a bot to import those coordinates to Wikidata. One way would be to scrape the page HTML to capture the QuickStatements strings displayed on each page and use those. I would appreciate it if someone could set up a process to monitor this category and empty it as more pages show up there. --Jarekt (talk) 16:43, 26 October 2016 (UTC)

Not all of these coordinates should be added to the Wikidata item. It's necessary to check that instance of (P31) is different from Wikimedia category (Q4167836). --ValterVB (talk) 17:02, 26 October 2016 (UTC)
I believe that all q-codes used in the c:template:Object location templates and in the QuickStatements strings redirect to the article items through category's main topic (P301). They are added using the second function in c:Module:Wikidata q-code. --Jarekt (talk) 17:21, 26 October 2016 (UTC)
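The scraping step could look roughly like this. The exact string format emitted on those Commons pages is an assumption here; the pattern below follows QuickStatements v1 coordinate syntax, `Qid<TAB>P625<TAB>@lat/lon`:

```python
import re

# Hedged sketch: pull QuickStatements-style coordinate lines out of page text.
# QS_COORD assumes the v1 tab-separated coordinate syntax "Qid P625 @lat/lon".
QS_COORD = re.compile(r"(Q\d+)\tP625\t@(-?\d+(?:\.\d+)?)/(-?\d+(?:\.\d+)?)")

def extract_coordinates(page_text):
    """Return a list of (item, lat, lon) tuples found in the text."""
    return [(q, float(lat), float(lon))
            for q, lat, lon in QS_COORD.findall(page_text)]

sample = "...\nQ5369877\tP625\t@41.9097/12.4964\n..."
print(extract_coordinates(sample))  # [('Q5369877', 41.9097, 12.4964)]
```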

  Done --Jarekt (talk) 15:30, 1 November 2016 (UTC)

This section was archived on a request by: --Edgars2007 (talk) 19:00, 1 November 2016 (UTC)

BOT for programming project

Hello Wikidata! I want to create a Wikidata bot for my university course. It's my first time programming bots and I'm a bit lost... can someone help me or show me a starting point or something? Thanks a lot!  – The preceding unsigned comment was added by Gonzmg (talk • contribs).

You can start from here or here --ValterVB (talk) 17:25, 21 October 2016 (UTC)
You can also find some links and example scripts here. Edoderoo (talk) 14:48, 23 October 2016 (UTC)
This section was archived on a request by: XXN, 14:45, 17 November 2016 (UTC)

Item located in Rome, with coordinates not in Rome

I just discovered that Embassy of Sweden, Rome (Q5369877) had wrong coordinates placing it in Milan, hundreds of kilometers from Rome.

Since this item is located in the administrative territorial entity (P131) Rome (Q220), would it not be possible to detect such inconsistencies?

A bot could use an OpenStreetMap shapefile (good opportunity to make sure all OpenStreetMap countries and major cities are linked to Wikidata), or any other reverse geocoding solution, and generate a list of potential errors for human editors to check.
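The core of such a check is a point-in-polygon test against the administrative boundary. A production bot would use a GIS library and real OpenStreetMap shapefiles; this stdlib-only ray-casting sketch, with a hypothetical bounding quadrilateral standing in for Rome's actual boundary, only illustrates the idea:

```python
# Minimal point-in-polygon test (ray casting), standing in for a real
# reverse-geocoding step against OpenStreetMap shapefiles.

def point_in_polygon(lat, lon, polygon):
    """polygon: list of (lat, lon) vertices; returns True if point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge crosses the point's latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Crude quadrilateral around Rome (hypothetical values, for illustration only):
rome = [(41.8, 12.3), (42.0, 12.3), (42.0, 12.7), (41.8, 12.7)]
print(point_in_polygon(41.9, 12.5, rome))   # True  (inside the Rome box)
print(point_in_polygon(45.46, 9.19, rome))  # False (Milan coordinates)
```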

Any volunteer? :-) Syced (talk) 03:35, 12 October 2016 (UTC)

@Syced: maybe this will be enough? Query:
#defaultView:Map
select * where {
  ?item wdt:P131/wdt:P279* wd:Q220 .
  ?item wdt:P625 ?coords .
  }
Try it!
--Edgars2007 (talk) 06:16, 12 October 2016 (UTC)
The human who investigates whether the items really are in Rome would have to be quite familiar with Italian laws about municipalities; sometimes cities have unexpected jurisdiction over places that seem remote from the city. Jc3s5h (talk) 09:08, 12 October 2016 (UTC)
Hello Edgars2007! Rome was only an example :-) Checking against a shapefile is needed here, because location borders sometimes include remote places: for instance, Tokyo includes a few tiny islands very far out in the Pacific Ocean. Cheers! Syced (talk) 09:12, 12 October 2016 (UTC)
@Syced: This query lists all places in Italy (Q38) which are further than 50km from their city (Q515):
SELECT ?place ?placeLabel ?placeDescription ?location ?locationLabel ?dist WHERE {
  ?place wdt:P17 wd:Q38;
         wdt:P625 ?coord;
         wdt:P131 ?location .
  ?location p:P31/ps:P31/wdt:P279* wd:Q515;
            wdt:P625 ?locationCoord .
  BIND( geof:distance(?coord, ?locationCoord) AS ?dist ) .
  FILTER( ?dist > 50 ) .
  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "en" . 
  } .
} ORDER BY DESC(?dist)
Try it!
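The `geof:distance` filter in the query above corresponds to a great-circle distance; a haversine sketch (coordinates are illustrative values for Rome and Milan) shows the same 50 km check:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km, mirroring geof:distance in the query."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km: mean Earth radius

rome, milan = (41.9028, 12.4964), (45.4642, 9.1900)
d = haversine_km(*rome, *milan)
print(d > 50)  # True: roughly 477 km, well over the 50 km threshold
```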
On the other hand, what you suggest would probably help in 100% of cases, whereas queries are really limited to simple circles.
Matěj Suchánek (talk) 14:14, 12 October 2016 (UTC)
This section was archived on a request by:
--- Jura 12:08, 21 November 2016 (UTC)

Removing user page links from Items

Wikidata:Database reports/User pages lists a lot of user pages (mainly on en:wp, bexold:wp, fa:wp and pl:wp). Could a bot remove all these links from the items? They do not fulfil the notability criteria, since user pages are generally not accepted. Steak (talk) 20:38, 26 October 2016 (UTC)

  Oppose. These need to be examined on a case-by-case basis. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:48, 27 October 2016 (UTC)
Yes, templates located in user_ns is for example not a big deal at all. -- Innocent bystander (talk) 14:31, 27 October 2016 (UTC)
Yeah. Some wikis deliberately put user box templates in the user namespace as subpages of a special user (e.g. be-x-old, es, fa). I don't see any reason to remove ones like that, removing them doesn't improve anything, it just breaks the interwiki links. We have Wikidata:Use common sense for cases like this. :) - Nikki (talk) 20:52, 27 October 2016 (UTC)
I've started going through the User:Candleabracadabra pages. They appear to all be places listed in the US "National Register of Historic Places" which should be notable even without sitelinks. Please don't remove the sitelinks for those yet, there is still useful information on the pages which should be added to Wikidata. - Nikki (talk) 21:44, 27 October 2016 (UTC)
The ones I had looked at all seem to be duplicates.
--- Jura 22:45, 27 October 2016 (UTC)
This section was archived on a request by: Something can be done about these, but not this.
--- Jura 12:07, 21 November 2016 (UTC)

Statements using P2044 with unit, precision and multivalued references

I would like to add the following statements, but using:

I think this is currently not possible with QuickStatements, ping Magnus.

Thanks in advance. --abián 16:21, 16 October 2016 (UTC)

@Abián: is Alborge (Q1650353) what you want? P.S. If Great Aragonese Encyclopedia ID (P1807) is used in all reference URL (P854) fields, then the statement should look a little bit different (like this). --Edgars2007 (talk) 16:27, 17 October 2016 (UTC)
@Edgars2007: That's it. :D Indeed, Great Aragonese Encyclopedia ID (P1807) is used in all reference URL (P854) fields, so you can use the first property instead of the second. You can also use ±30 instead of ±50, I've been investigating a little more. --abián 17:20, 18 October 2016 (UTC)
@Abián: I have written the script for this, but we/you have to decide what to do with precision. See Wikidata:Project chat#+- workaround. --Edgars2007 (talk) 15:48, 24 October 2016 (UTC)
@Edgars2007: Thanks for your help. Has this issue concerning precision been completely solved? --abián 12:01, 17 November 2016 (UTC)
@Abián:   Done. Las Peñas de Riglos (Q1768883) already had elevation above sea level (P2044) set, so I skipped it. --Edgars2007 (talk) 15:47, 27 November 2016 (UTC)
Great! Thanks again, Edgars2007. --abián 15:57, 27 November 2016 (UTC)
This section was archived on a request by: --Edgars2007 (talk) 15:47, 27 November 2016 (UTC)

Bjankuloski06 added thousands of instance of (P31)human settlement (Q486972) to items that already contain a subclass of that. Can someone mass-revert? Sjoerd de Bruin (talk) 07:11, 25 October 2016 (UTC)

Suggest reverting only on items where there was no other instance of (P31) claim. Q12911027 is fine, for example. @Bjankuloski06: perhaps you can revert yourself. --Izno (talk) 11:48, 25 October 2016 (UTC)

@Izno: Since I lack the experience, I would appreciate if you can do it for me, as it needs selecting those that are doubled-up or unnecessary. Sorry about the mistake. --B. Jankuloski (talk) 14:43, 25 October 2016 (UTC)

I'm sure someone can make a nice SPARQL query to get items that have both instance of human settlement (Q486972) and instance of some subclass of human settlement (Q486972). The next step would be to remove it with QuickStatements. Anyone? Multichill (talk) 18:01, 25 October 2016 (UTC)
  Comment This raises some questions I have had regarding P31:minor locality in Sweden (Q14839548) and P31:urban area in Sweden (Q12813115) claims. I am currently adding such claims to items that already have P31:human settlement (Q486972) claims. I am doing this intentionally without removing the P31:Q486972 claims, since P31:Q14839548 and P31:Q12813115 should always have at least a start date and sometimes also an end date. But the fact that something has started or ended being a Q14839548 does not mean that it has started or ended being a Q486972. P31:Q486972 probably has to stay as a "backup" in items with P31:Q14839548 and P31:Q12813115 claims. Maybe such statements even have to be added to such items, instead of being removed.
To make it more obvious: Solsidan (Q1800816) became a minor locality in Sweden (Q14839548) in 1990 and ceased to be one in 2005, when it became an urban area in Sweden (Q12813115). But people have been living here probably since at least the Iron Age. Removing a P31:human settlement (Q486972) claim in such cases would say that nobody lived here before December 31, 1990. What the P31:Q14839548 (start date 1990) claim says is that Statistics Sweden started to recognise the place in 1990. -- Innocent bystander (talk) 18:39, 25 October 2016 (UTC)
If you can make a more-specific claim than human settlement, that's probably best. But if you can't, then you should use "human settlement" with an end date of 1990 IMO. --Izno (talk) 18:50, 25 October 2016 (UTC)
If I had access to the National archives and unlimited measures of time, I could look into older official records, but I am afraid I have some limits. -- Innocent bystander (talk) 18:59, 25 October 2016 (UTC)
I see nobody made the query yet.
SELECT ?item ?instance WHERE {
  ?item wdt:P31 wd:Q486972 .
  ?item wdt:P31 ?instance .
  ?instance wdt:P279* wd:Q486972 .
  FILTER(?instance != wd:Q486972) .
}
Try it!
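As a toy illustration of the redundancy check the query above performs (the subclass graph here is a tiny hand-made stand-in for the real P279 hierarchy):

```python
# Find items where P31 = Q486972 (human settlement) is redundant because
# another P31 value is already a subclass of Q486972.

SUBCLASS_OF = {"Q532": "Q486972"}  # village -> human settlement (simplified)

def is_subclass(cls, ancestor):
    """Walk the (single-parent, simplified) subclass chain upward."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

def redundant_settlement_claims(items):
    """items: {item_id: [P31 values]} -> ids carrying a redundant Q486972."""
    return [qid for qid, classes in items.items()
            if "Q486972" in classes
            and any(c != "Q486972" and is_subclass(c, "Q486972")
                    for c in classes)]

items = {"Q111": ["Q486972", "Q532"], "Q222": ["Q486972"]}
print(redundant_settlement_claims(items))  # ['Q111']
```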
I would advise against mass removal here for everything; probably only focus on the country where it was mass-added. What country was that? @Bjankuloski06, Sjoerddebruin: Multichill (talk) 18:36, 29 November 2016 (UTC)

@Multichill: Well, it must have been my own, Macedonia. That's what I was trying to add. I apologise, as I thought at the time that it was fine to add that to the pre-existing village items, and later found out how to exclude those that already had it. I hope it isn't a problem to remove all instances of 'human settlement' (P31) in items having both 'village' and 'human settlement'. --B. Jankuloski (talk) 09:19, 30 November 2016 (UTC)

@Bjankuloski06: no worries, this looks like it's easy to correct. I found 311 items in North Macedonia (Q221) which had both human settlement (Q486972) and village (Q532). I'm removing the redundant human settlement (Q486972) now (using petscan). So I guess this request is   Done. This leaves us with about 250 items that seem to be part of some bot import. Maybe you can have a look at them and improve them? See
SELECT ?item ?itemLabel ?instance ?instanceLabel WHERE {
  ?item wdt:P31 wd:Q486972 .
  ?item wdt:P31 ?instance .
  ?item wdt:P17 wd:Q221 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,mk,ceb". }
}
Try it!
Thank you, Multichill (talk) 11:38, 30 November 2016 (UTC)
This section was archived on a request by: Multichill (talk) 11:41, 30 November 2016 (UTC)

MySpace IDs

Unfortunately, en.Wikipedia deleted its MySpace template, with each transclusion substituted. This left over 3,000 MySpace URLs in articles as ordinary wikitext links. Can someone import the values to Myspace ID (P3265), please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:33, 6 October 2016 (UTC)

  @Pigsonthewing:   Doing… about 2,000 (initially estimated 1,200). For you or anyone who would like to do a similar import, here is my approach:
  1. fork this query on Quarry
  2. replace all (two) "P###" with your property
  3. replace all (two) "http://..." with the string before "$1" in the property's formatter URL (P1630) and append a "%" to the second one
  4. replace "Q###" in the selection with wiki you're importing from ("Q328" for enwiki)
  5. you may need to filter results using a subquery which requires knowing the database schema
  6. download data as TSV, replace all """ with " and copy to QuickStatements
This is possible since MediaWiki saves external links per page to the database. Matěj Suchánek (talk) 16:29, 17 December 2016 (UTC)
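Steps 3 and 6 of the approach above might look like this in a small post-processing script. The formatter-URL prefix and the quote cleanup are assumptions based on Myspace ID (P3265)'s formatter URL and on how TSV exports quote fields:

```python
# Hedged sketch: strip the formatter-URL prefix to get the bare identifier
# and build QuickStatements v1 lines (tab-separated, value in quotes).

PREFIX = "https://myspace.com/"  # assumed P3265 formatter-URL prefix

def to_quickstatements(tsv_text):
    """Turn 'item<TAB>url' TSV rows into QuickStatements P3265 commands."""
    out = []
    for line in tsv_text.splitlines():
        if not line.strip():
            continue
        item, url = (field.strip('"') for field in line.split("\t"))
        if url.startswith(PREFIX):
            out.append(f'{item}\tP3265\t"{url[len(PREFIX):]}"')
    return "\n".join(out)

sample = 'Q303\thttps://myspace.com/elvis\n'
print(to_quickstatements(sample))
```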
  Done Imported what was possible, now there are 400 left that should be filtered and added manually. Matěj Suchánek (talk) 19:22, 17 December 2016 (UTC)
This section was archived on a request by: XXXXX Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:50, 18 December 2016 (UTC)

Remove redundant described at URL (P973)

P973 is meant to be used only if no other property is available. This lists items where another property duplicates the information.
--- Jura 12:46, 30 October 2016 (UTC)

@Jane023:. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:08, 1 November 2016 (UTC)
I am not sure I agree with the above statement. What's wrong with putting urls in P973? Jane023 (talk) 18:41, 1 November 2016 (UTC)
Because in this case they are the same in P973 and Ted talk ID. --Edgars2007 (talk) 18:59, 1 November 2016 (UTC)
Yes, but I do this for paintings too. In the case of TED talks there may never be any other properties attached to these items, but in the case of paintings or people there are sometimes quite a few. Only the ones that describe the object belong in this property, however. Not everyone knows which property to choose for the best description, and the described at URL is a convenient location for everyone to recognize. Jane023 (talk) 08:22, 6 November 2016 (UTC)
It might be that you added them before the new property was available. We just need to finish the conversion. If you are not sure about how to use a property, please have a look at its talk page.
--- Jura 14:03, 20 November 2016 (UTC)
I think the redundant described at URL (P973) statements should be removed. described at URL (P973) is suited to linking to a database which doesn't have its own property. However, after a property for a database has been created, we shouldn't maintain both that property and described at URL (P973). --Pasleim (talk) 11:35, 27 November 2016 (UTC)
This section was archived on a request by: --Edgars2007 (talk) 15:34, 17 January 2017 (UTC)

Automatically creating a human subclass for anatomical features that don't already have subclasses

Some statements are true for the fingers of every species, but others are human-specific. Currently we often don't have separate items for the concept in humans. I think it would be valuable to have a bot that automatically creates human subclasses. ChristianKl (talk) 09:45, 22 September 2016 (UTC)

Can you please provide a list with all anatomical features? --Pasleim (talk) 11:56, 26 September 2016 (UTC)
We have animal structure (Q25570959). That then gets subclassed in different ways. That should produce a long list of anatomical features, most of which exist in humans. ChristianKl (talk) 14:43, 1 October 2016 (UTC)

Guardian data about US police killings

https://www.theguardian.com/us-news/series/counted-us-police-killings provides data about individuals in the US who were killed by the police. Should we import the data? If we import the data it might also be interesting to make a public statement that invites other people to contribute data about those people. ChristianKl (talk) 14:38, 26 September 2016 (UTC)

Import it to what item? Jc3s5h (talk) 15:38, 26 September 2016 (UTC)
Items for the people who were killed. The Guardian lists their names and data about them. It would be possible to automatically create lists in Wikipedia showing all police killings in month X. ChristianKl (talk) 16:26, 26 September 2016 (UTC)
I object to another bot that will create lots of items without making an effort to see if there is already an item for the person. Of course, if there were an existing item it would be necessary to rigorously investigate whether the person who was killed was the same person named in the existing item. I realize that occasionally duplicate items will be created accidentally, but doing it en masse with a bot doesn't seem like a good idea to me. Jc3s5h (talk) 17:07, 26 September 2016 (UTC)
Why? Merging items is easy. Especially with the merging game. ChristianKl (talk) 17:38, 26 September 2016 (UTC)
What data? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:19, 26 September 2016 (UTC)
There seems to be something of an API: https://interactive.guim.co.uk/2015/the-counted/records/20162.json, although it isn't documented. Kaldari (talk) 22:45, 28 September 2016 (UTC)
Names of the people who are killed. manner of death (P1196) "killed by police gunshot" (and other classes for deaths that aren't gunshots). date of death (P570). ethnic group (P172). I think there's interest in having Wikipedia lists of people killed by police by race. ChristianKl (talk) 23:18, 28 September 2016 (UTC)

MCN number import

There are 10,031 identifiers for MCN code (P1987) that can be extracted from [1] or this English version. Many (but not all) items cited are animal taxa, which can be easily machine-read. For the rest, it would be useful if the bot generated a list presenting possible meanings (by comparing the English and Portuguese versions of the xls file with Wikidata language entries). Pikolas (talk) 12:38, 14 August 2015 (UTC)

What's the copyright status of those documents? Sjoerd de Bruin (talk) 13:04, 14 August 2015 (UTC)
It's unclear. I've opened a FOIA request to know under what license those are published. For reference, the protocol number is 52750.000363/2015-51 and can be accessed at http://www.acessoainformacao.gov.br/sistema/Principal.aspx. Pikolas (talk) 13:40, 14 August 2015 (UTC)
I heard back from them. They have assured me it's under the public domain. How can I prove this to Wikidata? Pikolas (talk) 01:48, 2 October 2015 (UTC)
@Pikolas: I have only just noticed that you haven't had the courtesy of a reply. The best method would be to get them to put a statement to that effect on their website. Failing that, you could get them to email OTRS. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:55, 10 October 2016 (UTC)
@Sjoerddebruin: Reopening this thread since I forgot to ping you. NMaia (talk) 15:45, 1 June 2016 (UTC)
Updated links: Portuguese version, English version. NMaia (talk) 19:35, 2 June 2016 (UTC)

P775

There are some updates related to items with Swedish urban area code (P775) in the pipeline! A number of items should have this change (adding P813 in the reference is probably optional). All of them should already have a P31:Q12813115 claim, but if it is missing, add it! This relates to all items that have P775 with any of these values. If there is no item with any of these P775 values, please let me know! -- Innocent bystander (talk) 06:02, 26 October 2016 (UTC)

Sitelink removal

This lists pages at Wikipedia that are not disambiguation pages, but are linked from items that have P31=Q4167410. Could you remove them? I will add them to appropriate items via QuickStatements afterwards.
--- Jura 04:25, 25 October 2016 (UTC)

I strongly oppose the use of a bot before cleaning out all the items on the list which aren't given names, because the categories concerned are not only "given name" and "disambiguation" but also "surname", for example. I have been working on an equivalent list since early September, and I don't see why we should do it badly with a bot when we'll still have to pass individually over each article to clean it correctly. The list was nearly twice as long when I started and it's going down steadily. --Harmonia Amanda (talk) 05:27, 25 October 2016 (UTC)
I agree that some should be given names, other "name" item. Don't worry about that.
--- Jura 05:45, 25 October 2016 (UTC)
Uh yes, I worry! How exactly do you intend to treat it? When I see your query happily mixing names, given names and disambiguation pages (because sitelinks other than the English one may have always genuinely been disambiguation pages) and you only say "I'll treat it", I worry. --Harmonia Amanda (talk) 06:44, 25 October 2016 (UTC)
Feel free to do it manually. Please make sure to not re-purpose any item.
--- Jura 22:46, 27 October 2016 (UTC)
Refers to Wikimedia disambiguation page (Q4167410). Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:06, 25 October 2016 (UTC)

Copy data from the property documentation to property statements

The syntax in which the formatter URL and other statements are stored in the property documentation should be easy for a bot to understand, so that it can automatically create statements from them. ChristianKl (talk) 11:34, 1 October 2016 (UTC)

This should be "move", not "copy". Some statements are suitable for this; others not. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:46, 1 October 2016 (UTC)
Wikidata:Requests for permissions/Bot/MatSuBot 2. Matěj Suchánek (talk) 18:58, 1 October 2016 (UTC)

{{Section resolved|1=Sjoerd de Bruin (talk) 08:55, 12 October 2016 (UTC)}}

Postponing archiving per Andy's comment and Topic:Td698cyh1l7depz2. My bot's RfP only allows copying, not (re)moving data from talk pages. So this can remain an open task, although I can imagine someone would like to keep it more detailed than what property statements can provide. Matěj Suchánek (talk) 17:33, 14 October 2016 (UTC)
  Doing… Matěj Suchánek (talk) 13:20, 8 April 2017 (UTC)
This section was archived on a request by: Matěj Suchánek (talk) 16:18, 29 May 2017 (UTC)