Wikidata:Bot requests/Archive/2017/08

Band members - create "has part" / "member of" statements from infoboxes

Request date: 11 August 2017, by: Ejegg

Task description

Many relationships between bands and their members are missing in Wikidata, but are present in a standardized infobox field on Wikipedia.

A patch is available for Pywikibot to allow harvesting multiple values for a statement: https://gerrit.wikimedia.org/r/370664. I'd like to use that to fill in missing relationships.

harvest_template -lang:en -family:wikipedia -namespace:0 -template:"Infobox musical artist" current_members P527 -islink -exists:p

It could also be interesting to harvest start_date and end_date as time qualifiers from infoboxes, but that will require another patch.
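For illustration, the core of what multi-value harvesting has to do — turning one infobox field into several link targets — can be sketched like this (the wikitext layout and example names are illustrative, not taken from the patch):

```python
import re

def extract_linked_members(field_wikitext):
    """Pull every [[wikilink]] target out of an infobox field value.

    One field like |current_members= may contain several links, each
    of which would become a separate "has part" (P527) statement.
    """
    # Match [[Target|Display]] or [[Target]] and keep only the target
    return [m.group(1).strip()
            for m in re.finditer(r"\[\[([^\]|#]+)(?:\|[^\]]*)?\]\]",
                                 field_wikitext)]
```

A bulleted field such as `* [[John Lennon]]` / `* [[Paul McCartney|Paul]]` would yield the two link targets, ignoring the display text.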

Discussion

@Ejegg: if you want to request a bot flag, see Wikidata:Requests for permissions/Bot. --XXN, 20:29, 11 August 2017 (UTC)

@XXN: Thanks! I've created User:BandMemberBot and Wikidata:Requests_for_permissions/Bot/BandMemberBot. Should I delete this section? --Ejegg, 06:46, 13 August 2017 (UTC)

I'm tagging the section with {{Section resolved}}, and it will be archived in 2 days. --XXN, 14:21, 13 August 2017 (UTC)
Request process
This section was archived on a request by: XXN, 14:21, 13 August 2017 (UTC)

Q208696 (large page warning)

The above item includes duplicate references in statements: closing credits (Q1553078) appears twice. Maybe there is a way to remove one instance of each by bot and slightly reduce the size of the item.
--- Jura 19:06, 14 August 2017 (UTC)
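A bot could detect such duplicates by serializing each reference and comparing; a minimal sketch, assuming the references are available as plain dicts (the actual Wikibase JSON is more deeply nested):

```python
import json

def dedupe_references(references):
    """Keep only the first occurrence of each identical reference.

    References are compared by their canonical JSON serialization,
    so key order inside a reference does not matter.
    """
    seen, kept = set(), []
    for ref in references:
        key = json.dumps(ref, sort_keys=True)
        if key not in seen:
            seen.add(key)
            kept.append(ref)
    return kept
```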

  Done --Pasleim (talk) 20:21, 14 August 2017 (UTC)
Thanks. It reduced page size by about 140KB: https://www.wikidata.org/w/index.php?title=Q208696&action=history (still 651,613 bytes).
This section was archived on a request by: --Pasleim (talk) 20:21, 14 August 2017 (UTC)

Correcting false precision of geo co-ordinates

Request date: 27 July 2017, by: MartinPoulter

Link to discussions justifying the request

I have bulk uploaded data about geographical locations from an external site. It has been pointed out that some of the data have lat/long precision of eight or nine decimal places, when four or five would suffice, and this introduces useless information into Wikipedia articles where these coordinates are included. In effect we have millimetre-level precision for sites on the order of 10 m in size. This is a cleanup task that I should have done before importing the data, so I apologise for creating work for others.

Task description

For the (presently 2783) items identified by this query, replace the coordinate location (P625) latitude/longitude values with the same numbers rounded to four decimal places. Thanks in advance for any help.
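The rounding step itself is trivial; a sketch (4 decimal places is roughly 11 m at the equator, 5 is roughly 1 m):

```python
def round_coordinate(lat, lon, places=4):
    """Round a latitude/longitude pair to a fixed number of
    decimal places, dropping the spurious extra precision."""
    return round(lat, places), round(lon, places)
```

The stored Wikibase precision field would also need to be set to match (e.g. 0.0001 for four places); that is omitted here.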

Discussion


Request process

Task completed: I chose a precision of five decimal places to get a precision of about 1 meter. --Pasleim (talk) 13:50, 16 August 2017 (UTC)

This section was archived on a request by: Pasleim (talk) 13:50, 16 August 2017 (UTC)

Request clean a data load

Request date: 24 August 2017, by: Salgo60

Task description

I loaded P4159 using QuickStatements and have run into some problems, e.g.:

  • Q7726#P4159 has
    • Joseph Bonaparte (1)
    • Joseph_Bonaparte_(1)
It would be best if the first one (without "_") were deleted. See the error list: 393 errors.
My question is: what is the best way? Can it be done with PetScan? If some Python needs to be written, I am more than interested to learn how, as I also take care of Property:P3217 and need to start learning Python and Wikidata in order to add references to facts stated by Property:P3217. - Salgo60 (talk) 16:07, 24 August 2017 (UTC) email: salgo60@msn.com
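A small sketch of how such pairs could be detected before deciding which variant to delete (keeping the underscore variant, as the request suggests):

```python
def find_underscore_duplicates(values):
    """Find value pairs that differ only in '_' vs ' '.

    Returns (kept, duplicate) pairs, where the variant containing
    underscores is kept and the space variant is flagged for removal.
    """
    by_norm, pairs = {}, []
    for v in values:
        norm = v.replace("_", " ")
        if norm in by_norm and by_norm[norm] != v:
            a, b = by_norm[norm], v
            kept, dup = (a, b) if "_" in a else (b, a)
            pairs.append((kept, dup))
        else:
            by_norm.setdefault(norm, v)
    return pairs
```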
Discussion
  Done using QuickStatements. Matěj Suchánek (talk) 16:47, 24 August 2017 (UTC)
Request process
Task completed (16:47, 24 August 2017 (UTC))
This section was archived on a request by: Matěj Suchánek (talk) 16:47, 24 August 2017 (UTC)

200 Creator templates to import

Request date: 6 June 2017, by: Jarekt

Link to discussions justifying the request
Task description

c:Category:Creator templates with authority control data holds Creator templates with authority control identifiers but without link to Wikidata item. In the past we managed to either match all such pages with existing items or create new ones, but a new batch was created. Can someone help with this? I would:

  1. perform a search for the names and check whether there are any matches (I already did a search based on VIAF and did not find any hits)
  2. create new items and copy as much data as reasonable
  3. add item Q-codes to the Creator pages, or give them to me and I will add them.
Discussion
@Jarekt: I can only see eight pages in the category. Can this be closed? Matěj Suchánek (talk) 07:30, 25 August 2017 (UTC)
Yes, I have done it through some Quick Statements and a lot of manual work. --Jarekt (talk) 02:49, 26 August 2017 (UTC)
Request process
This section was archived on a request by: Jarekt (talk) 02:50, 26 August 2017 (UTC)

Lowercase "category:"

It seems that some items have a lowercase "category:" in their labels (at least in English): sample. It would be good if these were changed back to "Category:".
--- Jura 10:13, 26 August 2017 (UTC)
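The fix itself is a simple prefix replacement; a sketch of the per-label normalization:

```python
def fix_category_label(label):
    """Capitalize a lowercase "category:" namespace prefix,
    leaving all other labels unchanged."""
    prefix = "category:"
    if label.startswith(prefix):
        return "Category:" + label[len(prefix):]
    return label
```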

I submitted a QuickStatements batch for English labels. I was able to generate it via Petscan label generator. Matěj Suchánek (talk) 10:47, 26 August 2017 (UTC)
This section was archived on a request by: Thanks! 
--- Jura 11:57, 27 August 2017 (UTC)

Undo edits

What's an easy way to undo my edits of today on a series of items?
--- Jura 17:19, 26 August 2017 (UTC)

@Jura1:
1) See @RollBot: (some details can be found in Wikidata:Requests for permissions/Bot/RollBot). You could contact the bot operator; maybe they could help.
2) As you have the rollbacker flag, you can use a mass-rollback script (especially if the edits to revert are consecutive). I tried a few of the ones I found and I stick with c:User:~riley/MRollback.js (a modified version of en:User:Kangaroopower/MRollback.js; it should work on WD as well, I suppose). --XXN, 22:03, 26 August 2017 (UTC)
  • Thanks @XXN: for the suggestions. I tried MRollback.js, but somehow it didn't work out. It gave me the idea to search for similar scripts, and I found meta:User:Hoo_man/Scripts/Smart_rollback. The only downside was that it also rolled back other edits of mine when there were some before. Anyway, it's more or less sorted out now.
This section was archived on a request by:
--- Jura 11:56, 27 August 2017 (UTC)

Fixing redirects in statements

Request date: 28 August 2017, by: Jonathan Groß

Task description

Could somebody please change all instances of occupation (P106)classical philologist (Q16267607) to occupation (P106)classical philologist (Q12716126)? Q16267607 has been merged with classical philologist (Q12716126). Thanks in advance! Jonathan Groß (talk) 09:07, 28 August 2017 (UTC)
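Such a cleanup boils down to mapping the merged item to its merge target for every affected claim; a minimal sketch (the single-entry mapping reflects just this request):

```python
# merged item -> merge target; here Q16267607 was merged into Q12716126
REDIRECTS = {"Q16267607": "Q12716126"}

def resolve_claim_target(qid):
    """Follow a single-step redirect mapping for a claim's target,
    returning unaffected QIDs unchanged."""
    return REDIRECTS.get(qid, qid)
```

A bot would then rewrite every P106 claim whose target resolves to a different item.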

Discussion


Request process
This section was archived on a request by: XXN, 10:11, 28 August 2017 (UTC)

non-standard Russian descriptions of categories

Can anyone change all ru-descriptions "Категория в Русской Википедии" (like in Q26772396) to standard "категория в проекте Викимедиа" please? --Infovarius (talk) 21:20, 28 August 2017 (UTC)

  Done. Fixed also some bad Belarusian and Ukrainian descriptions. --XXN, 22:47, 28 August 2017 (UTC)
This section was archived on a request by: XXN, 23:00, 28 August 2017 (UTC)

local bot to migrate interwiki links

Request date: 19 August 2017, by: BukhariSaeed

Link to discussions justifying the request
Task description

We have no local bot to migrate interwiki links, especially for categories, on the Urdu Wikipedia. EmausBot used to do this work, but no longer does. Kindly resolve this problem and enable EmausBot or another bot. Thanks. BukhariSaeed (talk) 06:01, 19 August 2017 (UTC)

Discussion
@Ladsgroup, JAn Dudík: Matěj Suchánek (talk) 16:48, 24 August 2017 (UTC)
Request process

Accepted by (JAn Dudík (talk) 19:51, 24 August 2017 (UTC)) and under process, connecting pages with local interwiki from ur:Special:UnconnectedPages
Task almost completed (21:14, 25 August 2017 (UTC)) - there are some pages whose interwiki links lead to a redirect to another item. But all categories and templates with interwiki links should now be imported. JAn Dudík (talk) 21:14, 25 August 2017 (UTC)

Thanks for this bot on the Urdu Wikipedia. Can you work continuously or permanently on the Urdu Wikipedia, like EmausBot, to migrate interwiki links, especially categories? BukhariSaeed (talk) 01:32, 27 August 2017 (UTC)
This section was archived on a request by: Matěj Suchánek (talk) 07:50, 30 September 2017 (UTC)

Remove duplicate aliases

Sometimes when a merge is conducted, an alias is created which is the same as the label in a given language; for example, this merge added the English alias "Carolyn A Young".

Can a bot look for such cases, and remove the alias? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:50, 26 August 2017 (UTC)
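A sketch of the check a bot could apply per language: drop any alias equal to the label (and collapse exact duplicate aliases while at it):

```python
def prune_aliases(label, aliases):
    """Remove aliases that duplicate the label (or each other)
    for a single language, preserving order."""
    seen, kept = {label}, []
    for alias in aliases:
        if alias not in seen:
            seen.add(alias)
            kept.append(alias)
    return kept
```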

It seems that User:PLbot is now doing this. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:42, 1 October 2017 (UTC)
This section was archived on a request by: Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:42, 1 October 2017 (UTC)

Dates authors of papers "flourished"

Colleagues are currently doing great work, adding many items about scientific papers, and the people who wrote them. For the latter, for various good reasons, we often have no birth (or death, where applicable) date. Many have written more than one paper. An example is Rene Tänzler (Q19966978).

I think we should have a bot which looks at each of an author's papers, where the author has no birth or death date listed, and extracts the dates, then takes the first and last date, and adds "flourished" dates (work period (start) (P2031), work period (end) (P2032)) to the author's item, giving the source as an item which represents, say, "date calculated from paper in Wikidata". An occasional update, in case further papers have been added in the interim, would also be a good idea. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:27, 31 August 2017 (UTC)
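The extraction step itself is simple; a sketch, assuming the author's publication years have already been queried (whether publication dates are a sound proxy for a working period is a separate question):

```python
def flourished_range(publication_years):
    """Return (earliest, latest) publication year for an author,
    as candidate work period (start)/(end) values, or None if no
    dated publications are known."""
    years = [y for y in publication_years if y is not None]
    if not years:
        return None
    return min(years), max(years)
```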

This query shows the time difference between work period (start) (P2031) and the publication date of the first publication stored on WD. And this query is for work period (end) (P2032) and the last publication. --Pasleim (talk) 12:07, 1 October 2017 (UTC)
@Pasleim: Thanks! After seeing that, I am even more convinced that automatically feeding in this data would be a poor idea: Q79822 would end up "flourishing" 94 years after his death. Mateusz Konieczny (talk) 17:12, 1 October 2017 (UTC)
One more problem: what about works published after the death of the author? Or cases where somebody was, for example, ill and no longer working, but some of his/her works were still being published for the first time? The Silmarillion (Q79762) is just one example. Mateusz Konieczny (talk) 09:26, 8 October 2017 (UTC)

Request withdrawn. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:42, 9 October 2017 (UTC)

This section was archived on a request by: Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:42, 9 October 2017 (UTC)

consistency about trailing "/" in URL

Request date: 12 August 2017, by: D1gggg

from project chat and from talk page Property_talk:P856#Please_normalize.3F
Task description
  • if most values have "/" -> fill missing "/"
  • if most don't have "/" -> remove last "/"
Discussion
  Comment 55% of all official website (P856) values use a trailing slash [1]. Adding a trailing slash to the other 45% of URLs can result in errors in case the URL was pointing to a file; with the slash it will then point to a non-existent folder, e.g. on Q15735628#P856. Removing trailing slashes results in longer loading times and, depending on the server configuration, it can also break URLs. I don't think it is worth the trouble to add "consistency" here for something where technically no consistency can exist. --Pasleim (talk) 19:45, 13 August 2017 (UTC)
To execute the request, just add one of these templates to the property talk page:
{{Autofix|pattern=<nowiki>(https?://[\w\.]+)</nowiki>|replacement=\1/}}
to fill in the missing "/", or
{{Autofix|pattern=<nowiki>(https?://[\w\.]+)/</nowiki>|replacement=\1}}
to remove it. Filling will change 191,101 values; removing, 281,365 values. — Ivan A. Krestinin (talk) 17:59, 19 August 2017 (UTC)
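For reference, the two Autofix patterns behave like this in Python; note that [\w.]+ matches only host-like values (letters, digits, "_", "."), so URLs containing a path are left untouched by both rules:

```python
import re

# Python equivalents of the two proposed Autofix patterns,
# anchored so only bare scheme-plus-host URLs are rewritten.
ADD_SLASH = re.compile(r"^(https?://[\w.]+)$")
DROP_SLASH = re.compile(r"^(https?://[\w.]+)/$")

def add_trailing_slash(url):
    return ADD_SLASH.sub(r"\1/", url)

def drop_trailing_slash(url):
    return DROP_SLASH.sub(r"\1", url)
```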
  Oppose per Pasleim. This is going to break some URLs. − Pintoch (talk) 14:00, 2 October 2017 (UTC)
I resolved the request given that the proposer is now banned and doing the task would likely break some URLs. ChristianKl () 12:23, 13 December 2017 (UTC)
Request process
This section was archived on a request by: ChristianKl () 12:23, 13 December 2017 (UTC)

items for segments

For many instances of anthology film (Q336144), it can be worth creating an item for each segment (sample: Q16672466#P527). Such items can include details on director/cast/etc. as applicable (sample: Q26156116).

The list of anthology films includes already existing items.

This task is similar to #duos_without_parts above.
--- Jura 10:14, 22 April 2017 (UTC)

What source could the bot use? Matěj Suchánek (talk) 07:25, 25 August 2017 (UTC)
Good question. I checked the first 12 films on the list above and about half had a WP article detailing the episodes.
For automated import, the structure of these articles might not be sufficiently standard:
  • section header per episode (ru), (pl)
  • table with names of episodes (de),
  • section with more details for each episode (es), (ru), (ca)
Maybe a user would need to gather the list for each film, and a tool would then create the segment items. I guess I could try that in a spreadsheet.
--- Jura 08:49, 25 August 2017 (UTC)
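If a user supplies the segment list per film, item creation could be driven by generated QuickStatements rows; a sketch in which the property choices (a Len English label, P361 "part of" back to the film) are illustrative assumptions, and the matching P527 "has part" statements on the film item would be added separately:

```python
def segment_rows(film_qid, segment_titles):
    """Build QuickStatements v1 rows creating one item per segment
    and linking it back to the anthology film."""
    rows = []
    for title in segment_titles:
        rows.append("CREATE")
        rows.append('LAST\tLen\t"{}"'.format(title))   # English label
        rows.append("LAST\tP361\t{}".format(film_qid))  # part of the film
    return rows
```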