Bot requests
If you have a bot request, add a new section using the button and state exactly what you want. To reduce processing time, first discuss the legitimacy of your request with the community in the Project chat or on the WikiProject's talk page. Please refer to previous discussions justifying the task in your request.

For botflag requests, see Wikidata:Requests for permissions.

Tools available to all users that can be used to accomplish the work without the need for a bot:

  1. PetScan for creating items from Wikimedia pages and/or adding the same statements to items
  2. QuickStatements for creating items and/or adding different statements to items
  3. Harvest Templates for importing statements from Wikimedia projects
  4. Descriptioner for adding descriptions to many items
  5. OpenRefine to import any type of data from tabular sources
On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2019/03.
SpBot archives all sections tagged with {{Section resolved|1=~~~~}} after 2 days.

You may find these related resources helpful:

Dataset Imports · Why import data into Wikidata · Learn how to import data · Bot requests · Ask a data import question


Redirects after archival

Request date: 11 September 2017, by: Jsamwrites (talk · contribs · logs)

Link to discussions justifying the request
Task description

Retain the links to the original discussion section on the discussion pages, even after archival, by allowing redirection.

Licence of data to import (if relevant)
Discussion


Request process

Semi-automated import of information from Commons categories containing a "Category definition: Object" template

Request date: 5 February 2018, by: Rama (talk · contribs · logs)

Link to discussions justifying the request
Task description

Commons categories about one specific object (such as a work of art, archaeological item, etc.) can be described with a "Category definition: Object" template [1]. This information is essentially a duplicate of what is or should be on Wikidata.

To prove this point, I have drafted a "User:Rama/Catdef" template that uses Lua to import all relevant information from Wikidata and reproduces all the features of "Category definition: Object", while requiring only the Q-Number as parameter (see Category:The_Seated_Scribe for instance). This template has the advantage of requesting Wikidata labels to render the information, and is thus much more multi-lingual than the hand-labeled version (try fr, de, ja, etc.).

I am now proposing to deploy another script to do the same thing the other way round: import data from the Commons templates into relevant fields of Wikidata. Given the variety of ways a human can label or mislabel information in a template such as "Category definition: Object", I think that the script should be a helper tool to import data: it is to be run on one category at a time, with a human checking the result, and correcting and completing the Wikidata entry as required. For now, I have been testing and refining my script over subcategories of [2] Category:Ship models in the Musée national de la Marine. You can see the result in the first 25 categories or so, and the corresponding Wikidata entries.

The tool is presently in the form of a Python script with a simple command-line interface:

./read_commons_template.py Category:Scale_model_of_Corse-MnM_29_MG_78 reads the information from Commons, parses it, renders the various fields in the console for debugging purposes, and creates the required Wikibase objects (e.g: text field for inventory numbers, Q-Items for artists and collections, WbQuantity for dimensions, WbTime for dates, etc.)
./read_commons_template.py Category:Scale_model_of_Corse-MnM_29_MG_78 --commit does all of the above, creates a new Q-Item on Wikidata, and commits all the information in relevant fields.
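The parsing step can be sketched without any bot framework; below is a minimal illustration over a simplified, hypothetical template body (field names and values are invented for demonstration — a real implementation would rather use a proper wikitext parser such as mwparserfromhell):

```python
import re

def parse_object_template(wikitext):
    """Extract field/value pairs from a 'Category definition: Object'-style
    template (simplified: assumes one "|field = value" per line and no
    nested templates)."""
    m = re.search(r"\{\{Category definition: Object(.*?)\}\}", wikitext, re.S)
    if not m:
        return {}
    fields = {}
    for line in m.group(1).splitlines():
        line = line.strip()
        if line.startswith("|") and "=" in line:
            key, _, value = line[1:].partition("=")
            fields[key.strip()] = value.strip()
    return fields

# Hypothetical sample wikitext for demonstration
sample = """{{Category definition: Object
| artist = Unknown
| accession number = 29 MG 78
| institution = Musée national de la Marine
}}"""

print(parse_object_template(sample))
```

Each extracted field would then be mapped to the corresponding Wikibase datatype (text, Q-item, WbQuantity, WbTime) as described above.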

Ideally, once all the desired features have been implemented and tested, this script might be useful as a tool where one could enter the

Licence of data to import (if relevant)

The information is already on Wikimedia Commons and is common public knowledge.

Discussion


Request process

Remove statement with Gregorian date earlier than 1584 (Q26961029)

SELECT ?item ?property (YEAR(?year) as ?yr)
{
    hint:Query hint:optimizer "None".
    ?a pq:P31 wd:Q26961029 .
    ?item ?p ?a .
    ?a ?psv ?x .
    ?x wikibase:timeValue ?year .
    ?x wikibase:timePrecision 7 .
    ?x wikibase:timeCalendarModel wd:Q1985727 .
    ?property wikibase:statementValue ?psv .
    ?property wikibase:claim ?p .
}
LIMIT 15704

Try it!

The above dates have year precision and Proleptic Gregorian calendar (Q1985727) as calendar model. I think they could be converted to Julian and the qualifier statement with Gregorian date earlier than 1584 (Q26961029) removed.
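The calendar conversion itself is mechanical; here is a sketch of the proleptic Gregorian → Julian mapping via the Julian Day Number (standard integer arithmetic, independent of any bot framework — a bot would still need to rewrite the statement's calendar model and remove the qualifier):

```python
def gregorian_to_jdn(year, month, day):
    """Julian Day Number from a proleptic Gregorian date."""
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return day + (153 * m + 2) // 5 + 365 * y + y // 4 - y // 100 + y // 400 - 32045

def jdn_to_julian(jdn):
    """Julian-calendar (year, month, day) from a Julian Day Number."""
    c = jdn + 32082
    d = (4 * c + 3) // 1461
    e = c - (1461 * d) // 4
    m = (5 * e + 2) // 153
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = d - 4800 + m // 10
    return year, month, day

# The day the Gregorian calendar started: 1582-10-15 (Gregorian) = 1582-10-05 (Julian)
print(jdn_to_julian(gregorian_to_jdn(1582, 10, 15)))
```

For low-precision statements the stored day/month components would be converted the same way; only dates very close to a year boundary change their year value.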
--- Jura 09:19, 24 February 2018 (UTC)

  Support --Marsupium (talk) 23:01, 28 April 2018 (UTC)
Presumably some such statements are actually intended to be Gregorian year, no? --Yair rand (talk) 02:56, 5 September 2018 (UTC)
Sample? --- Jura 05:52, 4 December 2018 (UTC)
I checked some lunar eclipses from antiquity (a kind of event where we can compute accurate dates) and the dates were all stated in the Julian calendar and correctly entered into Wikidata. --Pyfisch (talk) 12:50, 21 December 2018 (UTC)

Crossref Journals

Request date: 27 March 2018, by: Mahdimoqri (talk · contribs · logs)

Link to discussions justifying the request
Task description
  • Add missing journals from Crossref
Licence of data to import (if relevant)
Discussion


Request process

Normalize references

Request date: 13 July 2018, by: Marsupium (talk · contribs · logs)

Link to discussions justifying the request
Task description

Often one source website or database is indicated inconsistently in various manners. To improve this situation, some queries and follow-up edits on references could be made. This is a task that would best be done continuously and gradually adapted to more cases. Thus, perhaps this task fits well in the work field of DeltaBot, User:Pasleim?

  1. Add the ID property (and if feasible also stated in (P248)) to references with a corresponding reference URL (P854) where missing.
  2. Add stated in (P248) to references with a corresponding ID property where missing.
  3. (For later: Merge references where a source website or database is used twice (accidentally).)

The issue exists for ULAN ID (P245), RKDartists ID (P650) and probably many more source websites or databases. I have examined ULAN ID (P245) and those would be the queries and edits to be done:

  1. SELECT ?entity ?prop ?ref ?id WHERE {
      ?entity ?prop ?statement.
      ?statement prov:wasDerivedFrom ?ref.
      ?ref pr:P854 ?refURL.
      MINUS { ?ref pr:P245 []. }
      FILTER REGEX(STR(?refURL), "^https?://(vocab.getty.edu/(page/)?ulan/|www.getty.edu/vow/ULANFullDisplay.*?&subjectid=)")
      BIND(REPLACE(STR(?refURL),"^https?://(vocab.getty.edu/(page/)?ulan/|www.getty.edu/vow/ULANFullDisplay.*?&subjectid=)","") AS ?id)
    }
    LIMIT 500
    
    Try it! → add ULAN ID (P245) = ?id to the reference (now >1k cases)
  2. SELECT ?entity ?prop ?ref WHERE {
      ?entity ?prop [ prov:wasDerivedFrom ?ref ].
      ?ref pr:P245 [].
      MINUS { ?ref pr:P248 wd:Q2494649. }
    }
    
    Try it! → add stated in (P248) = Union List of Artist Names (Q2494649) to the reference (now ca. 70 cases)
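The ID extraction in query 1 can be mirrored in code; a sketch using the same pattern as the SPARQL REGEX (the sample URL below is illustrative):

```python
import re

# Same URL pattern as in the SPARQL query above, with dots escaped
ULAN_URL = re.compile(
    r"^https?://(?:vocab\.getty\.edu/(?:page/)?ulan/"
    r"|www\.getty\.edu/vow/ULANFullDisplay.*?&subjectid=)(\d+)"
)

def ulan_id(ref_url):
    """Return the ULAN ID for a reference URL, or None if it doesn't match."""
    m = ULAN_URL.match(ref_url)
    return m.group(1) if m else None

print(ulan_id("http://vocab.getty.edu/page/ulan/500115493"))
```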

Thanks for any comments!

Discussion


Request process

I'm working on it myself now, see Topic:Unj4mc05g2qpj1gs for the process. Any help or advice still welcome! --Marsupium (talk) 18:20, 30 October 2018 (UTC)

elevation above sea level (P2044) values imported from ceb-Wiki

Request date: 6 September 2018, by: Ahoerstemeier (talk · contribs · logs)

Link to discussions justifying the request
  • Many items have their elevation imported from the Cebuano Wikipedia. However, the way the bot created the values is very faulty; especially due to inaccurate coordinates, the value can differ by up to 500 m! Thus most of the values are utter nonsense, and some are a rough approximation, but certainly not good data. To make things worse, the imported from Wikimedia project (P143) reference often wasn't added. For an extreme example see Knittelkar Spitze (Q1777201).
Task description

Firstly, a bot has to add all the missing imported from Wikimedia project (P143) references omitted in the original infobox harvesting. Secondly, especially for mountains and hills, the value has to be set to deprecated rank, to prevent it from poisoning our good data.
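The deprecation step can be sketched as a pure transformation over a simplified slice of the item JSON (the structure is simplified from the Wikibase data model, and Q837615 is assumed here to be the Cebuano Wikipedia item used in P143 references; a real bot would go through pywikibot):

```python
def deprecate_ceb_elevations(statements):
    """Set deprecated rank on P2044 statements whose references mark them
    as imported from the Cebuano Wikipedia (assumed Q837615).
    `statements` mirrors a simplified slice of one item's claims."""
    for st in statements.get("P2044", []):
        for ref in st.get("references", []):
            if "Q837615" in ref.get("P143", []):
                st["rank"] = "deprecated"
    return statements

item = {"P2044": [
    {"value": 2511, "rank": "normal",
     "references": [{"P143": ["Q837615"]}]},   # ceb-wiki import
    {"value": 2513, "rank": "normal",
     "references": [{"P248": ["Q1058957"]}]},  # another source, untouched
]}
print(deprecate_ceb_elevations(item)["P2044"][0]["rank"])
```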

Licence of data to import (if relevant)
Discussion


Request process

Import and maintain nominal GDP for countries from the World Bank Data API

Request date: 17 September 2018, by: WDBot (talk · contribs · logs)

Link to discussions justifying the request
Task description

A bot to load nominal GDP from the World Bank API and write it to Wikidata country items (property https://www.wikidata.org/wiki/Property:P2131).

  1. load the country information (retrieved from query.wikidata.org and copy-pasted into the script)
  2. iterate over each country
  3. check if data — nominal GDP in US dollars — is available on the World Bank; if not, go to the next country
  4. load the first value of the WB data
    1. check over all nominal GDP statements whether the value is already present
    2. skip if the value is already present
    3. write if the value is not present
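Steps 3 and 4 can be sketched against the documented shape of the World Bank API v2 JSON response (a `[metadata, data]` pair); the payload below is abbreviated and illustrative:

```python
def api_url(iso2_code, indicator="NY.GDP.MKTP.CD"):
    # World Bank API v2 endpoint pattern (nominal GDP, current US$)
    return (f"https://api.worldbank.org/v2/country/{iso2_code}"
            f"/indicator/{indicator}?format=json")

def missing_values(wb_response, existing_years):
    """Return {year: gdp} for data points not yet on the Wikidata item."""
    _meta, rows = wb_response
    return {row["date"]: row["value"]
            for row in (rows or [])
            if row["value"] is not None and row["date"] not in existing_years}

# Abbreviated sample of the API's [metadata, data] JSON structure
sample = [{"page": 1}, [
    {"date": "2017", "value": 3677439129776.6},
    {"date": "2016", "value": 3495163279143.2},
    {"date": "2015", "value": None},  # no data -> skip (step 3)
]]
print(missing_values(sample, existing_years={"2016"}))
```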

Code: [3]

You can find test edits here:

  • Bulgaria (example when there is no data): [4]
  • Germany (example with only one missing value for the year 2000): [5]
Licence of data to import (if relevant)

CC BY-4.0 - see here https://datacatalog.worldbank.org/public-licenses#cc-by

Discussion
I think the references should also have a property pointing to either the World Bank or its database, rather than only pointing to the URL. Maybe publisher (P123) or published in (P1433). --Yair rand (talk) 21:07, 17 September 2018 (UTC)
Hi Yair rand and thank you for your feedback. I have adjusted the script to write "publisher" too. Here you can see the example for France and USA on test.wikidata.org. You can see the new script here. --WDBot (talk) 20:24, 18 September 2018 (UTC)
If you need the approval for the bot, you're looking for Wikidata:Requests for permissions/Bot — regards, Revi 06:27, 19 September 2018 (UTC)
Thank you, Revi. I have now created a request. Cheers! --WDBot (talk) 18:07, 20 September 2018 (UTC)
Request process

Indiscriminately create new items

Apparently, users are looking forward to this. Per Wikidata:Requests_for_permissions/Bot/GZWDer_(flood)_4, GZWDer_(flood) was approved for this. As its operator isn't currently active, maybe someone else wants to do it. --- Jura 16:42, 20 September 2018 (UTC)

Updating templates' descriptions in Russian

Request date: 7 October 2018, by: Wikisaurus (talk · contribs · logs)

Link to discussions justifying the request

Looks obvious.

Task description

Most templates have the standard Russian description "шаблон проекта Викимедиа", per d:Q11266439, but quite a number of them have the old descriptions "шаблон в проекте Викимедиа" and "шаблон проекта Викимедия". Please replace them with the modern description. I cannot do it myself with QuickStatements because the SPARQL requests give me only a small portion of the results — if I replace LIMIT 100 with a larger limit, they time out. The SPARQL requests are below (they are a bit strange because there are some problems with Cyrillic, I believe):

SELECT ?item ?itemlab ?itemdesc WHERE {
  ?item wdt:P31 wd:Q11266439 .
  wd:Q6537516 schema:description ?wrongdesc1
  filter (lang(?wrongdesc1) = "ru") .
  OPTIONAL { ?item schema:description ?itemdesc
  filter (lang(?itemdesc) = "ru") }
  filter (?itemdesc = ?wrongdesc1)
} LIMIT 100

Try it!

SELECT ?item ?itemlab ?itemdesc WHERE {
  ?item wdt:P31 wd:Q11266439 .
  wd:Q6459244 schema:description ?wrongdesc1
  filter (lang(?wrongdesc1) = "ru") .
  OPTIONAL { ?item schema:description ?itemdesc
  filter (lang(?itemdesc) = "ru") }
  filter (?itemdesc = ?wrongdesc1)
} LIMIT 100

Try it!

Licence of data to import (if relevant)
Discussion
The query can be as simple as:
SELECT ?item ?desc {
  VALUES ?desc { "шаблон в проекте Викимедиа"@ru "шаблон проекта Викимедия"@ru } .
  ?item schema:description ?desc .
}
Try it!
Matěj Suchánek (talk) 16:47, 8 October 2018 (UTC)
It looks like magic and works in a moment. Matej, thank you! Wikisaurus (talk) 20:43, 12 October 2018 (UTC)
Request process

Add interwiki conflicts listings to talk pages

Request date: 23 October 2018, by: Yair rand (talk · contribs · logs)

Task description

I think it would be helpful if items listed at Wikidata:Interwiki conflicts had notices and links placed on their talk pages, to improve discoverability. There are over a thousand items listed, and no easy way for a user to determine whether any individual item is listed there. --Yair rand (talk) 04:43, 23 October 2018 (UTC)

Discussion


Request process


Copy lemma to F1

For lexemes without forms, could a bot copy the lemma to this form? Sample edit: https://www.wikidata.org/w/index.php?title=Lexeme:L8896&diff=772695679&oldid=772692662

Please skip any lexemes that already have forms. --- Jura 08:51, 25 October 2018 (UTC)
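The edit reduces to a small JSON construction; here is a sketch over a simplified lexeme structure (the `"add": ""` key follows the wbeditentity convention for new forms, which is an assumption to verify against the API documentation):

```python
def form_from_lemma(lexeme):
    """Build a payload for a first form (F1) copying the lemma's
    representations. `lexeme` is a simplified slice of the lexeme JSON;
    returns None for lexemes that already have forms (to be skipped)."""
    if lexeme.get("forms"):
        return None
    return {"add": "",  # assumed wbeditentity marker for a new form
            "representations": dict(lexeme["lemmas"]),
            "grammaticalFeatures": [],
            "claims": {}}

lex = {"lemmas": {"de": {"language": "de", "value": "Haus"}}, "forms": []}
print(form_from_lemma(lex))
```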


Add annual country level unemployment rate (P1198)

It would be interesting to have annual data for each country (one value per year for country items). I'm not sure what the most suitable sources are for each country.

When discussing a query with CalvinBall, I noticed that Q30#P1198 currently only has one value (for 2013). --- Jura 12:34, 26 October 2018 (UTC)

Hi Jura, we could use the WDBot to do this job. The source could be World Bank Data — here is an example for the USA: https://data.worldbank.org/indicator/SL.UEM.TOTL.ZS?locations=US. The World Bank uses ILO estimates, which have the following nice properties (check the Details button for the indicator on the WB's page):
Statistical Concept and Methodology: [...] The standard definition of unemployed persons is those individuals without work, seeking work in a recent past period, and currently available for work, including people who have lost their jobs or who have voluntarily left work. Persons who did not look for work but have arrangements for a future job are also counted as unemployed. Some unemployment is unavoidable. At any time some workers are temporarily unemployed between jobs as employers look for the right workers and workers search for better jobs. It is the labour force or the economically active portion of the population that serves as the base for this indicator, not the total population. The series is part of the ILO estimates and is harmonized to ensure comparability across countries and over time by accounting for differences in data source, scope of coverage, methodology, and other country-specific factors. The estimates are based mainly on nationally representative labor force surveys, with other sources (population censuses and nationally reported estimates) used only when no survey data are available.
If this is fine for you I would make a request for bot permission (the script is already available). Datawiki30 (talk) 15:50, 26 October 2018 (UTC)
  • Sounds good. Maybe the qualifier criterion used (P1013) could be used with an item that describes the applied methodology. Eventually, we might have numbers with different methodologies for the same year. --- Jura 16:04, 26 October 2018 (UTC)
Hi Jura, and thank you for your feedback. Do we really need the additional qualifier? Similar to the GDP, I would just use "stated in" = World Bank database and "reference URL" = https://data.worldbank.org/indicator/SL.UEM.TOTL.ZS?locations=XX where XX is the ISO code of the country. My opinion is that qualifiers trying to explain the data are too short to describe the method behind it. For new methods I would rather suggest proposing a new property (like there are different properties for total, male and female population). Cheers! Datawiki30 (talk) 16:36, 26 October 2018 (UTC)
I think it's useful. The value would just be a (new) specific item. There is no need for its label or description to include the full text. I think it's an advantage as the above numbers may be useful for cross-country comparison, but some users might be looking for just one country and expect the methodology preferred in the country. For your bot it might be possible to do all in one edit, so the additional work would be marginal. --- Jura 16:44, 26 October 2018 (UTC)
Thank you Jura for your comment. There are no technical obstacles — the bot can handle this. I suppose you mean that we could have structurally different values in the same property; for example, for country A the value from ILO could be 5% while for the same year the value from Eurostat could be 3%. Is this the case? Datawiki30 (talk) 21:29, 26 October 2018 (UTC)
  • Yes. I had in mind mainly national agencies that might have 4% instead of 10% (by whatever method), but it's the same issue. --- Jura 16:05, 2 November 2018 (UTC)
@Jura1: OK. Could you please take a look here? After a discussion in the project chat and here, I think it would be best to import only the most recent data (for example for 2017). Otherwise we could have problems with the loading time of the country pages. I would be glad to see your comment there. Cheers! Datawiki30 (talk) 19:18, 12 November 2018 (UTC)
  • That isn't really helpful for looking at the evolution. I think the property could easily hold annual data for the last 50 years. There are several other properties that have annual data. --- Jura 04:24, 13 November 2018 (UTC)

BLKÖ

Most pages in https://de.wikisource.org/wiki/Kategorie:BLK%C3%96 (27209 pages) seem to lack items (http://petscan.wmflabs.org/?psid=6382466 , currently 26641 pages).

I think it would be worth creating them, as well as an item for the person who is the subject of the article if it can't be matched with one of the existing items. --- Jura 07:43, 8 November 2018 (UTC)

Proposal

To get this started, I propose this structure for articles. It also mentions from which source each statement is imported. As I see it, besides the structure for articles, the structure for volumes and person subjects with imported data also needs to be decided. Additionally, described by source (P1343) should probably be added to new and existing person subjects. --Pyfisch (talk) 22:29, 11 December 2018 (UTC)

Article


I've made a preliminary data export. It contains all BLKÖ articles with GND, Bearbeitungsstand, etc. The articles are linked based on the stated GND, Wikipedia and Wikisource articles; if there was a conflict, multiple Q-numbers are given. I also searched for items linked to the articles and unfortunately found many that describe the person instead of the text (they will need to be split). The last four columns state the date/place of birth/death from the text. The dates vary in accuracy:
  • year-month-day, year-month, only year
  • ~ before date describes imprecise dates
  • > before describes dates stated as "nach 1804"
  • A before dates describes "Anfang/erste Tage" start of
  • E before dates describes "Ende/letzte Tage" end of
  • M before dates describes "Mitte" middle of
  • ? BLKÖ knows the person was dead but does not know when he/she died

The places will need to be manually matched to Q-items. The first column contains some metadata about the kind of page. The kinds are:

  • empty: person
  • L: list (Liste)
  • F: family, coat of arms (Wappen), genealogy (Genealogie)
  • R: cross reference
  • P: prelude
  • H: note about names and alternate spellings
  • N: corrections (Nachträge)

Each group should get a distinct is-a property. @Jura1: Do you like it? This is just for viewing, a later version will be editable to make manual changes before the import. --Pyfisch (talk) 22:14, 18 December 2018 (UTC)

  • I like the approach. BTW, there is Help:Dates that attempts to summarize how to add incomplete dates. --- Jura 14:05, 20 December 2018 (UTC)
    • editable data export. Updated the exported data. The sheet "articles" is already cleaned up. But I need help to match the ~4000 place names in the sheet "places" to Wikidata Q-Items. --Pyfisch (talk) 16:07, 22 December 2018 (UTC)

Clinical Trials

Request date: 8 November 2018, by: Mahdimoqri (talk · contribs · logs)

Link to discussions justifying the request

https://www.wikidata.org/w/index.php?title=Wikidata:Dataset_Imports/Clinical_Trials*

Task description
Licence of data to import (if relevant)
Discussion


Request process

trainer-stations

Request date: 12 November 2018, by: Fundriver (talk · contribs · logs)

Task description

Is it possible to harvest the trainer data for head coach of sports team (P6087) out of different infoboxes in the German Wikipedia? It should be pretty similar to the harvesting for member of sports team (P54) and could be done with the same syntax for different sports, because the infoboxes are similar across the different sports in the German Wikipedia (except for ice hockey): you could use trainer_tabelle in Template:Infobox Rugby Union biography (Q14373909), Template:Infobox football biography (Q5616966), Template:Infobox basketball biography (Q5831659) and Template:Infobox floorball player (Q20963207) with the same technique. You should just take care not to import data that isn't totally clear: sometimes there is a "(Co-Tr.)", "U-21" or "U21" in addition to the wikilink that needs manual oversight. But, for example, a "(Co-Tr)" could be used to refine a statement, if that is possible. Fundriver (talk) 09:52, 12 November 2018 (UTC)

Licence of data to import (if relevant)
Discussion


Request process

Cleanup VIAF dates

Task description

There are a series of imports of dates that need to be fixed, please see Topic:Un0f1g1eylmopgqu and the discussions linked there, notably Wikidata:Project_chat/Archive/2018/10#Bad_birthdays with details on how VIAF formats them. --- Jura 05:28, 14 November 2018 (UTC)

Licence of data to import (if relevant)
Discussion

Is anyone interested in working on this problem? I think it's a real issue, but it needs attention from someone who can parse the VIAF records and that's certainly not me. - PKM (talk) 21:33, 16 March 2019 (UTC)

import writers

When adding values for screenwriter (P58), I notice that frequently these persons don't have Wikidata items yet.

It would be helpful to identify a few sources for these and create corresponding items. Ideally every tv episode would have its writers included. --- Jura 15:05, 18 November 2018 (UTC)


adding data from scoresway.com

Request date: 22 November 2018, by: Amirh123 (talk · contribs · logs). Hi, please add player data from scoresway.com.

Link to discussions justifying the request
Task description
Licence of data to import (if relevant)
Discussion
@Amirh123: the license of this site doesn't seem to allow import. Cheers, VIGNERON (talk) 13:46, 12 February 2019 (UTC)

Import Schizosaccharomyces pombe protein coding genes

Request date: 6 December 2018, by: Anlock (talk · contribs · logs)

Link to discussions justifying the request
Task description

The PomBase database manually curates and maintains the coding inventory of the S. pombe genome. I would like to upload the protein coding genes of S. pombe as per this request: https://www.wikidata.org/wiki/Wikidata:Property_proposal/PomBase_systematic_ID

The dataset is located here: https://docs.google.com/spreadsheets/d/1nrFcoQJirshUYbgI8-O3sjIDUonDHM_gLClJrrm3zZY/

Licence of data to import (if relevant)

Creative Commons Attribution 4.0 International license (CC-BY)

Discussion


Request process

Fix item redirects in grammatical features and other aspects of Lexemes

Request date: 10 December 2018, by: ArthurPSmith (talk · contribs · logs)

Link to discussions justifying the request
Task description

In general if an item has been redirected in the main namespace, right now I believe User:PLbot fixes statements with that item value. However, this does not seem to be happening for grammatical features or other aspects of Lexemes - in particular when we merged Q24133704 into present participle (Q10345583) that left (thousands?) of lexeme forms with that old grammatical feature value; it would be really nice to have a bot fix this (and watch for similar things in future)!

Licence of data to import (if relevant)
Discussion

Actually it's User:KrBot that normally fixes redirects. Ideally that bot would just be updated to handle the special ways that lexemes use Wikidata items? ArthurPSmith (talk) 19:16, 10 December 2018 (UTC)

Request process

Add original title of scientific articles

There are some articles that have a title (P1476) value enclosed in square brackets. This means that the title was translated into English and the article's original title wasn't in English.

Sample: https://www.wikidata.org/w/index.php?title=Q27687073&oldid=555470366

Generally, the following should be done:

  1. deprecate existing P1476 statement
  2. add the original title with title (P1476)
  3. add the label in the original language
  4. remove [] from the English label
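Step 4 (and detecting affected items in the first place) is simple string work; a sketch:

```python
import re

# A bracketed English title, optionally followed by a trailing period
BRACKETED = re.compile(r"^\[(?P<title>.+)\]\.?$")

def strip_brackets(label):
    """Return the English label without the surrounding [] marking a
    translated title, or None if the label isn't bracketed."""
    m = BRACKETED.match(label.strip())
    return m.group("title") if m else None

print(strip_brackets("[The effect of X on Y]."))
```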

--- Jura 11:03, 11 December 2018 (UTC)

Reviews in articles

When doing checks on titles, I found some items with P31=scholarly article (Q13442814) that include an ISBN in the title (P1476) value.

Sample: Q28768784.

Ideally, these would have a statement main subject (P921) pointing to the item about the work. --- Jura 19:10, 13 December 2018 (UTC)

Discussion

@Jura1: I’ve been manually cleaning up a few of these. Some comments on the process from my perspective:

- PKM (talk) 01:33, 4 March 2019 (UTC)

    • Sure, it's possible to take this a step further. --- Jura 11:09, 10 March 2019 (UTC)

Marking as preferred the current time zone in Property:P421

Request date: 14 December 2018, by: Antenor81 (talk · contribs · logs)

Link to discussions justifying the request
Task description

The request is to mark as preferred the current time zone in Property:P421 when the property also contains old zones that are no longer applicable. This is useful because the other wiki projects are not able to import the current time zone if it does not have a higher rank.

Licence of data to import (if relevant)
Discussion
  • Now that I re-read this, is this about daylight savings time or time zones in general? You haven't specified any changes that should be done.
    • If for a country or a region, the timezone changes, you could just add an end date for the timezone and a start date for the new one. Maybe @PreferentialBot: can then set preferred rank as it does already for other properties.
    • For DST, maybe we should put this in another (new) property.
-- Jura 14:09, 20 December 2018 (UTC)

@Antenor81, T.seppelt, Nikosguard, Rachmat04, علاء: @ShinePhantom, ViscoBot, Vyom25, Liridon: fyi --- Jura 06:52, 21 December 2018 (UTC)

I mean time zone in general. For example, in this case there are two time zones, the old one (UTC+3, end date 26 mar 2016) and the new one (UTC+4, start date 26 mar 2016). I noticed in it.wikipedia that the system was not able to import the time zone from Wikidata because both time zones had the same rank (normal). In the same example, now the current time zone has a preferred rank and the old time zone has a normal rank, and now everything is working. So, if we want the other wiki-projects to import the current time zone, it should have a superior rank.--Antenor81 (talk) 07:59, 23 December 2018 (UTC)
Request process

Patronage/clientèle patronage (P3872), rank-preferred for latest year available

Request date: 1 January 2019, by: Bouzinac (talk · contribs · logs)

Link to discussions justifying the request
Task description

Update any element with P3872: if there are one or more years, set preferred rank on the latest year and normal rank on any other years present. For instance, see

This should be executed once per year (as there might be new data). Thanks a lot!
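The rank logic can be sketched as a pure function over simplified claim dicts, with the year assumed to come from a point in time (P585) qualifier:

```python
def rerank_latest(statements):
    """Set preferred rank on the statement with the latest year and
    normal rank on all others. `statements` is a simplified list of
    P3872 claims with a `year` taken from a point-in-time qualifier."""
    if not statements:
        return statements
    latest = max(st["year"] for st in statements)
    for st in statements:
        st["rank"] = "preferred" if st["year"] == latest else "normal"
    return statements

claims = [{"year": 2016, "rank": "preferred", "value": 1_000_000},
          {"year": 2018, "rank": "normal", "value": 1_200_000}]
print(rerank_latest(claims))
```

Running this once per year, after new data arrives, keeps the ranks consistent.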

Licence of data to import (if relevant)
Discussion
Request process


request for a bot to import population data

Request date: 10 January 2019, by: Histobot (talk · contribs · logs)

Link to discussions justifying the request
Task description

Import municipal population data of Dutch municipalities from the Statistics Netherlands (https://www.cbs.nl) open data portal, from the dataset https://opendata.cbs.nl/ODataApi/odata/37259ned. Specifically, add population data from 1960 to 2017 to every municipality using the population property. This will facilitate the use of reliable and consistent population data in other projects. Look for instance at the municipality of Zwolle and its population data. I will try to add this data using OpenRefine.
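A sketch of the request and the filtering (the `TypedDataSet` entity and the field names below are assumptions about the 37259ned dataset and should be verified against the service; the sample rows are illustrative, not real CBS values):

```python
def odata_url(dataset="37259ned", entity="TypedDataSet"):
    # CBS OData endpoint pattern; the entity name is an assumption
    return f"https://opendata.cbs.nl/ODataApi/odata/{dataset}/{entity}?$format=json"

def population_by_year(rows, municipality_code):
    """Map year -> population for one municipality (field names assumed)."""
    return {row["Perioden"][:4]: row["BevolkingOp1Januari_1"]
            for row in rows if row["RegioS"] == municipality_code}

sample_rows = [  # illustrative shape, not real CBS values
    {"RegioS": "GM0193", "Perioden": "2016JJ00", "BevolkingOp1Januari_1": 124896},
    {"RegioS": "GM0193", "Perioden": "2017JJ00", "BevolkingOp1Januari_1": 125806},
    {"RegioS": "GM0014", "Perioden": "2017JJ00", "BevolkingOp1Januari_1": 202810},
]
print(population_by_year(sample_rows, "GM0193"))
```

The resulting year/value pairs would then be fed to OpenRefine or QuickStatements as population statements with point-in-time qualifiers.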

Licence of data to import (if relevant)

CC BY 4.0

Discussion

Histobot (talk) 15:56, 10 January 2019 (UTC)

Request process

Auto-adding complementary values

Request date: 11 January 2019, by: Jc86035 (talk · contribs · logs)

Link to discussions justifying the request
Task description

There should be a bot to add complementary values for Genius artist ID (P2373), Genius album ID (P6217) and Genius song ID (P6218). For all Genius artist ID (P2373) values without Genius artist numeric ID (P6351), the bot should add the first match of regex \{"name":"artist_id","values":["(\d+)" in the linked page, and vice versa with the first match of regex "slug":"([0-9A-Z][0-9a-z-]*[0-9a-z]|[0-9A-Z])". Preferred and deprecated ranks should be inferred when adding new values, although if multiple statements to be added have the same value but different rank then only the statement with the higher rank should be used. The values should be periodically checked to see if they match, and errors should be reported somewhere (probably on-wiki). The same should also be implemented for the other two pairs of properties, Genius album ID (P6217)/Genius artist numeric ID (P6351) and Genius song ID (P6218)/Genius song numeric ID (P6361).
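The two regexes from the request, applied to a fabricated page snippet (the snippet is illustrative only; the patterns are the ones given above):

```python
import re

# Patterns exactly as specified in the request
NUMERIC_ID = re.compile(r'\{"name":"artist_id","values":\["(\d+)"')
SLUG = re.compile(r'"slug":"([0-9A-Z][0-9a-z-]*[0-9a-z]|[0-9A-Z])"')

# Fabricated sample of the kind of embedded JSON found in a linked page
page = '... {"name":"artist_id","values":["16775"] ... "slug":"Andrew-huang" ...'

numeric = NUMERIC_ID.search(page)
slug = SLUG.search(page)
print(numeric.group(1), slug.group(1))
```

The first match of each pattern would become the complementary statement value, with ranks inferred as described above.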

Licence of data to import (if relevant)

N/A (presumed not copyrightable)

Discussion

All of the properties now exist. Jc86035 (talk) 10:57, 15 January 2019 (UTC)

Request process

Adding main subject (P921) to scholarly articles based on relevant keywords in the title and description

Request date: 17 January 2019, by: Thibdx (talk · contribs · logs)

Task description

The goal of this bot is to add main subject (P921) to scholarly articles.

The metadata of scholarly articles is quite hard to maintain by hand because the rate of creation of these articles exceeds the capacity of the community to generate data, so automation would be a great help.

In many cases, finding a specific keyword in a scholarly article makes it obvious that it is a main subject of the article. This is the case for most technical terms that do not have a double meaning.

For example:

A list of such pairs could be stored in a protected wikipage. For each keyword, the bot would search the scholarly articles and add the related main subject (P921) statement if the keyword is in the title. Of course, each pair would have to be tested first to ensure data consistency.

Human-readable algorithm
Wikidata:WikiProject Materials/ScholarTopicsBot
Getting the work done

If an experienced dev thinks this could be one of their priorities, I would be glad to hand it over to them. If not, I can try to do it myself. I'm not a dev at all; the only thing I have done so far is modify some scripts. It would be helpful if you could point me to the following examples:

  • A bot that extracts content from a wikipage
  • A bot that lists QIDs using a request
  • A bot that adds statements to items
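The matching core of such a bot can be sketched as a pure function (the keyword/item pairs below are hypothetical examples of what the protected wikipage would hold):

```python
import re

def subjects_for_title(title, keyword_map):
    """Return the QIDs whose keyword occurs as a whole word in the title
    (case-insensitive). `keyword_map` maps keyword -> main subject item."""
    found = []
    for keyword, qid in keyword_map.items():
        if re.search(r"\b" + re.escape(keyword) + r"\b", title, re.I):
            found.append(qid)
    return found

# Hypothetical keyword/item pairs
keywords = {"graphene": "Q83471", "perovskite": "Q899778"}
print(subjects_for_title("Electronic properties of graphene bilayers", keywords))
```

The bot would then add one main subject (P921) statement per returned item, after the pair has been vetted.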

Regards

Discussion
Request process

Fix author ranks on The Morphology of Steve (Q50422077)

It seems that most people listed there weren't actually authors (beyond #3, if I recall the footnote in the paper correctly). Accordingly ranks of "author" and "author name" statements should be set to deprecated rank. @Daniel Mietchen: fyi --- Jura 14:49, 3 February 2019 (UTC)

  Done Special:Diff/858789102. Matěj Suchánek (talk) 14:45, 15 February 2019 (UTC)

Import Treccani IDsEdit

Request date: 6 February 2019, by: Epìdosis (talkcontribslogs)

Task description

At the moment we have four identifiers referring to http://www.treccani.it/: Dizionario biografico degli italiani Identifier (P1986), Treccani ID (P3365), Enciclopedia Italiana ID (P4223), Dizionario di Storia Treccani ID (P6404). Each article of these works has, in the right-hand column "ALTRI RISULTATI PER", links to the articles on the same topic in the other works (e.g. Ugolino della Gherardesca (Q706003) has Treccani ID (P3365) conte-ugolino, and http://www.treccani.it/enciclopedia/conte-ugolino/ also links to the Enciclopedia Italiana (Enciclopedia Italiana ID (P4223)) and the Dizionario di Storia (Dizionario di Storia Treccani ID (P6404))). These cases are extremely frequent: many items have Dizionario biografico degli italiani Identifier (P1986) but not Treccani ID (P3365)/Enciclopedia Italiana ID (P4223); others have Treccani ID (P3365) but not Enciclopedia Italiana ID (P4223); nearly no item has the recently created Dizionario di Storia Treccani ID (P6404).

My request is: check each value of these identifiers in order to obtain values for the other three identifiers through the "ALTRI RISULTATI PER" column.
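One way a bot could sort the "ALTRI RISULTATI PER" links into the four properties is by the parenthetical suffix in the URL slug. The exact suffix strings below are assumptions based on typical treccani.it URLs and would need to be verified against the live site:

```python
import re

# Map URL-slug suffixes to Wikidata properties. The suffix strings are
# assumptions and should be double-checked against real treccani.it URLs.
SUFFIX_TO_PROPERTY = {
    "(Dizionario-Biografico)": "P1986",
    "(Enciclopedia-Italiana)": "P4223",
    "(Dizionario-di-Storia)": "P6404",
}

def classify_link(url: str):
    """Return (property, slug) for a treccani.it article URL, or None."""
    m = re.search(r"treccani\.it/enciclopedia/([^/]+)/?", url)
    if not m:
        return None
    slug = m.group(1)
    for suffix, prop in SUFFIX_TO_PROPERTY.items():
        if slug.endswith(suffix):
            return prop, slug
    return "P3365", slug  # no suffix: plain Treccani ID entry
```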

Discussion

Import alumni based on Wikipedia categoriesEdit

Request date: 10 February 2019, by: GerardM (talkcontribslogs)

Task description

Categories with "category contains" "Human" and "Educated at" "Whatever institution" are to be used to include education information in Wikidata. For many universities these categories have been initially imported manually.
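The candidate categories could be found with a SPARQL query along these lines (the property and item IDs — P4224 "category contains", P69 "educated at", Q4167836 "Wikimedia category" — are believed correct but should be verified before running anything):

```python
def category_query(institution_qid: str) -> str:
    """Build a SPARQL query for categories of humans educated at one institution."""
    return f"""
SELECT ?category WHERE {{
  ?category wdt:P31 wd:Q4167836 .   # instance of: Wikimedia category
  ?category p:P4224 ?st .
  ?st ps:P4224 wd:Q5 ;              # category contains: human
      pq:P69 wd:{institution_qid} . # qualifier: educated at
}}
"""
```

The bot would then add educated at (P69) to the items of the humans found in each matching category's sitelinked Wikipedia category.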

Licence of data to import (if relevant)
Discussion


Request process
  • I want to revive my RobotGMwikt profile to run this. Thanks, GerardM (talk) 16:36, 10 February 2019 (UTC)
    So do. Matěj Suchánek (talk) 14:34, 15 February 2019 (UTC)

Golf video gamesEdit

Request date: 13 February 2019, by: Trade (talkcontribslogs)

Link to discussions justifying the request
Task description

Can someone please add the video game genre golf video game (Q60256879) to all video games in Category:Golf video games (Q8494058)? Trade (talk) 18:28, 13 February 2019 (UTC)

Licence of data to import (if relevant)
Discussion


Request process

Fuzhou Architecture HeritageEdit

Request date: 26 February 2019, by: Davidzdh (talkcontribslogs)

Link to discussions justifying the request
Task description
Licence of data to import (if relevant)
Discussion

Can anyone help me to import these data? Thank you.- I am Davidzdh. 17:31, 26 February 2019 (UTC)

Hi, you're welcome! I see you have already created a Google Spreadsheet. Can you rearrange the data so the format matches this spreadsheet? Then you can export the spreadsheet as CSV, upload it to QuickStatements and import the data yourself. Please tell me if you encounter any problems. --Pyfisch (talk) 10:19, 2 March 2019 (UTC)
Request process

wiktionaryEdit

Hi, please add Wiktionary links to Wikidata items; for example, iran does not have any Wiktionary links. Amirh123 (talk) 19:06, 26 February 2019 (UTC)

Wiktionary handles interwiki differently. Matěj Suchánek (talk) 19:37, 26 February 2019 (UTC)

Update P373 in several elementsEdit

Request date: 14 March 2019, by: Syrio (talkcontribslogs)

Link to discussions justifying the request

No discussion on this specific task (consensus for the moves was reached on Commons, of course), but it's just maintenance.

Task description

Hello! Recently, the categories in commons:Category:Churches in the Roman Catholic Archdiocese of Trento have been moved to a new naming standard. Each category is tied to a Wikidata item whose Property:P373 needs to be updated following the move (it should simply match the category's new name; a few of them already do, but most don't). Is it possible to do this via bot? If so, do I need to set up anything else (e.g. a list of the Wikidata items)? Thank you, --Syrio posso aiutare? 11:26, 14 March 2019 (UTC)
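Once the affected items are listed (e.g. via a PetScan or SPARQL run over the moved categories), the edit itself is a simple comparison. A minimal sketch, assuming each entry is a (QID, current P373 value, new category title without the "Category:" prefix) triple and emitting QuickStatements-style rows:

```python
def p373_updates(items):
    """Yield QuickStatements-style edits for items whose P373 value
    no longer matches the moved Commons category title.
    Input shape is an illustrative assumption, not a real API."""
    for qid, current, new in items:
        if current != new:
            yield f'{qid}\tP373\t"{new}"'
```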

Licence of data to import (if relevant)
Discussion


Request process

HumansEdit

The English-language description of female (Q6581072) is "human who is female (use with P21)", for male (Q6581097) it is "human who is male (use with P21)" (emphasis added). The equivalent also appears to be true in many other languages.

Therefore any source that supports the use of one of those items as a value for sex or gender (P21) logically also supports the use of human (Q5) as the value for instance of (P31).

A bot should be used to copy references from P21 to P31, in such cases, where no existing reference for P31 is found. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:17, 14 March 2019 (UTC)
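The selection logic can be sketched as a pure function. The dict-based claim shape below is an illustrative assumption, not the real pywikibot data model; a real bot would work with `Claim` objects:

```python
FEMALE, MALE = "Q6581072", "Q6581097"

def references_to_copy(item):
    """Return the P21 references to copy onto an unreferenced P31=Q5 claim.
    `item` maps property IDs to claim lists; each claim is a dict with
    'value' and 'references' keys (illustrative shape only)."""
    p31_human = [c for c in item.get("P31", []) if c["value"] == "Q5"]
    p21_sexed = [c for c in item.get("P21", []) if c["value"] in (FEMALE, MALE)]
    # Skip items without P31=Q5, or where P31 is already referenced.
    if not p31_human or any(c["references"] for c in p31_human):
        return []
    refs = []
    for claim in p21_sexed:
        refs.extend(claim["references"])
    return refs
```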

  Oppose, original research. Sjoerd de Bruin (talk) 21:40, 16 March 2019 (UTC)
There is no original research proposed. Feel free to make a case, rather than an unsupported assertion, if you believe otherwise. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:38, 17 March 2019 (UTC)
That would cause problems with fictional characters, where sex or gender (P21) is also used. Fictional human characters use fictional human (Q15632617), not human (Q5) as values for instance of (P31). And IIRC separate items for sexes/genders of humans and organisms, e.g. male (Q6581097) and male organism (Q44148), were only created because some languages have different terms for male/female persons and animals. Unfortunately, it gets a bit complicated with non-human persons in fiction. Gandalf (Q177499) for example isn't human, but using male organism (Q44148) doesn't seem a good solution since that would apparently result in him being described with terms referring to animals in some languages. Therefore, most humanoid fictional characters use male (Q6581097)/female (Q6581072) at the moment. --Kam Solusar (talk) 22:06, 16 March 2019 (UTC)
It would not, because the proposal is restricted to cases where P31=Q5. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:38, 17 March 2019 (UTC)
Ah, my bad. Seems I totally misread your proposal and my mind immediately jumped somewhere else. :-/ --Kam Solusar (talk) 23:16, 17 March 2019 (UTC)
  Support per nom. --Tagishsimon (talk) 22:20, 17 March 2019 (UTC)

The SimpsonsEdit

Request date: 17 March 2019, by: Patriccck (talkcontribslogs)

Link to discussions justifying the request

Hi, can someone please change all descriptions epizoda seriálu Simpsonovi to díl seriálu Simpsonovi? It is a goal of the Czech WikiProject TV. (After doing this, please ping me.) --Patriccck (talk) 12:42, 17 March 2019 (UTC)

Licence of data to import (if relevant)
Discussion
  • You could use QuickStatements to do so. --- Jura 12:49, 17 March 2019 (UTC)
    • @Jura: I don't understand it. I don't know HTML. Can you do it? --Patriccck (talk) 17:05, 17 March 2019 (UTC)
  • Nothing to do with HTML. See QuickStatements (Q20084080). Input would be
Q277882	Dcs	"díl seriálu Simpsonovi"

Note the tab in the source. A list of episodes is at User:Trade/Episodes_of_The_Simpsons. --- Jura 19:22, 17 March 2019 (UTC)
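For a long episode list, the tab-separated input rows can be generated mechanically. A minimal sketch (the single QID below is the example from this thread; the full list is at User:Trade/Episodes_of_The_Simpsons):

```python
def description_rows(qids, description='díl seriálu Simpsonovi'):
    """Build QuickStatements rows setting the Czech ("Dcs") description."""
    return [f'{qid}\tDcs\t"{description}"' for qid in qids]
```

The resulting lines are pasted straight into QuickStatements.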

    • @Jura: I'm sorry, but I still don't understand how to do it. I don't know about this. Can someone else please do that? --Patriccck (talk) 20:54, 17 March 2019 (UTC)
Request process