
Welcome to Wikidata, Bargioni!

Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike and you can go to any item page now and add to this ever-growing database!

Need some help getting started? Here are some pages you can familiarize yourself with:

  • Introduction – An introduction to the project.
  • Wikidata tours – Interactive tutorials to show you how Wikidata works.
  • Community portal – The portal for community members.
  • User options – including the 'Babel' extension, to set your language preferences.
  • Contents – The main help page for editing and using the site.
  • Project chat – Discussions about the project.
  • Tools – A collection of user-developed tools to allow for easier completion of some tasks.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date.

If you have any questions, please ask me on my talk page. If you want to try out editing, you can use the sandbox to try. Once again, welcome, and I hope you quickly feel comfortable here, and become an active editor for Wikidata.

Best regards! --Epìdosis 13:41, 23 April 2019 (UTC)

Problematic clusters

@Epìdosis: Obviously we are carrying over the errors of VIAF itself. Identifying them (only by eye, I fear) will help VIAF correct them. Thank you for your support. Next imports no earlier than Friday. --Bargioni (talk) 21:30, 12 November 2019 (UTC)

Great work with VIAF!--Alexmar983 (talk) 02:35, 15 November 2019 (UTC)

@Alexmar983: Thanks. But we are only at the beginning, about one tenth of the work. Today I'll continue with the imports.
Because of VIAF's clustering process, and above all because of the data it harvests from Wikidata, this alignment will have to be repeated. I should write down somewhere how it is done. What is the most suitable place? --Bargioni (talk) 09:18, 15 November 2019 (UTC)
I'd say some thematic project on identifiers, not one specific to a country. User:Epìdosis, today is the first day of WSC2019 in hundreds of countries; could you point out the best one? --Alexmar983 (talk) 11:28, 15 November 2019 (UTC)
There are several alternatives: Wikidata talk:Identifiers would actually serve for general discussions about identifiers, so it is not suitable; Wikidata talk:WikiProject Properties is too generic; Wikidata talk:WikiProject Biographical Identifiers is too specific (VIAF is by no means only biographical!); Wikidata:Project chat is far too generic; in conclusion, I think the best choice is Property talk:P214. --Epìdosis 13:51, 15 November 2019 (UTC)
Yes, but given how crucial this identifier is, I would add a pointer from the identifiers project if nobody shows up after a few days.--Alexmar983 (talk) 17:45, 15 November 2019 (UTC)
True. The best thing is to write on the identifier's talk page and then post notices elsewhere. --Epìdosis 17:49, 15 November 2019 (UTC)

VIAF confusion

Are you actually checking VIAF records before adding them? I already reverted your addition of VIAF 110615942 to Q1173887 and I said why in the edit summary. That VIAF record is linked to a mixture of records for the person described by Q1173887 and a similarly named person Q73178408. This situation is not uncommon. It is bad enough that people add these without checking them, but when it's been reverted by someone who takes the time to explain the problem, and the same editor just puts it back, this is even worse. All we are doing is pointlessly reproducing VIAF's errors. Iepuri (talk) 11:38, 16 November 2019 (UTC)

@Iepuri: Hi! Thank you for your observation. At the moment Bargioni is just importing all the cases in which a VIAF cluster links to Wikidata, as I suggested to him here; the next step will be to find problematic cases and list them, in order to send a report to VIAF. The old en:Wikipedia:VIAF/errors is no longer used (but probably contains a lot of still useful reports), because VIAF prefers to receive reports at bibchange@oclc.org; I think, however, it would be useful to create a new page for reports here on Wikidata (the title may be Wikidata:WikiProject Authority control/VIAF errors) in order to have a collective list of problematic clusters - this page would also be useful for progressively checking whether the reports contained in the old en.wikipedia page have been addressed or not. What's your opinion? Bye, --Epìdosis 12:11, 16 November 2019 (UTC)
@Iepuri: Thank you for your correction(s). I know that while adding a lot of VIAF ids, I'm also importing (a very small number of) VIAF errors. Their clustering process may create problems that, in my opinion, can only be resolved by humans. --Bargioni (talk) 15:02, 16 November 2019 (UTC)
And Q34351913, Monastero di San Vittore (Milan, Italy) (Q30063100), Q30134887. --Yiyi .... (talk!) 20:21, 18 November 2019 (UTC)
I'm trying to follow the data import in order to clean up potential errors, but we should define a structured way to report such errors to VIAF and its sources. For example, viaf/246353202 and viaf/313420111 should refer to the same church Q24034979. The error comes from two different GND items, 7719397-0 and 1065019327, which should be merged. In VIAF, instead, they are kept disjoint and have different links to the Wikidata items for the church and for the village where the church is located. Another example is a church named after San Nicola that in VIAF merges sources related to very different places. How should we manage such a process? Pietro (talk) 17:53, 20 November 2019 (UTC)

VIAF

Please pay a bit more attention. Do you really believe the Hungarian handball player is the same as the Italian physician? -- Marcus Cyron (talk) 17:06, 21 November 2019 (UTC)

@Marcus Cyron: Hi, modification of Gabriella Landi (Q57981021) was applied by a batch job that is part of a large project: adding about 570,000 VIAF ID (P214) to Wikidata items, using data from Virtual International Authority File (Q54919). This means that unfortunately we also import some wrong associations introduced by VIAF itself. So, sorry. And, please, feel free to modify, delete or deprecate this new value. --Bargioni (talk) 17:52, 21 November 2019 (UTC)
Since yesterday I reverted around 90% of the additions in the field of badminton players (not clubs), I suggest not proceeding with this project, or reverting all batches and letting users re-add the few correct entries. People look at Wikidata as a very reliable database, but now we are putting a lot of wrong data into it, making incorrect databases seem more reliable. I did not check professions other than the badminton-related ones, but if the situation there looks similar, some action must be taken. Probably too many entries are not on watchlists either, so the wrong data will remain here for a long time and be used as incorrect input for other databases (@Marcus Cyron:) Another solution would be for you, Bargioni, to check every line you added yourself against independent sources, and revert every unsourced entry yourself. Florentyna (talk) 06:15, 22 November 2019 (UTC)
Amen. The decision to knowingly import false data is strange and, in my opinion, wrong. -- Marcus Cyron (talk) 07:02, 22 November 2019 (UTC)
@Florentyna, Marcus Cyron: In my opinion we should first do the synchronisation and then check the data: I checked about one hundred added identifiers in the field of ancient Greek and Roman authors and they are nearly always correct, so the correctness probably depends on the field. Simply ignoring the VIAF is not the right way: we should import the data, then collect in one page wrong data and report them to VIAF, in order to have them corrected. --Epìdosis 08:55, 22 November 2019 (UTC)
@Florentyna: If I'm not wrong, badminton players with a VIAF ID (P214) can be selected using
select ?Q ?viaf where { ?Q wdt:P214 ?viaf ; wdt:P106 wd:Q13141064 . }
(result is 225). We can thus delete VIAF ids from items of badminton players in a simple way. Of course, if a badminton player is also an author, we would like to preserve his/her VIAF ID (P214).
As @Epìdosis: noticed, it is also interesting to gather VIAF errors imported by my batches: we could help the clusterization process of VIAF, that is as hard as any other reconciliation process. --Bargioni (talk) 10:38, 22 November 2019 (UTC)
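The caveat in the previous message (preserve the VIAF ID of any badminton player who is also an author) could be automated before deleting anything. A minimal Python sketch, with illustrative field names and sample rows rather than actual query output:

```python
# Hypothetical sketch: given rows shaped like the SPARQL results above plus
# each item's occupations (P106), keep a deletion only when the item's sole
# occupation is badminton player (Q13141064).

BADMINTON_PLAYER = "Q13141064"

def safe_to_remove(rows):
    """rows: iterable of (qid, viaf, occupations); returns (qid, viaf) pairs
    whose VIAF ID can be removed without losing data about authors."""
    removals = []
    for qid, viaf, occupations in rows:
        if set(occupations) == {BADMINTON_PLAYER}:
            removals.append((qid, viaf))
    return removals

rows = [
    ("Q1", "12345", ["Q13141064"]),            # only a badminton player
    ("Q2", "67890", ["Q13141064", "Q36180"]),  # also a writer: keep the ID
]
print(safe_to_remove(rows))  # [('Q1', '12345')]
```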
Exactly; I was going to suggest that query. I've just created Wikidata:WikiProject Authority control/VIAF errors, where all errors should be listed in order to help VIAF with the clustering. The import will finish in a few hours, then we will start with some tidy-up: I already have two ideas for removing wrong clusters ;-) --Epìdosis 10:43, 22 November 2019 (UTC)
"We can thus delete VIAF ids from items of badminton players in a simple way" If you do that, you will also be deleting some good data. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:33, 6 December 2019 (UTC)
I'm still cleaning up bad matches from this batch. Example: [1]. What are you doing to reduce this burden on me and other volunteers? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:32, 6 December 2019 (UTC)
@Pigsonthewing: Problematic cases are being listed at Wikidata:WikiProject Authority control/VIAF errors; when you find errors, please add them there in order to report them to VIAF. --Epìdosis 11:04, 6 December 2019 (UTC)
We tried that on Wikipedia: en:Wikipedia:VIAF/errors#Status of this page. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:03, 6 December 2019 (UTC)

VIAF confusion (2)

Hi! I have reverted your VIAF link after checking that it does not refer to the same cave, even though it has the same name. Although both caves are located in the Region of Murcia, the cave referred to in Q8352480 is located in Cartagena, while the one your link refers to is in Pliego. Greetings, P4K1T0 (talk) 10:56, 22 November 2019 (UTC).


It seems to me that adding these VIAF statements in this manner (automatically or semi-automatically) is generally a bad idea. I've checked some identifiers added to places in Estonia and very often the problem is that the VIAF entry is rather vague or mixes different entities, so that it's rather impossible to tell which Wikidata item it should actually match. E.g. this VIAF entry is currently associated with a settlement, but names like "Tapa Region" suggest that it rather matches something else. This something else may be a rural municipality, but I don't really know, as the VIAF entry provides pretty much no clear context. Or, this VIAF entity is currently associated with a settlement, but some alternative names suggest that it may match a parish instead. Or, for example, there are three settlements entitled "Kurtna" in Estonia and it's unclear which of these the VIAF entry is about and why it's associated with this Kurtna. 2001:7D0:81F7:B580:F1CB:428E:AC85:CC4A 11:00, 22 November 2019 (UTC)

Batch too coarse

I think your huge VIAF batch was too coarse. Please see here. Asaf Bartov (talk) 13:07, 22 November 2019 (UTC)

I answered on the relevant page. --Epìdosis 13:42, 22 November 2019 (UTC)

Synchronisation complete - new challenges

Dear friend, the 23 batches are finally finished! I announced the completed synchronisation at Property talk:P214#Recent synchronisation, inviting people above all to use the new page Wikidata:WikiProject Authority control/VIAF errors to report confused clusters. Let me leave you a few ideas I have for further refining our connections with VIAF:

  1. remove all VIAF ID (P214) values linking to clusters that contain only "undifferentiated" and/or "sparse" IDs (I don't know whether they can easily be spotted in the dumps) - they are probably not many, but they should undoubtedly be removed;
  2. use the fields to build lists of possibly wrong identifiers: e.g. list the items without instance of (P31) human (Q5) whose VIAF ID (P214) points to a cluster in the "Personal Names" field, or the items with sport (P641) whose VIAF ID (P214) points to a cluster of the "Geographic Names" type ... and so on; we can discuss which lists to try to obtain (here too, I don't know whether the field a cluster belongs to can easily be determined from the dumps);
  3. start examining the items with instance of (P31) human (Q5) that have more than one VIAF ID (P214), one of which was added in this import; I have already prepared a rough list of about 16,000 items, but I think it can be filtered further; I need to think about it. This last point requires a manual check, whereas the first is automatic and the second can be nearly automatic.

In short, now the fun begins: the hunt for errors! At the same time it is important that VIAF notices the page Wikidata:WikiProject Authority control/VIAF errors as soon as possible, so that it starts taking those reports into account (along with the others that will be added in the coming days, weeks and months). In the meantime, congratulations again on the excellent work! I'm sorry it earned you some criticism, in my opinion undeserved (obviously any import of this kind contains a certain percentage of errors, but the only possible solution is a manual, and where possible semi-automatic, check afterwards, like the one we are now going to carry out, not rejecting the import and pretending nothing happened), but I am sure most of the community will greatly appreciate this huge piece of work. Thanks again and talk to you in the coming days! Good night :) --Epìdosis 00:05, 23 November 2019 (UTC)

@Epìdosis: Thank you for the initial idea, for the checks during the batches and for the support against the criticism. Indeed a complex phase begins now, which besides what you rightly say includes, I'd say, tracking merges and new links between Q items and VIAF. I presume we should exploit the differences between the files "http://viaf.org/viaf/data/viaf-YYYYMMDD-persist-rdf.xml.gz" and the files "http://viaf.org/viaf/data/viaf-YYYYMMDD-links.txt.gz" month by month. This way we should avoid repeating operations already done and, above all, re-importing the wrong VIAF IDs. And as for what you write above, it seems to me you are proposing an analysis that, lacking a way to search the VIAF data, requires the Wikidata data and the VIAF cluster data on the same machine. If I'm wrong, tell me. If I'm right, it is quite an undertaking :-) , given the number of clusters and hence the need for a very high-performance system. I'll think about it. --Bargioni (talk) 11:20, 23 November 2019 (UTC)
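The month-over-month comparison of the links dumps could be sketched as below. The line format assumed here ("VIAF cluster URI, tab, SOURCE|identifier") is an assumption that should be verified against a real dump before use:

```python
# Hypothetical sketch: diff two monthly VIAF links files to find which
# Wikidata (WKP) links were added or dropped, so already-processed links
# are not re-imported.

def wkp_links(lines):
    """Extract (viaf_cluster, wikidata_qid) pairs for the WKP source,
    assuming lines like 'http://viaf.org/viaf/<cluster>\tWKP|Q42'."""
    pairs = set()
    for line in lines:
        cluster, _, source_id = line.strip().partition("\t")
        source, _, ident = source_id.partition("|")
        if source == "WKP":
            pairs.add((cluster.rsplit("/", 1)[-1], ident))
    return pairs

old = ["http://viaf.org/viaf/100\tWKP|Q1", "http://viaf.org/viaf/200\tLC|n123"]
new = ["http://viaf.org/viaf/100\tWKP|Q1", "http://viaf.org/viaf/300\tWKP|Q3"]

added = wkp_links(new) - wkp_links(old)    # links to import this month
removed = wkp_links(old) - wkp_links(new)  # links VIAF dropped or merged away
print(added, removed)
```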
@Epìdosis: The cases of clusters built only on sparse or only on undifferentiated sources can be extracted from the dump file http://viaf.org/viaf/data/viaf-20191104-clusters-rdf.xml.gz, with a dedicated filter that also checks for the presence of the WKP (Wikidata) source:

curl -s 'http://viaf.org/viaf/data/viaf-20191104-clusters-rdf.xml.gz' | gzip -cd | grep -E 'sparse|undifferentiated' | filtro
The filter (`filtro`) still has to be written, but no promises :-) The command will in any case be very slow, given the size of the .gz file (6 GB?)
Similarly, one could build a  VIAF | type | Q  list that would allow working on the type-2 cases. But it would have a great many rows, so I wouldn't know how to use it in a SPARQL query. --Bargioni (talk) 10:58, 25 November 2019 (UTC)
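One common workaround for a list too long for a single SPARQL query is to split it into chunks and issue one query per chunk via a VALUES block. A minimal sketch, with illustrative chunk size and query shape:

```python
# Hypothetical sketch: turn a long list of VIAF ids into several small
# SPARQL queries, each with its own VALUES block.

def chunked_queries(viaf_ids, size=500):
    """Return one SELECT query per chunk of `size` VIAF ids."""
    queries = []
    for i in range(0, len(viaf_ids), size):
        chunk = viaf_ids[i:i + size]
        values = " ".join('"%s"' % v for v in chunk)
        queries.append(
            "SELECT ?q ?viaf WHERE { ?q wdt:P214 ?viaf . VALUES ?viaf { %s } }"
            % values
        )
    return queries

qs = chunked_queries(["1001", "1002", "1003"], size=2)
print(len(qs))  # 2
```

Each query can then be run against the Wikidata Query Service separately, staying well under its limits.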

Happy New Year!

Hi! I wanted to stop by to thank you for your latest batch adding Pontificia Università della Santa Croce ID (P5739) and to wish you a happy 2020! I am sure we will be able to continue in the best possible way the work we started in recent weeks. See you soon, --Epìdosis 13:11, 31 December 2019 (UTC) P.S. I just found a VIAF redirect; if and when you have time we could consider a new clean-up pass for redirected or deleted VIAF ID (P214) values

@Epìdosis: Just as well: you wouldn't have found me, I'm on holiday until the 6th. But it would have been a great pleasure, so it shall be done.
I got useful material from the library manager of the library related to Angelicum ID (P5731), so I launched the last batch.
As for ongoing maintenance of VIAF ID (P214) based on the VIAF project's monthly dumps, it is certainly worth stabilising the procedures. We should also think about the excessive gap between the number of VIAF clusters and the number of Wikidata items with VIAF ID (P214). That is, I'd say Wikidata is missing a great many authors. Happy New Year. --Bargioni (talk) 13:42, 31 December 2019 (UTC)
Hi, I wanted to tell you that I've just noticed that the VIAF update (fixing redirected clusters and removing deleted clusters), as well as the ISNI and NLA ones, is performed periodically by KrBot, so I don't think a pass of ours is needed for this aspect; the last pass was on 10 January. Talk to you in the coming days. Have a good Sunday, --Epìdosis 12:06, 12 January 2020 (UTC)
@Epìdosis: Thanks, very useful information. We can work on LCNAF and above all GND: I found a way. But maybe it should be discussed with others, see Property_talk:P227#(careful)_import_from_VIAF?. --Bargioni (talk) 14:26, 12 January 2020 (UTC)

Incorrect VIAF on Q60527326

Good morning, I have reverted your VIAF changes on Office of Inspector General, Export-Import Bank of the United States (Q60527326) because that VIAF record refers to the parent organization. William Graham (talk) 14:49, 17 January 2020 (UTC)

@William Graham: Thanks a lot. Errors like this come from VIAF, unfortunately. We have to help them improve their clusters. So please add the item Office of Inspector General, Export-Import Bank of the United States (Q60527326) to the page Wikidata:WikiProject Authority control/VIAF errors. --Bargioni (talk) 16:00, 17 January 2020 (UTC)
@Bargioni: I can do that. However, I want to note that even though you are using an external data source and semi-automated tooling, you are still responsible for the accuracy of the data you add to Wikidata and have a responsibility to manually vet your changes and/or discontinue using your tools if you are knowingly inserting incorrect data. Thank you and have a nice day. William Graham (talk) 16:14, 17 January 2020 (UTC)
@William Graham: The edit was correct, I restored it: https://viaf.org/viaf/154029415/ refers to Office of Inspector General, Export-Import Bank of the United States (Q60527326) and https://viaf.org/viaf/130175936/ refers to Export-Import Bank of the United States (Q1384697). Bye, --Epìdosis 17:07, 17 January 2020 (UTC)
@William Graham: Sorry for bothering you once again... my first reply was wrong. Thanks to Epìdosis, I can say that my batch job didn't generate any error. As you know, the VIAF project uses clusters. In VIAF, Q60527326 was associated with cluster https://viaf.org/viaf/288007377, and now it is part of cluster https://viaf.org/viaf/154029415/. So, if you click on the current (let's say, your) P214 in Q60527326, you will be redirected by VIAF to the cluster identified by the second value. This means, in my opinion, that what you dislike is independent of my batch job, since both values now link to the same VIAF cluster. --Bargioni (talk) 17:16, 17 January 2020 (UTC)
@Bargioni: Sorry for my confusion, need my coffee before editing in the morning. :) William Graham (talk) 17:17, 17 January 2020 (UTC)
@William Graham: Me too :-) Do not add Q60527326 to the page Wikidata:WikiProject Authority control/VIAF errors. Bye! --Bargioni (talk) 21:55, 17 January 2020 (UTC)

VIAF - geographic errors

VIAF - Italian municipalities

Hi! Before I disappear for next week's exam, I wanted to tell you something I noticed today: looking at Wikidata:Database reports/Constraint violations/P214, I realised that our November import significantly increased (as expected, but perhaps more than expected) the constraint violations. In particular, it raised not only the "single value" violations from 12,000 to 32,000 (but those can mostly be explained by VIAF errors), but also the "unique value" violations from 1,000 to 17,000 (and these should be our errors, which is quite a lot!). Well, I started going through the unique-value list to try to understand what kind of errors there might be and ... I found the source of almost 4,000 of them: the Italian municipalities.

On Wikidata every Italian municipality (except perhaps the most recently created ones) has two items:

  • one for the municipality itself (with all the identifiers and all the Wikipedia pages: e.g. Albenga (Q241298));
  • one for its capoluogo (meaning the part of the municipality that shares the municipality's name, as distinct from the other frazioni it contains; these items exist because the Cebuano Wikipedia created such pages by bot based on GeoNames, so they tend to contain only cebwiki and GeoNames ID (P1566): e.g. Albenga (Q30022077)).

Now, for obscure reasons it appears that in 3,824 cases VIAF put into its clusters (which very clearly refer to the municipalities as a whole) the useless Wikidata items of the capoluoghi, and then our import added VIAF ID (P214) to those items, thus creating a "unique value" violation with respect to the municipality items.

Here is the list of cases:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q747074 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P31 wd:Q15303838 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}


Consequently, it would be good to proceed as follows:

  1. insert the list of the 3,824 cases into Wikidata:WikiProject Authority control/VIAF errors, as a table (like the section "VIAF with sparse or undifferentiated records"), with three columns: link to VIAF, incorrect Wikidata item and correct Wikidata item
  2. remove with QuickStatements the VIAF ID (P214) from all 3,824 municipality capoluoghi

If you want, I can take care of it after the 29th; otherwise you can proceed yourself, it should be a fairly quick job. That way we start decongesting the constraint-violation list. They could be a bit more careful over there at VIAF, though ... Thanks a lot as always, talk to you at the end of the month! --Epìdosis 17:34, 20 January 2020 (UTC)
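The QuickStatements removal step could be driven by a generated batch; if I recall the v1 syntax correctly, a leading "-" removes a statement. The sample row below pairs the capoluogo item mentioned above with an invented VIAF id, purely for illustration:

```python
# Hypothetical sketch: turn (capoluogo QID, VIAF id) pairs from the SPARQL
# results into QuickStatements v1 removal commands.

def qs_remove_p214(rows):
    """rows: iterable of (qid, viaf_id); returns QS v1 removal lines."""
    return ['-%s|P214|"%s"' % (qid, viaf) for qid, viaf in rows]

rows = [("Q30022077", "240296767")]  # illustrative pair, not real data
for line in qs_remove_p214(rows):
    print(line)  # -Q30022077|P214|"240296767"
```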

@Epìdosis: Very interesting... said with a touch of irony, too. Thanks for the report. I'm looking into it. --Bargioni (talk) 16:57, 21 January 2020 (UTC)
  Done - the problem is finally solved: VIAF has replaced in its clusters the links to the capoluoghi with links to the municipalities. --Epìdosis 20:52, 9 May 2020 (UTC)

VIAF - Dutch municipalities

Another 55 cases found in the Netherlands; the principle is exactly the same, I'd say (municipality vs capoluogo):

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q2039348 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P131 ?comune .
  ?capoluogo wdt:P17 wd:Q55 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}


I'm continuing the search. --Epìdosis 20:49, 29 January 2020 (UTC)

VIAF - Belgian municipalities

Another 124 in Belgium:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q493522 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P131 ?comune .
  ?capoluogo wdt:P17 wd:Q31 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],it,en". }
}


--Epìdosis 20:56, 29 January 2020 (UTC)

VIAF - Spanish municipalities

Another 110 in Spain:

SELECT ?comune ?comuneLabel ?capoluogo ?capoluogoLabel ?viaf
WHERE {
  ?comune wdt:P31 wd:Q2074737 . 
  ?comune wdt:P214 ?viaf .
  ?capoluogo wdt:P31 wd:Q15303838 .
  ?capoluogo wdt:P17 wd:Q29 .
  ?capoluogo wdt:P214 ?viaf .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],es,ceb". }
}


--Epìdosis 21:22, 29 January 2020 (UTC)

Removed invalid VIAFs might be re-imported from other wikis, as long as they still exist there

Hello Bargioni,

you recently removed a lot (about 4,000?) of VIAF IDs which had become invalid over time (example). One problem is that many of them had been imported from other wikis, mostly the German one (example). So the invalid IDs might still exist in the wikis they were imported/harvested from (e.g. using HarvestTemplates). Sooner or later they might be re-imported from those wikis into Wikidata, as long as they are not removed there as well.

Do you know a way to mass-remove these invalid IDs in the wikis they were imported from as well? --M2k~dewiki (talk) 00:22, 23 January 2020 (UTC)

@M2k~dewiki: Hi, interesting question... Please confirm that an example of the issue you refer to is this one: the now-invalid VIAF 10512949, deleted by my batch from Wolfgang Leidig (Q1718929), is still present in https://de.wikipedia.org/wiki/Wolfgang_Leidig. And you are asking for a procedure to remove it from the de.wiki page as well. Please also ping @Epìdosis: in your reply. Thx a lot. --Bargioni (talk) 13:55, 23 January 2020 (UTC)
Yes, @M2k~dewiki: refers to cases such as Wolfgang Leidig, and he is right: the obsolete IDs still present in some Wikipedias (mainly de.wiki, since most Wikipedias read identifiers only from Wikidata and don't keep them in their pages) can then be re-imported into Wikidata. The problem has already been noted here and I think there is (or there would be) consensus for the removal of obsolete IDs.
My suggestion is: first of all, @M2k~dewiki: opens a thread in de:Hilfe Diskussion:Normdaten asking for consensus about the correction of obsolete VIAF IDs (which includes deleting obsolete IDs and substituting redirected ones) and asking whether someone could program a bot to do this job periodically; if, once consensus about the correction is reached, a bot programmer has been found, all is well; otherwise, I will ask the Italian bot programmers I know for help. Do you agree with this plan? --Epìdosis 15:41, 23 January 2020 (UTC)
Hello @Epìdosis, Bargioni:, as suggested I started a discussion at de:Hilfe_Diskussion:Normdaten#Ungültige_VIAF-Kennungen_in_Artikeln. Thanks a lot! --M2k~dewiki (talk)

VIAF clusters with two or more Wikidata items

Hi! Today, after noticing (with some disappointment) this cluster, it occurred to me that a certain share, I don't know how large, of our unique-constraint violations is due to the fact that some VIAF clusters lump together two or more Wikidata items. Could you build a table based on the latest dump and paste it into Wikidata:WikiProject Authority control/VIAF errors/Two or more Wikidata items, so that we can then check it gradually? Later we can directly remove the rows where they are right (i.e. when we find that Wikidata really does have duplicates, and merge them) or note that they are wrong and add a comment (so you can leave an empty cell on the right of each row). Thanks a lot as always and good evening, --Epìdosis 19:06, 10 February 2020 (UTC)

@Epìdosis: Not a rosy picture...: there are 4,489 clusters in VIAF that contain different Wikidata items. And they are not just pairs. Here are the top cases:
occ VIAF cluster
6 http://viaf.org/viaf/308723015
6 http://viaf.org/viaf/242677520
6 http://viaf.org/viaf/122624553
select ?viaf ?q ?qLabel ?ins ?insLabel where {
  ?q wdt:P214 ?viaf ;
     wdt:P31 ?ins .
  values ?viaf {"308723015" "242677520" "122624553"}
  service wikibase:label { bd:serviceParam wikibase:language "it,en,de,fr,es,pt". }
}
order by ?viaf


Tomorrow I hope to build the table on the new page. --Bargioni (talk) 22:13, 10 February 2020 (UTC)
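The grouping behind such a table can be sketched as follows, assuming the (VIAF, QID) link pairs have already been parsed out of the dump:

```python
# Hypothetical sketch: find VIAF clusters linked to two or more distinct
# Wikidata items, sorted by how many items each cluster contains.

from collections import defaultdict

def multi_item_clusters(pairs):
    """pairs: iterable of (viaf_cluster, qid); returns a sorted list of
    (viaf_cluster, set_of_qids) with at least two distinct items."""
    by_cluster = defaultdict(set)
    for viaf, qid in pairs:
        by_cluster[viaf].add(qid)
    multi = [(v, qs) for v, qs in by_cluster.items() if len(qs) > 1]
    return sorted(multi, key=lambda kv: -len(kv[1]))

pairs = [("100", "Q1"), ("100", "Q2"), ("200", "Q3"), ("100", "Q2")]
print(multi_item_clusters(pairs))
```

Each resulting row maps directly onto one row of the wiki table, with the right-hand comment cell left empty.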
Goodness! There will be work to do ... let's do it, slowly! See you tomorrow :) --Epìdosis 22:16, 10 February 2020 (UTC)
@Epìdosis: Wikidata:WikiProject Authority control/VIAF errors/Two or more Wikidata items has been created. If you edit it and hit Preview, it may never respond; it happened to me. I fear it has too many links to build. --Bargioni (talk) 15:08, 11 February 2020 (UTC)
I arrived here because I saw this and this wrong edit. There is a problem with VIAF, right? --Vanbasten 23 (talk) 20:52, 31 March 2020 (UTC)
@Vanbasten 23: VIAF has errors. My huge batch import (Nov 2019) (un)fortunately included them. Please, append errors you may find to Wikidata:WikiProject_Authority_control/VIAF_errors. -- Bargioni 🗣 21:11, 31 March 2020 (UTC)
Bargioni, I recently created all the libraries in Spain and I would like to add the VIAF code to them; how can I do it? Thanks. --Vanbasten 23 (talk) 15:15, 1 April 2020 (UTC)

VIAF Alvesta kommun

The VIAF entry was wrong: it does not belong to the town of Alvesta. Yger (talk) 04:58, 11 February 2020 (UTC)

Spanish Libraries

Hi, @Vanbasten 23:, I prefer to open a new thread. Please let me know (even in Spanish, or in Italian, if you like) what you mean by "created all the libraries in Spain". New items? If so, please give me some examples. Thx, sorry. -- Bargioni 🗣 15:28, 1 April 2020 (UTC)

Perfect, and thanks. Yes, new items. This is the query. There you can see only a few VIAF ids... I entered the address, telephone number, email, instance of, descriptions, label, coordinates... and I'm working on the image, Commons... but I don't have the identifiers... Thanks. --Vanbasten 23 (talk) 18:02, 1 April 2020 (UTC)
@Vanbasten 23: Hi! I've seen the message, so I'll try to leave just a little comment, as far as I know the topic. First of all, thank you for the great work you have done!!! Regarding the addition of VIAF ID (P214), I think it is probably not a priority: the great majority of libraries don't have a VIAF code, because no national library (including Biblioteca Nacional de España ID (P950)) usually has an identifier for them, with the exception of the biggest ones, which are covered because they have often published something or have been the subject of some publication. Probably only a few tens of libraries can receive a VIAF. I would look at ISIL (P791) instead, which is the most important identifier for libraries all over the world:
SELECT DISTINCT ?biblio ?biblioLabel ?isil ?viaf
WHERE {
  ?biblio (wdt:P31/(wdt:P279*)) wd:Q7075;
    wdt:P17 wd:Q29.
  OPTIONAL { ?biblio wdt:P791 ?isil. }
  OPTIONAL { ?biblio wdt:P214 ?viaf. }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],es". }
}


ISIL seems nearly always absent, but certainly hundreds (I guess) of libraries have one (I'm sure there are more libraries with ISIL than libraries with VIAF, at least). Unfortunately, not being a librarian, I don't know at the moment where to find some list of ISIL codes. Maybe you can ask a question in the talk page of Wikidata:WikiProject Libraries. @Bargioni:, of course correct me if I've said something wrong :) --Epìdosis 19:39, 1 April 2020 (UTC) P.S. Little advertisement to Italian public :)
Thank you so much. I asked Bargioni about the VIAF id because I saw that he had run several batches about it, but yes, my idea is to add those identifiers that libraries can have, like ISIL. I'll have a look, thank you very much. --Vanbasten 23 (talk) 20:17, 1 April 2020 (UTC)
@Vanbasten 23: I agree with @Epìdosis:: VIAF is for personal, corporate and meeting authors, places, ... but not for libraries. The MARC org code is an ISIL-compliant id that can be assigned to a library upon request. At the moment, 399 Spanish libraries have an ISIL code: https://www.loc.gov/marc/organizations/org-search.php?countryID=185&submit=Search. You could record it in P791. Does a national Spanish code for libraries exist? It could also be added to your items. Sincerely.  – The preceding unsigned comment was added by Bargioni (talk • contribs).
@Vanbasten 23: Very good, I hoped there was a list of ISIL but I didn't know where :) Great! --Epìdosis 08:31, 2 April 2020 (UTC)
Yes, I have an id; it is actually the id assigned by the it:Instituto Nacional de Estadística (Spagna) to identify libraries. I was thinking of asking for a new property... --Vanbasten 23 (talk) 14:28, 2 April 2020 (UTC)
About the ISIL code: perfect, and thanks. I will add it ;) --Vanbasten 23 (talk) 14:32, 2 April 2020 (UTC)

Q145973

Author IDs should be added to author data items, not to work data items. --EncycloPetey (talk) 14:28, 21 April 2020 (UTC)

@EncycloPetey:   Done, just created Pseudo-Anacreon (Q91332057). Thanks, --Epìdosis 14:50, 21 April 2020 (UTC)
@EncycloPetey, Epìdosis: Thx. Incoming Perseus author ID (P7041) values could reflect errors from the Perseus catalog... A total of 813 links to Wikidata are contained in its records. -- Bargioni 🗣 15:32, 21 April 2020 (UTC)

Creation of duplicates

Hi, your QuickStatements seem to create duplicates:

-- Discostu (talk) 12:00, 4 May 2020 (UTC)

@Discostu: Thx a lot. I'll check it. -- Bargioni 🗣 13:41, 4 May 2020 (UTC)
This was reported at Topic:Vkhd578n4cv9ndew. Luckily the duplicates are easy to find and merge.--GZWDer (talk) 13:46, 4 May 2020 (UTC)
@GZWDer: My QS commands do not contain more than one CREATE for items that were duplicated, like Krzysztof Lala (Q93244134)... What's wrong? -- Bargioni 🗣 13:59, 4 May 2020 (UTC)
The backend QuickStatements runs tasks in several threads, but there is no lock mechanism, so multiple threads may perform the same task. You can use the frontend QuickStatements, which has no such problem. (This problem seems to happen on item creation only.)--GZWDer (talk) 14:04, 4 May 2020 (UTC)
Thx for your explanation. Happy to know that the duplicates are independent of my QS commands :-) -- Bargioni 🗣 14:35, 4 May 2020 (UTC)
Also see https://phabricator.wikimedia.org/T234162 --M2k~dewiki (talk) 18:57, 24 May 2020 (UTC)
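The race GZWDer describes can be illustrated with a toy sketch (this is not the actual QuickStatements code): a thread-safe queue hands each CREATE to exactly one worker, which is the coordination the backend batch runner was missing when several threads picked up the same task.

```python
import threading
from queue import Queue, Empty

def run_batch(commands, n_threads=4):
    """Toy batch runner: Queue.get_nowait() atomically claims one job,
    so no command is executed twice even with several workers."""
    q = Queue()
    for cmd in commands:
        q.put(cmd)
    executed, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                cmd = q.get_nowait()   # atomically claims one job
            except Empty:
                return
            with lock:
                executed.append(cmd)   # stand-in for "create the item"

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return executed

created = run_batch([f"CREATE item {i}" for i in range(200)])
```

With this pattern, `created` always contains each command exactly once; without the shared queue, duplicates like the ones reported above become possible.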

Check

I'm not entirely sure about the merge of Ángel González Muñiz (Q715118) and Ángel González Muñiz (Q80325986). Can you investigate? Thanks a lot, --Epìdosis 15:27, 6 May 2020 (UTC)

@Epìdosis, Gentile64: Forwarding the request. -- Bargioni 🗣 07:06, 7 May 2020 (UTC).
It's the same person. If you click on the ISNI you will see that two VIAF clusters are present. Comparing them, it is clear that the author is the same. Unfortunately, there are very often two or more VIAF clusters for the same author. Regards  – The preceding unsigned comment was added by Gentile64 (talk • contribs).
@Gentile64:   Done, merged. --Epìdosis 08:15, 7 May 2020 (UTC) P.S. To sign your messages, use the dedicated button, as explained here :)

Quickstatements: 1589290271422

Questions from your QS load batch: https://tools.wmflabs.org/editgroups/b/QSv2T/1589290271422/

I was wondering whether the batch above included all the properties shown on screen in one single action to create 4946 new entities, and whether you could share a sample of a few entities from the set, to see how the input file can be prepared. Thank you for your help.

jshieh (talk) 19:30, 14 May 2020 (UTC)

@ShiehJ, Epìdosis: With the strong collaboration of Epìdosis, we planned to create items from unmatched entries of the MnM FAST catalog https://tools.wmflabs.org/mix-n-match/#/catalog/150. We then filtered entries with a VIAF ID and well-formed dates, and enriched them by accessing both the FAST and VIAF records via http://fast.oclc.org/fast/$fast_id/marc21.xml and http://viaf.org/viaf/$viafid/viaf.json. We used a Perl script whose output is the CREATE/LAST commands for QuickStatements (including references). We plan to add about 59,000 items. Here is an example of QS commands. Please let me know if you are interested in more info. -- Bargioni 🗣 07:31, 15 May 2020 (UTC)
CREATE
LAST    Len     "Alexander Murray"
LAST    P31     Q5
LAST    P569    +1727-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P570    +1793-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P214    "51558010"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P244    "nr92041797"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508922"
LAST    P7859   "lccn-nr92041797"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "51558010"
LAST    P2163   "1508922"
CREATE
LAST    Len     "Joseph Dawson Murray"
LAST    P31     Q5
LAST    P569    +1785-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P570    +1852-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P214    "29401466"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P244    "nr92041800"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508924"
LAST    P7859   "lccn-nr92041800"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "29401466"
LAST    P2163   "1508924"
CREATE
LAST    Len     "John Barton Derby"
LAST    P31     Q5
LAST    P569    +1792-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P570    +1867-00-00T00:00:00Z/9 S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P214    "12174577"      S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P244    "nr92040978"    S248    Q3294867        S813    +2020-05-07T00:00:00Z/11        S2163   "1508928"
LAST    P213    "0000 0000 4805 0059"   S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "12174577"
LAST    P7859   "lccn-nr92040978"       S248    Q54919  S813    +2020-05-07T00:00:00Z/11        S214    "12174577"
LAST    P2163   "1508928"
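For readers who want to reproduce the command-generation step, here is a minimal sketch (in Python rather than the Perl actually used; the dict keys `name`, `birth`, `death`, `viaf`, `lccn`, `fast` are this sketch's own naming, holding values already extracted from the FAST/VIAF records):

```python
def qs_create_commands(rec):
    """Build tab-separated QuickStatements commands for one FAST entry,
    in the shape of the sample above (references included)."""
    retrieved = "+2020-05-07T00:00:00Z/11"
    fast_ref = f'S248\tQ3294867\tS813\t{retrieved}\tS2163\t"{rec["fast"]}"'
    viaf_ref = f'S248\tQ54919\tS813\t{retrieved}\tS214\t"{rec["viaf"]}"'
    return "\n".join([
        "CREATE",
        f'LAST\tLen\t"{rec["name"]}"',
        "LAST\tP31\tQ5",
        f'LAST\tP569\t+{rec["birth"]}-00-00T00:00:00Z/9\t{fast_ref}',
        f'LAST\tP570\t+{rec["death"]}-00-00T00:00:00Z/9\t{fast_ref}',
        f'LAST\tP214\t"{rec["viaf"]}"\t{fast_ref}',
        f'LAST\tP244\t"{rec["lccn"]}"\t{fast_ref}',
        f'LAST\tP7859\t"lccn-{rec["lccn"]}"\t{viaf_ref}',
        f'LAST\tP2163\t"{rec["fast"]}"',
    ])

print(qs_create_commands({
    "name": "Alexander Murray", "birth": "1727", "death": "1793",
    "viaf": "51558010", "lccn": "nr92041797", "fast": "1508922",
}))
```

For the first sample entry above, this reproduces the same nine command lines, starting with CREATE and ending with the P2163 statement.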
@Bargioni, Epìdosis: This is a much clearer example than what the Help:QuickStatements doc provides. It would be ideal if your examples were added to the Help doc for novices like me! THANK YOU!

What you described is precisely what we will try to accomplish: uploading ca. 43K artist names that are not found in Wikidata, grabbing content from the VIAF JSON file to upload. I am experimenting with extracting the necessary data from the VIAF JSON URL and parsing it into columns via OpenRefine, but the result is rather disappointing at the moment, likely due to my lack of proficiency in OpenRefine and JSON. The VIAF MARCXML format was not considered, since the workflow is to be conducted by colleagues using a browser; otherwise, combining the MARC::Perl module with MARCXML to extract data to feed QuickStatements should be simpler.

This artists set will be over 60K names. Afterwards, the following sets will be over 600K names (person, corporate and family). The method will first match against Wikidata to ensure that the entry to be added is indeed a new item. The workflow we have devised at this time is browser-based, hoping this will have broader appeal to colleagues who are catalogers and reference librarians, and engage them in Wikidata activities. — jshieh (talk)
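For the JSON-extraction step jshieh mentions, a small helper can take some of the pain out of walking nested VIAF records by hand (a sketch; the key names in the sample below follow the usual viaf.org/viaf/&lt;id&gt;/viaf.json layout, but verify them against a live record before relying on them):

```python
import json

def pick(record, *path, default=None):
    """Walk a nested JSON structure by keys/indexes, returning `default`
    when any step along the path is missing, instead of raising."""
    cur = record
    for step in path:
        try:
            cur = cur[step]
        except (KeyError, IndexError, TypeError):
            return default
    return cur

# Hypothetical minimal record; real viaf.json responses are much larger.
sample = json.loads('{"viafID": "51558010", "birthDate": "1727",'
                    ' "deathDate": "1793", "dateType": "lived"}')
row = {
    "viaf":  pick(sample, "viafID"),
    "birth": pick(sample, "birthDate"),
    "death": pick(sample, "deathDate"),
    "isni":  pick(sample, "ISNI", 0, default=""),  # absent -> ""
}
```

The same tolerant-lookup idea is what OpenRefine's GREL `record["a"]["b"]` expressions do, but made explicit so missing fields produce empty columns rather than errors.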

Query for duplicates with ru.wiki

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    VALUES ?item1 { wd:??? } .
    #?item1 wdt:P31 wd:Q5.
    #?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en.
    FILTER (LANG(?label_en) = "en")
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  ?item1 wdt:P570 ?death.
  ?item2 wdt:P570 ?death.
  BIND(str(YEAR(?death)) AS ?deathyear)
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_ru

Try it!

Leaving this here as a reminder :) --Epìdosis 10:33, 15 May 2020 (UTC)

This one finds the items with "vich" in their label:

SELECT ?item ?en_label
WHERE {
  ?item p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] ;
        rdfs:label ?en_label .
  FILTER(LANG(?en_label) = "en") .
  FILTER(CONTAINS(?en_label,"vich"))
}
ORDER BY ?en_label

Try it! --Epìdosis 10:44, 15 May 2020 (UTC)

Final queries

It follows that

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT DISTINCT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    #VALUES ?item1 { wd:??? } .
    ?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en .
    FILTER(LANG(?label_en) = "en")
    FILTER(CONTAINS(?label_en,"vich"))
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  ?item1 wdt:P570 ?death.
  ?item2 wdt:P570 ?death.
  BIND(str(YEAR(?death)) AS ?deathyear)
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_en ?label_ru

Try it! This is the perfect solution: it works very well and catches all the items that have both a birth date and a death date; for those with only a birth date, the query below is used

# ?item1 is the imported one
# ?item2 is human and sitelink to ruwiki
# ?item1 and ?item2 are born the same date
SELECT DISTINCT ?item1 ?label_en ?birthyear ?deathyear ?item2 ?label_ru
WITH
{
  SELECT ?item1 ?FAST_ID ?label_en
  WHERE
  {
    #VALUES ?item1 { wd:??? } .
    ?item1 p:P214 [ps:P214 ?viaf ; prov:wasDerivedFrom [pr:P248 wd:Q3294867] ] .
    ?item1 rdfs:label ?label_en .
    FILTER(LANG(?label_en) = "en")
    FILTER(CONTAINS(?label_en,"vich"))
  }
  #LIMIT 20
} AS %get_humans_with_FAST_ID
WHERE
{
  INCLUDE %get_humans_with_FAST_ID
  ?item1 wdt:P569 ?birth.
  ?item2 wdt:P569 ?birth.
  BIND(str(YEAR(?birth)) AS ?birthyear)
  # ?item1 has no death date (birth-only matches)
  MINUS { ?item1 wdt:P570 ?death. }
  FILTER (?item1 != ?item2)
  ?item2 wdt:P31 wd:Q5.
  ?ruwiki_sitelink schema:about ?item2 .
  ?ruwiki_sitelink schema:isPartOf <https://ru.wikipedia.org/>.
  { ?item2 wdt:P27 wd:Q159. } UNION { ?item2 wdt:P27 wd:Q15180 . } UNION { ?item2 wdt:P27 wd:Q34266 . }
  ?item2 rdfs:label ?label_ru.
  FILTER (LANG(?label_ru) = "ru")
}
ORDER BY ?label_en ?label_ru

Try it!, which obviously has many more results and which I don't think can be filtered any further. Excellent! --Epìdosis 10:52, 15 May 2020 (UTC)

Same dates

came up on Wikidata:Database_reports/identical_birth_and_death_dates/1. You created the latter recently. There is also Quentin Debray (Q3414170). There seems to be some mix-up. --- Jura 09:23, 20 May 2020 (UTC)

GND saturation cleanup

Hi Bargioni,

Would you comment at Property_talk:P227#GND_saturation_of_Wikidata? While I trust Epìdosis, it would obviously be better if that came from you. --- Jura 10:27, 25 May 2020 (UTC)

Map of Rome

#Libraries, museums and churches in Rome
#defaultView:Map
SELECT ?luogo ?luogoLabel ?coordinate
WHERE {
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  { ?luogo wdt:P31/wdt:P279* wd:Q7075 . }
  UNION
  { ?luogo wdt:P31/wdt:P279* wd:Q33506 . }
  UNION
  { ?luogo wdt:P31 wd:Q16970 . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}

Try it!

--Epìdosis 10:18, 3 June 2020 (UTC)

With different colors, thanks to @Dipsacus fullonum:
#Libraries, museums and churches in Rome
#defaultView:Map
SELECT DISTINCT ?luogo ?luogoLabel ?coordinate ?layer
WHERE {
   BIND(wd:Q7075 AS ?biblioteca).
   BIND(wd:Q33506 AS ?museo).
   BIND(wd:Q16970 AS ?chiesa).
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  { ?luogo wdt:P31/wdt:P279* ?biblioteca . BIND(1 AS ?layer) }
  UNION
  { ?luogo wdt:P31/wdt:P279* ?museo . BIND(2 AS ?layer) }
  UNION
  { ?luogo wdt:P31/wdt:P279* ?chiesa . BIND(3 AS ?layer) }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}

Try it!

--Epìdosis 10:13, 4 June 2020 (UTC)
A simplified version, also by @Dipsacus fullonum:
#Libraries, museums and churches in Rome
#defaultView:Map
SELECT DISTINCT ?luogo ?luogoLabel ?coordinate ?layer
WHERE {
  VALUES ?layer { wd:Q7075 wd:Q33506 wd:Q16970 } # library, museum and church
  ?luogo wdt:P131 wd:Q220 .
  OPTIONAL { ?luogo wdt:P625 ?coordinate . }
  ?luogo wdt:P31/wdt:P279* ?layer.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "it,en". }
}

Try it!

--Epìdosis 10:25, 4 June 2020 (UTC)

Fixing dates theoretically imported from VIAF

In the following query, and in other similar cases,

SELECT ?p ?db
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "exturlusage" .
    bd:serviceParam mwapi:geuprop "title" .
    bd:serviceParam mwapi:geunamespace "0" .
    bd:serviceParam mwapi:geuprotocol "https" .
    bd:serviceParam mwapi:geuquery "viaf.org/viaf/" .
    bd:serviceParam mwapi:geulimit "max" .
    ?p wikibase:apiOutputItem mwapi:title .
  }
  hint:Prior hint:runFirst "true".
  
  ?p p:P569 [ps:P569 ?db ; prov:wasDerivedFrom [pr:P854 ?site] ].
  FILTER("1950-00-00"^^xsd:dateTime = ?db)
  FILTER(CONTAINS(STR(?site),"viaf.org/viaf/"))
}

Try it!

there are things to fix. I'll explain better tomorrow. Good night, --Epìdosis 21:25, 5 June 2020 (UTC)

That would be great. It's the same as Wikidata:Bot_requests#Cleanup_VIAF_dates. --- Jura 21:35, 5 June 2020 (UTC)
@Jura1: Wow, so it's an ancient problem; I've noticed it only today, I think. How about removing them all, in your opinion? --Epìdosis 21:46, 5 June 2020 (UTC)
Given how long it's been there, I'm glad about any approach that solves it.
I suppose re-reading the source to check "ns1:dateType" (see explanation here) for all dates would be too much work.
An intermediate solution could be to change the property of these statements to floruit (P1317) (if no other reference is given). --- Jura 08:47, 6 June 2020 (UTC)
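That intermediate solution could be scripted roughly as follows (a sketch only; the dateType spellings "lived", "circa" and "flourished" are assumptions about the viaf.json field to verify against real records before running anything):

```python
def triage_viaf_date(date_type, has_other_reference=False):
    """Decide what to do with a VIAF-sourced birth/death date: keep
    dates VIAF marks as actually lived, move 'flourished' dates to
    floruit (P1317) when VIAF is the only reference, and flag the
    rest for manual review."""
    if date_type == "lived":
        return "keep P569/P570"
    if date_type == "flourished" and not has_other_reference:
        return "move to P1317"
    return "review manually"
```

This only covers the per-record decision; re-reading the source for all dates, as noted above, is the expensive part.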

A more complete list of cases (note: it includes not only references to VIAF URLs, so it certainly has some false positives):

SELECT ?p ?db
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "exturlusage" .
    bd:serviceParam mwapi:geuprop "title" .
    bd:serviceParam mwapi:geunamespace "0" .
    bd:serviceParam mwapi:geuprotocol "https" .
    bd:serviceParam mwapi:geuquery "viaf.org/viaf/" .
    bd:serviceParam mwapi:geulimit "max" .
    ?p wikibase:apiOutputItem mwapi:title .
  }
  hint:Prior hint:runFirst "true".
  
  ?p p:P569 [psv:P569 ?dbv ; prov:wasDerivedFrom [pr:P854 ?site] ].
  FILTER CONTAINS(STR(?site),"viaf.org/viaf/")
  ?dbv wikibase:timeValue ?db; wikibase:timePrecision ?precision.
  BIND (YEAR(?db) AS ?year)
  FILTER(?precision = 9)
  FILTER IF(?year > 0,
            ?year - FLOOR(?year / 100) * 100 = 50, # year is AD
            ?year - FLOOR(?year / 100) * 100 = 51) # year is BC, 1 BC is encoded as "0", 2 BC as "-1" etc.
}

Try it! --Epìdosis 16:17, 8 June 2020 (UTC)
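The remainder arithmetic in those two FILTERs can be replicated outside SPARQL for a quick sanity check (same convention as the query: 1 BC encoded as year 0, 2 BC as -1, and so on):

```python
def is_suspect_year(year):
    """Mirror the SPARQL filters above for year-precision dates: AD
    years ending in 50 are suspect placeholders; because 1 BC is
    encoded as 0, 2 BC as -1, etc., the suspect remainder for BC
    years is 51."""
    # Python's % already returns a non-negative remainder for negative
    # numbers, matching year - FLOOR(year / 100) * 100 in the query.
    return year % 100 == (50 if year > 0 else 51)

# 1950 AD is suspect, 1949 AD is not; 50 BC (encoded -49) is suspect.
```

This makes it easy to test the filter logic against a handful of known dates before touching the statements themselves.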

NSZL VIAF without actual NSZL

SELECT ?item
WHERE {
  ?item wdt:P951 ?nszlviaf
  MINUS { ?item wdt:P3133 ?nszl . }
}

Try it! --Epìdosis 17:39, 8 June 2020 (UTC)

Wrong VIAF

The VIAF-ID added here described a completely different person. I added the correct one, but it is often hard to find mistakes like this. --Christian140 (talk) 18:51, 8 June 2020 (UTC)

@Christian140: Thx. This item was updated by a huge import I made 8 months ago. Some VIAF errors were imported too, of course, and this will allow the WD community to help VIAF correct them. So please add this error to Wikidata:VIAF/cluster/conflating_entities. --  Bargioni 🗣 11:37, 9 June 2020 (UTC)

Cleanup

Hi Bargioni,

Epìdosis gave an estimate of two weeks for the cleanup. Can you give us an update at W:AN? What's your view on the create-if-needed vs. check/complete approach? --- Jura 08:46, 29 June 2020 (UTC)

Wrong VIAF

Hi,

the VIAF added here is the wrong instance: it describes an administrative district, not the river Toss. --Hannes Röst (talk) 14:15, 29 June 2020 (UTC)

@Hannes Röst: Hi, this item was updated by a huge import I made in Nov <s>2018</s> 2019. Some VIAF errors were imported too, of course. And this will allow the WD community to help VIAF to correct them. So please add this error (and more, if any) to Wikidata:VIAF/cluster/conflating_entities. -- Bargioni 🗣 16:13, 29 June 2020 (UTC)
I think November 2019, time passes but not so quickly ;-) --Epìdosis 19:10, 29 June 2020 (UTC)
Thx, @Epìdosis:, a typo! I corrected it. -- Bargioni 🗣 20:38, 29 June 2020 (UTC)