https://www.wikidata.org/w/index.php?title=Q9359273&diff=prev&oldid=2193016066
DeltaBot
Joined 31 July 2016
Another wrong merge: Urmensch (Q15852236), description: "Der frühere Mensch" ("the early human") = Wikimedia disambiguation page.
The bot reacts to sitelink modifications by other users. If those are wrong, the bot makes incorrect edits. Both situations are resolved now.
Adding "topic's main template = Template:School Districts in Maryland" to school district in the United States (Q15726209)
See https://www.wikidata.org/w/index.php?title=Q15726209&action=history
Q20325506#P1423 is the reason.
Hi.
First of all, many thanks for the patient and useful work.
Could you please avoid creating new items for en:Wikipedia pages whose title ends with "(disambiguation)" or fr:Wikipedia pages ending with "homonymie"? Just one example: Q126023680 (Cayton) was created although Q29597956 already existed. Adding the Wikipedia link to the proper Wikidata item would avoid manual merges.
I hope my request is clear and this is not too much work to update.
One more request: could you please add an accent to the word "Wikimedia" in new_disambiguation_pages.py? "Wikimédia" is the French spelling.
Line 65:
{
'language': 'fr',
'site': 'frwiki',
'project': 'wikipedia',
'category': 'Homonymie',
'description': 'page d\'homonymie de Wikimedia',
}
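For reference, the corrected entry with the accent added would presumably look like this (a sketch; `fr_entry` is a hypothetical name, since the surrounding code of new_disambiguation_pages.py is not shown here):

```python
# Hypothetical variable name; only the 'description' value changes.
fr_entry = {
    'language': 'fr',
    'site': 'frwiki',
    'project': 'wikipedia',
    'category': 'Homonymie',
    'description': "page d'homonymie de Wikimédia",  # accent added to "Wikimédia"
}
```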
Hi.
Is it too complex or are my requests unclear?
It was simply forgotten, I am sorry.
The French translation is fixed, thank you for the input.
As for the other request… From experience, it is often better to import the sitelink/item anyway, even if there is a possibility that it results in a duplicate item that needs to be merged, rather than to leave it alone and hope that someone or some other bot picks it up. I would prefer to keep it as is.
I noticed that a lot of the same people end up nominating duplicate items for deletion. I know this bot leaves a comment on the deletion request closing it and directing the nominator to the merge help page, but I think very few people actually read them, because the deletion requests are archived very soon after. Would it be possible to make the task also leave a message on the nominator's talk page? It could be limited to leaving a message only once per user, so as not to inundate anyone with multiple messages.
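The once-per-user limit proposed above could be sketched roughly as follows (a minimal sketch, not the bot's actual implementation; the posting step and the persistence of the notified-user set are assumptions):

```python
# Sketch: notify each nominator at most once about the merge help page.
# In a real bot, this set would need to be persisted between runs.
notified_users = set()

def notify_once(user: str, notified=notified_users) -> bool:
    """Return True if a message was (newly) left for this user."""
    if user in notified:
        return False  # already notified; do not inundate them with messages
    notified.add(user)
    # Here the real bot would post to the user's talk page (e.g. via Pywikibot).
    return True
```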
Thank you for the input.
I think this should be done by involved users rather than by a bot. Usually such hints trigger follow-up questions that a bot cannot answer, and I do not have the capacity to do this myself.
Thus, I would recommend approaching users yourself if you deem this necessary. It would probably be much more helpful.
Thanks for taking the time to reply. I'll definitely think about the personal approach.
I notice that DeltaBot has recently made changes to DrugBank ID, such as [https://www.wikidata.org/w/index.php?title=Q126500299&curid=120522144&diff=2176920362&oldid=2176535167 here], that break the external link. In this example, the working link to https://go.drugbank.com/drugs/DB06592 has become a broken link to https://go.drugbank.com/drugs/06592. Can someone please have a look? Thank you.
It seems the format of the identifier has recently been changed. There is a fixClaims job defined on User:DeltaBot/fixClaims/jobs which needs to be adapted as well if this is a persistent change. Identifier format changes are usually bad practice, though, and I am not sure what the background is in this situation.
I recently updated all the DrugBank IDs to match the correct format using QuickStatements, but today I noticed that your bot reverted all my edits. The correct format should include the "DB" prefix, and User:DeltaBot/fixClaims/jobs has to be updated to reflect this.
I have removed the job completely.
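For illustration, the correct-format normalization described above (bare five-digit identifiers gaining the mandatory "DB" prefix) could be sketched like this (a sketch only; the actual job syntax on User:DeltaBot/fixClaims/jobs is not reproduced here):

```python
import re

def normalize_drugbank_id(value: str) -> str:
    """Add the 'DB' prefix to bare numeric DrugBank IDs; leave valid IDs alone."""
    if re.fullmatch(r'DB\d{5}', value):
        return value              # already correct, e.g. 'DB06592'
    if re.fullmatch(r'\d{5}', value):
        return 'DB' + value       # bare number, e.g. '06592' -> 'DB06592'
    raise ValueError(f'unexpected DrugBank ID format: {value!r}')
```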
I had carefully corrected Xavier Stouff (Q3570797), as Jules Tannery and Paul Appell are NOT his PhD advisors, and I explained the situation in detail on the discussion page (I also gave serious references, which the Mathematics Genealogy Project is not). DeltaBot has put their names back as PhD advisors. I have now corrected the pages of Tannery and Appell (as far as Stouff is concerned; in fact, the other names are probably also false), and thus I will again correct Stouff's page. Best
Hi, DeltaBot starts updating the "Humans with missing claims" pages and then hangs on one page. This time P1340 was the last; last week it was P2190.
While I'm here, I'll ask: could the update be scheduled for Friday? At the end of the week, there is more time to repair the items.
Thanks if you do
Will look into this later this week.
Seems there is some read timeout when the bot interacts with the MediaWiki API (via Pywikibot), which apparently ultimately results in an edit conflict and crashes the bot … (?!)
No idea what is going on here to be honest, and I think I might have seen these read timeouts in some of my other bots as well. Needs further investigation for sure.
Anyways, I have rescheduled it to run on Fridays at 15:50 UTC instead of Sundays at the same time.
Thanks for rescheduling, we'll see if it runs on Friday.
Hello MisterSynergy! It has now stopped at P1006.
Can it do this again? I mean for the subsequent categories.
The corresponding job is still running. In order to add these claims, the inverse claim already needs to be there.
And no bot can add the inverse claim based on the sibling items?
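For a symmetric property like sibling (P3373), the check suggested above could be sketched as a pure computation (a sketch under assumptions: the dict representation and item IDs are illustrative, not the bot's actual data model):

```python
def missing_inverse_siblings(siblings: dict) -> set:
    """Given a mapping item -> set of stated siblings, return the
    (item, sibling) pairs where the inverse statement is missing."""
    missing = set()
    for item, sibs in siblings.items():
        for sib in sibs:
            if item not in siblings.get(sib, set()):
                missing.add((sib, item))  # sib should also state item as sibling
    return missing
```

A real bot would then add each missing statement, ideally after a human or a constraint report has confirmed the existing direction is correct.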
The update process of Wikidata:Property proposal/count seems wrong; it keeps removing the Transportation count.
The Transportation section seems to be empty, thus it is omitted. Is this really a problem?
"Transportation (invalid parameter Transportation)" appears at Wikidata:Property proposal as a result of the Transportation count being removed. Same error also appears in the Current property proposals message at the Watchlist.
Okay, I see. It has now been changed to include empty sections.
Thanks.
The job got stuck for some reason. I killed it, so it can restart regularly this night.
The bot is dead again, no update for Wikidata:Database reports/Complex constraint violations/P8616 since 2024-03-06.
Thanks for the report.
The job is being properly started daily, but it gets killed when working on P31 because it runs out of memory. Looking at Wikidata:Database reports/Complex constraint violations/P31, there are probably some complex constraint definitions to tidy on Property talk:P31, and I think the bot should also be more resilient against failures due to large reports.
Not an easy fix, but I hope I can address this soon.
I added a limit of results to all constraints in P31: https://www.wikidata.org/w/index.php?title=Property_talk:P31&diff=prev&oldid=2120651490 I hope it will help.
Yes, this helps, thank you. I also reduced the limit of results per section on these report pages from 5000 to 2000, which is still a lot. Several other reports got shorter as well, which hopefully makes this more robust in general.
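The per-section cap mentioned above can be sketched as follows (a sketch; the report structure and query interface are assumptions, not DeltaBot's actual code):

```python
from itertools import islice

MAX_RESULTS_PER_SECTION = 2000  # reduced from 5000 to keep reports manageable

def cap_section_results(results, limit=MAX_RESULTS_PER_SECTION):
    """Consume at most `limit` results from a (possibly huge) result iterator,
    so that a single oversized section cannot exhaust the bot's memory."""
    return list(islice(results, limit))
```

Using `islice` over a streaming iterator (rather than materializing the full result set and then slicing) is what actually bounds memory use.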
There is probably some cleanup to do in instance of (P31); a lot of the complex constraints don't seem to work at all.
Yes, although this does not cause the bot to fail. If you are interested in working on this, the most likely reasons for failures are listed for all queries on the report page.