Wikidata:Contact the development team/Archive/2019/07

int: broken?

At the time of this writing, something like {{int:wikibase-snakview-snaktypeselector-novalue}} no value gives <wikibase-snakview-snaktypeselector-novalue>. It used to give the MediaWiki translation in the user's language, and I also noticed similar problems with the translation of datatypes in the Property Proposal template. author  TomT0m / talk page 19:09, 11 July 2019 (UTC)

I have noticed the same problem also in page history. --Epìdosis 19:27, 11 July 2019 (UTC)
I think that this discussion is resolved and can be archived. If you disagree, don't hesitate to replace this template with your comment. Matěj Suchánek (talk) 07:10, 12 July 2019 (UTC)

No display of German Title

I just viewed some items, and the title "Wikipedia" above the sitelinks to Wikipedia articles is displayed as ⧼wikibase-sitelinks-wikipedia⧽, and official names or names in the native language of a person are not displayed correctly: for German words, ⧼wikibase-monolingualtext⧽ is displayed instead. Can someone please look into what the problem is and solve it? I can see the description if I change the language of the user interface to English; when I change it to languages other than English, the same error occurs. -- Hogü-456 (talk) 19:27, 11 July 2019 (UTC)

I think that this discussion is resolved and can be archived. If you disagree, don't hesitate to replace this template with your comment. Matěj Suchánek (talk) 07:10, 12 July 2019 (UTC)

Error 403 when requesting API

Hello,

I am currently working on my Bachelor project and I need to make requests to the Wikidata API. The purpose of this project is to automatically generate questions from triples. Since yesterday I've been having trouble making requests (error 429, even though I was leaving 2 seconds between requests), and today I cannot make any requests at all (I receive error 403 - forbidden), both from home and from my workplace. Is my MAC address banned from the API? Is there anything that you (or I) could do that would make me able to send requests again?

Thank you

 – The preceding unsigned comment was added by Sheltine (talk • contribs).

  • I have been getting the exact same error message for almost a week now. Is there anything we should be aware of? --- AddNPBot
Please read https://meta.wikimedia.org/wiki/User-Agent_policy Smalyshev (WMF) (talk) 17:44, 2 July 2019 (UTC)
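
For reference, a minimal sketch of a request that follows that policy, assuming the Python requests library (the tool name, project URL and contact address are placeholders to replace with your own):

import requests

# Hypothetical example only: identify your tool/project and give a way to contact you,
# as asked for by https://meta.wikimedia.org/wiki/User-Agent_policy
HEADERS = {
    "User-Agent": "QuestionGenerationBot/0.1 (https://example.org/bachelor-project; mailto:you@example.org)"
}

params = {"action": "wbgetentities", "ids": "Q42", "format": "json"}
response = requests.get("https://www.wikidata.org/w/api.php", params=params, headers=HEADERS)
response.raise_for_status()
print(response.json()["entities"]["Q42"]["labels"]["en"]["value"])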

Highest Number that can be saved

Hello,

In probably one month, Wikidata will have one billion edits. What is the biggest number that can be saved as a revision ID in the database? Wikidata could reach two billion edits in two or three years, and then it would be near the 32-bit boundary, which is 2,147,483,647, the highest number within the signed 32-bit range. Do you need to change something before this number of edits is reached, or is there a higher number of edits that can be saved? -- Hogü-456 (talk) 21:26, 29 June 2019 (UTC)

It's actually 4,294,967,295 because it's unsigned. --Matěj Suchánek (talk) 09:07, 30 June 2019 (UTC)
Thanks for sharing your concern. We have this on our radar but we don't need to change anything for now. Lea Lacroix (WMDE) (talk) 08:09, 2 July 2019 (UTC)
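
For anyone double-checking the arithmetic, the two limits mentioned above (a quick Python sketch):

# Highest values representable in 32 bits.
print(2**31 - 1)  # 2147483647, the signed limit mentioned in the question
print(2**32 - 1)  # 4294967295, the unsigned limit Matěj refers to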

Query Service Rate Limits

I'm getting "Rate limit exceeded"-Errors on my https://query.wikidata.org queries. I just tried to read the "Retry-After"-header which I read about on this page:

https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#Query_limits


All the headers I get are the following:

  • date
  • content-type
  • content-length
  • connection
  • server
  • cache-control
  • x-varnish
  • via
  • age
  • x-cache
  • x-cache-status
  • server-timing
  • strict-transport-security
  • set-cookie
  • x-analytics
  • x-client-ip

Did the name of the header change or something?

Thank you for any help in advance.  – The preceding unsigned comment was added by 2A02:8071:91AE:5E00:15F4:B8CB:F41:98A (talk • contribs).

@Smalyshev (WMF): Can you have a look at this? Lea Lacroix (WMDE) (talk) 08:08, 2 July 2019 (UTC)
Are you using descriptive user agent? Please read https://meta.wikimedia.org/wiki/User-Agent_policy Smalyshev (WMF) (talk) 17:45, 2 July 2019 (UTC)
Maybe related: tools should use user-agent to access WDQS. --Succu (talk) 19:15, 5 July 2019 (UTC)
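
For reference, a sketch of one way a client might handle both points (descriptive User-Agent and the rate limit), assuming the Python requests library; the User-Agent value is a placeholder. Note that Retry-After would normally only be present on a throttled (HTTP 429) response, which may be why it is missing from the header list above:

import time
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"
HEADERS = {
    # Placeholder: use a descriptive User-Agent with contact info, per the policy linked above.
    "User-Agent": "MyWdqsClient/0.1 (https://example.org; mailto:you@example.org)"
}

def run_query(sparql, max_retries=3):
    for _ in range(max_retries):
        r = requests.get(WDQS_ENDPOINT, params={"query": sparql, "format": "json"}, headers=HEADERS)
        if r.status_code == 429:
            # Retry-After is normally a number of seconds on throttled responses.
            retry_after = r.headers.get("Retry-After", "60")
            time.sleep(int(retry_after) if retry_after.isdigit() else 60)
            continue
        r.raise_for_status()
        return r.json()
    raise RuntimeError("Still rate-limited after retries")

print(run_query("SELECT ?x WHERE { BIND(42 AS ?x) }"))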

Unexpected blank space

Hello. How did this blank space appear above the section I was editing? David (talk) 08:14, 9 July 2019 (UTC)

Hey, I had a look but I have no idea what's happening. No data was lost and everything that you wanted to write appeared, right? Lea Lacroix (WMDE) (talk) 10:07, 9 July 2019 (UTC)
Yes, but there was an extra blank space. Is it normal to show things that we did not write? David (talk) 15:26, 9 July 2019 (UTC)
It's definitely some kind of bug. We'll have a look at it. Lea Lacroix (WMDE) (talk) 17:03, 9 July 2019 (UTC)

Remove numerical ids, but not remove it?

@Addshore: about your closing of phab:T114902: I'm not sure if that is a good idea. I think the change would have simplified SQL queries and possible future features. --- Jura 10:40, 1 July 2019 (UTC)

The original reasoning of the ticket is "we want to be able to use non-numeric entity IDs". We have basically decided that the tables don't block us from doing that at all, thus the ticket can be closed / is invalid.
I guess this is talking specifically about wb_items_per_site?
Right now, switching from an int to some text there would possibly simplify things for humans, but would make things more complicated / increase overhead for the machines involved.
I'm not sure if some SQL view might simplify queries? It might be worth opening a different ticket for that though.
·addshore· talk to me! 00:34, 6 July 2019 (UTC)
  • @Matěj Suchánek: what do you think? In the end, your tool somehow worked around it, but personally I gave up trying to join Wikipedia page properties with the Wikidata sitelink table. --- Jura 09:59, 14 July 2019 (UTC)
    I don't mind this schema. The purpose of this table is primarily sitelink uniqueness checking, which needs to be as efficient as possible. And if integers (32 bits) are more efficient than strings (72+ bits), I am not going to push this. Note that this table is totally incompatible with Wikipedia tables, not only with respect to numerical ids but also underscores, namespaces, etc. --Matěj Suchánek (talk) 10:21, 14 July 2019 (UTC)
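
Not an official recommendation, just a sketch of the kind of query involved, using Python and pymysql from Toolforge; the replica host name is a placeholder and the column names are taken from the wb_items_per_site schema. It illustrates the two mismatches mentioned above: the numeric item id needs a 'Q' prefix, and ips_site_page uses spaces where Wikipedia's page table uses underscores.

import pymysql

# Placeholder connection details; on Toolforge the credentials live in ~/replica.my.cnf.
conn = pymysql.connect(
    host="wikidatawiki.analytics.db.svc.eqiad.wmflabs",  # placeholder replica host
    read_default_file="~/replica.my.cnf",
    db="wikidatawiki_p",
    charset="utf8mb4",
)

sql = """
SELECT CONCAT('Q', ips_item_id) AS item_id, ips_site_page
FROM wb_items_per_site
WHERE ips_site_id = 'enwiki'
LIMIT 10
"""

def as_text(value):
    # Replica columns are binary, so values may come back as bytes.
    return value.decode("utf-8") if isinstance(value, bytes) else value

with conn.cursor() as cur:
    cur.execute(sql)
    for item_id, site_page in cur.fetchall():
        # To join against enwiki's page table, the title would also need
        # REPLACE(page_title, '_', ' ') (or the reverse) to line up with ips_site_page.
        print(as_text(item_id), as_text(site_page))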

WQS error

This query worked fine in the past (see Wikidata:WikiProject every politician/European Union/data/Parliament/Eighth), but now I am getting an "Unexpected end of JSON input" error. Any thoughts? --Jklamo (talk) 08:43, 4 July 2019 (UTC)

@Smalyshev (WMF): Can you have a look? Thanks :) Lea Lacroix (WMDE) (talk) 13:46, 4 July 2019 (UTC)
@Jklamo: your query times out, because it has a bug. Use this one instead: https://w.wiki/5i3 Be careful with OPTIONALs, and use LIMIT unless you're sure your query does the right thing (which happens after you checked it with LIMIT :) Smalyshev (WMF) (talk) 23:12, 5 July 2019 (UTC)
Thanks, it works! It's just that in case of a timeout, I would expect a timeout error as the result, not the one mentioned. Also, I am sure that these queries worked in the past. --Jklamo (talk) 16:18, 10 July 2019 (UTC)

P57 in the .json but not on the Wikidata page

So I have this movie; when I look up P57 on it, I find nothing: https://www.wikidata.org/wiki/Q15885473 But when I go to this (the same thing but in JSON): https://www.wikidata.org/wiki/Special:EntityData/Q15885473.json there is a P57 (and also a P577) in the claims. The P57 has no value, just "somevalue" as its "snaktype".

And since the P57 is not on the main page, I can't remove it and thus can't remove it from the .json.

 – The preceding unsigned comment was added by 2A01:E35:2E0B:C200:2851:145E:CF97:F483 (talk • contribs) at 2019-07-14 21:57 (UTC).

Thanks for noticing. I checked today and I don't see the P57 in the .json anymore; it probably took a few hours to be updated. Lea Lacroix (WMDE) 06:08, 15 July 2019 (UTC)
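
For anyone hitting the same thing later, this is roughly how such a snak appears in the entity JSON; a small sketch assuming the Python requests library (the User-Agent string is a placeholder):

import requests

HEADERS = {"User-Agent": "SnaktypeChecker/0.1 (mailto:you@example.org)"}  # placeholder
url = "https://www.wikidata.org/wiki/Special:EntityData/Q15885473.json"
data = requests.get(url, headers=HEADERS).json()

claims = data["entities"]["Q15885473"]["claims"]
for prop, statements in claims.items():
    for statement in statements:
        snaktype = statement["mainsnak"]["snaktype"]
        if snaktype != "value":
            # "somevalue" means "unknown value" and "novalue" means "no value";
            # neither carries a datavalue, which is why nothing is shown as a normal value.
            print(prop, snaktype)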

Notifications stay red/blue

Even when clicked, notifications stay red/blue for a day or two now.

It's a bit irritating, but I suppose one could just ignore or de-activate them. --- Jura 11:28, 19 July 2019 (UTC)

Please allocate additional slots for Magnus' tools

Please arrange to allocate more connections to Magnus' tools. Apparently they are now limited to 10 concurrent connections.

Numerous Wikidata editors are relying on them and can't use them anymore.

@Magnus Manske, Tagishsimon, Epìdosis, Lymantria: --- Jura 16:16, 4 July 2019 (UTC)

Yes, please. Lymantria (talk) 05:16, 5 July 2019 (UTC)
@Smalyshev (WMF): This doesn't ring a bell for me or the Wikidata team. Did your team change anything recently? Lea Lacroix (WMDE) (talk) 15:12, 5 July 2019 (UTC)
Not anything that can be related to the tools. If many people use the same tool, it probably could get over the limit since it's the same machine/agent sending repeated queries. Smalyshev (WMF) (talk) 23:02, 5 July 2019 (UTC)
  • How can we solve this? Alternatively, I suppose users could just run their own copy of the tools, possibly at a hidden location on Toolforge, so no one else gets to use the slots. --- Jura 08:14, 6 July 2019 (UTC)
Also, PetScan has been returning zero results in a lot of cases since the 3rd of June (see [1] and [2]), or delivers a "502 Bad Gateway" error. You have to start it several times (10-20) with the same parameters to eventually get a list with results. --M2k~dewiki (talk) 07:09, 6 July 2019 (UTC)
I don't know. Nothing has changed recently on the development team's side apart from the user-agent header requirement. I don't know if Magnus made any changes to the tools' code. Lea Lacroix (WMDE) (talk) 12:38, 10 July 2019 (UTC)
Probably there was an update of the operating system (Q9135) running those tools, setting the connection limit back to 10. --Succu (talk) 19:50, 11 July 2019 (UTC)
I asked @Magnus Manske: if he would know. --- Jura 09:57, 14 July 2019 (UTC)
I guess this is the number of SQL connections a tool account is allowed to have towards the (MariaDB) database servers, that's 20 BTW, see https://lists.wikimedia.org/pipermail/cloud-announce/2019-February/000138.html and https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database. Multichill (talk) 21:33, 21 July 2019 (UTC)
@BDavis_(WMF): can you increase it? --- Jura 22:44, 21 July 2019 (UTC)
@Jura1: Magnus is currently the maintainer of 132 separate tool accounts on Toolforge. Folks are going to have to get a lot more specific before anyone from the WMCS team can be of much help here. Nobody has named a specific tool or described a supporting service that is being concurrency limited. --BDavis (WMF) (talk) 01:34, 22 July 2019 (UTC)
@BDavis_(WMF): 132, that many? It's mainly https://petscan.wmflabs.org that affects Wikidata users here. --- Jura 20:08, 22 July 2019 (UTC)
@Jura1: We are getting closer to an actionable bug report. :) Do you know what external resource Petscan needs more concurrent connections to? Wiki Replicas? Wikidata Query Service? Action API? Petscan runs in its own Cloud VPS project, so scaling processor and ram should be under Magnus' control.
Has anyone tried to find some co-maintainers to help Magnus keep Petscan running smoothly? My team at the Foundation is staffed to help with issues in the underlying Cloud VPS/Toolforge infrastructure, but we do not have folks on staff to fix tools and projects directly. The closest thing the Wikimedia Foundation has today for doing that is the Community Tech team through their Wishlist projects. --BDavis (WMF) (talk) 21:23, 22 July 2019 (UTC)
This blog post by User:Magnus Manske might be of interest. He is currently re-writing PetScan, and he apparently "pools replica database access from several of my other HTML/JS-only tools (which do not use the database, but still get allocated connections)" to avoid running out of database connections. --MisterSynergy (talk) 10:13, 23 July 2019 (UTC)
@BDavis_(WMF): It used to work just fine until the number of connections was limited. Multichill identified that as being done by what you announced in https://lists.wikimedia.org/pipermail/cloud-announce/2019-February/000138.html . As it's something that is really not transparent to users like me, I can try to investigate further, but I'd think it's easier to do so on the developer side. That change has likely led to PetScan no longer working. --- Jura 03:43, 24 July 2019 (UTC)

This will be a non-issue once version 2 of PetScan goes into production. --Magnus Manske (talk) 10:47, 25 July 2019 (UTC)

Wikipedia for mobile devices

Dear developers,

1st Categories

On Wikipedia on mobile devices I cannot see categories as in the desktop (computer) view, so I suggest making it possible to see the categories between "Last edited" and "Related articles". I would also like to add or remove categories that way.

2nd Mobile edits

I also suggest that when editing on a mobile device (https://en.m.wikipedia.org/wiki/Internet_troll#/editor/0) I should be able to press "Show preview", "Show changes" and "Cancel" along with "Publish changes", and see the categories at the top of the page, to check whether a category exists or not when I edit or create a page on mobile devices (mobile view).

3rd Wikidata links

When I am on a mobile device (mobile view), I cannot add or remove language links, even when I use the BETA version (https://en.m.wikipedia.org/wiki/Internet_troll#/languages). I'd like you to add this to the BETA version or make it possible in the mobile view.

I hope that some of these problems can be solved or these suggestions can be implemented soon on Wikipedia, and/or that you can give me your answer. With all respect. Uspjeh je ključ života (talk) 11:17, 20 July 2019 (UTC)

This is probably something for CKoerner (WMF) :) Lea Lacroix (WMDE) (talk) 18:28, 21 July 2019 (UTC)

Graphs no longer working

Hello, for a few days this template has not been working when used with a Wikidata query: https://www.mediawiki.org/wiki/Template:Graph:Lines Can someone look into this? I've filed a Phabricator task here: https://phabricator.wikimedia.org/T226250 . Hope it will be solved… Thanks. Bouzinac (talk) 11:37, 22 June 2019 (UTC)

Same issue with Template:Graph:Pie chart used on 100+ pages here. It only works when previewing. Ayack (talk) 16:01, 24 June 2019 (UTC)
:-( Bouzinac (talk) 13:19, 27 June 2019 (UTC)

@Smalyshev (WMF): Is this something that could have been caused by a change on our side? Lea Lacroix (WMDE) (talk) 08:06, 2 July 2019 (UTC)

Same problem in Greek Wikipedia. Xaris333 (talk) 08:07, 10 July 2019 (UTC)

As per the comments on phab:T226250, it's unclear who is maintaining Graphoid, which makes the issue hard to fix. Lea Lacroix (WMDE) (talk) 07:12, 24 July 2019 (UTC)
Where do you read that? Who is handling the server logs? --- Jura 07:23, 24 July 2019 (UTC)
The ticket discusses who is maintaining the Graphoid service. It was unclear, but since some volunteers have offered to help, I hope there will be a positive outcome. Lea Lacroix (WMDE) (talk) 08:42, 24 July 2019 (UTC)
Very bizarre, as it can in fact work if written in another fashion, e.g. [3] Bouzinac (talk) 11:45, 24 July 2019 (UTC)
  • @Lydia Pintscher (WMDE): Can you pick this up? It's something that is broken on Wikidata itself, a WMF production website, for more than a month. Wouldn't it be up to the WMF to fix (or deactivate) it? If the WMF can't, maybe the WMDE Wikidata development team can help? --- Jura 11:45, 29 July 2019 (UTC)
Hello Jura,
The Wikidata team cannot take ownership of Graphoid, because we have never worked on it so far and we already have a lot on our plate. However, we will start investigating one of the possible issues (the user-agent header used to send requests to the Query Service) to help understand where the issue comes from. Lea Lacroix (WMDE) (talk) 15:29, 29 July 2019 (UTC)

Use a property as a link source

In a custom Wikibase installation I would like to use the text content of a property to build a link to a page, but it seems I am unable to.
Let's say I have a property whose text is Some Page: I tried to build a link by writing [[{{#statements:P1}}]] and expected the result to be the link Some Page, while in fact I am getting the plain text [[Some Page]].

I did several searches on the internet, but I failed to understand why this is not working and how to modify my approach to make it work. I assume I am making some fundamental error, but I am not sure what.
May I ask you to briefly explain the error to me, or to point me to the proper documentation to read to achieve this result?

Thanks --Lucamauri (talk) 15:39, 18 July 2019 (UTC)

I don't know if this method works, but I wouldn't do it because the returned item label may be different from the pagelink. Besides depending on the property there could be zero or more than one result. --Dipsacus fullonum (talk) 07:27, 22 July 2019 (UTC)
I can ensure that the label is exactly the same as the page name and that at least one result is always present, so these are not issues. Besides that, can you please explain how to do this technically? If there is another way to reference a page, kindly just let me know. Thanks. --Lucamauri (talk) 08:35, 22 July 2019 (UTC)
The safe way is to use a Lua module to access the pagelink so it can make links constructed like [[Page link|item label]]. In many projects the pagelink and the label will often be different due to disambiguation. The module can be called from a template, and if the module isn't used for anything else, it can be written in a few lines of Lua. --Dipsacus fullonum (talk) 19:15, 22 July 2019 (UTC)
PS. I made an example of a minimal module to create links from property values at https://da.wikipedia.org/w/index.php?title=Modul:Sandkasse&oldid=10008065 --Dipsacus fullonum (talk) 20:19, 22 July 2019 (UTC)
Thanks @Dipsacus fullonum:, I had a look at your example and adapted it to my own use, very helpful.
Allow me one last question: is a Lua module the only way to get this functionality, or just a convenient one? I mean, could it be done in any other way? --Lucamauri (talk) 10:40, 29 July 2019 (UTC)

Search for multiple (specific) types via wikibase:api "EntitySearch" running into 50 record limit?

I'm trying to let people search for anything, as long as it's something in popular culture like an actor, book, movie, sports team, etc. So if I search "Patriots" I should get the NFL team and the movie The Patriot. And I think I've gotten a good start with this:

SELECT distinct ?ordinal ?item ?itemLabel ?itemDescription ?image WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:api "EntitySearch";
                    wikibase:endpoint "www.wikidata.org";
                    mwapi:search "Patriots";
                    mwapi:limit 1000;
                    mwapi:language "en" .
    ?item wikibase:apiOutputItem mwapi:item .
    ?ordinal wikibase:apiOrdinal true .
  } 
  ?item wdt:P31/wdt:P279* ?type.
  OPTIONAL{?item wdt:P18 ?image .}
  FILTER( ?type in (wd:Q5, wd:Q17537576, wd:Q12973014))
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
} ORDER BY ASC (?ordinal) LIMIT 100
Try it!

I'm using the EntitySearch service so that I can get the :apiOrdinal value to help sort the results for relevance, and ?type contains the whole list of parent types so that I can just use "creative work" and "sports team" as catch-alls to filter on. (The FILTER list of Q types is incomplete, I was just starting to get the test going)

But, the problem happens when you replace "Patriots" with "Fellowship". In theory I should get the book and film Fellowship of the Ring. But the only result is for the "group" Fellowship of the Ring. If you take off the FILTER, you'll see it only returns 40 records from the EntitySearch for some reason? And if you change it to "Fellowship of the Ring" it will correctly find the book and film, so the EntitySearch can find those two. But for some reason it seems like EntitySearch is capping the results to a limit of 40?

Any ideas on what's going on?

Thanks! --Thomas.lumen (talk) 15:24, 12 July 2019 (UTC)

@Smalyshev (WMF): Any idea? Lea Lacroix (WMDE) (talk) 06:09, 15 July 2019 (UTC)
You may be seeing the effects of incomplete continuation implementation, which has been fixed in https://phabricator.wikimedia.org/T209034 but not deployed yet. Try next week after it's deployed, it may work better. If not, I'll take another look and see what's going on there. Smalyshev (WMF) (talk) 18:25, 18 July 2019 (UTC)
Thanks @Smalyshev (WMF): - Looks like the code was deployed, but if I comment out the FILTER line, and query "Fellowship" it still only returns ~45 results when there are definitely more than that. --Thomas.lumen (talk) 12:20, 24 July 2019 (UTC)
@Thomas.lumen, Smalyshev (WMF): I think MWAPI doesn’t support continuation for EntitySearch at all – in the source code, MWApiServiceCall.parseContinue() only looks at //api/continue in the response XML (with a “TODO support other options?”), but action=wbsearchentities returns a custom search-continue property. You can use the wbsearch generator instead:
SELECT DISTINCT ?ordinal ?item ?itemLabel ?itemDescription ?image WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "www.wikidata.org";
                    wikibase:api "Generator";
                    mwapi:generator "wbsearch";
                    mwapi:gwbssearch "Fellowship";
                    mwapi:gwbslanguage "en";
                    mwapi:gwbslimit "max".
    ?item wikibase:apiOutputItem mwapi:title.
    ?ordinal wikibase:apiOrdinal true.
  }
  ?item wdt:P31/wdt:P279* ?type.
  OPTIONAL{ ?item wdt:P18 ?image. }
  FILTER(?type IN (wd:Q5, wd:Q17537576, wd:Q12973014))
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
ORDER BY ASC(?ordinal)
LIMIT 100
Try it!
--Lucas Werkmeister (WMDE) (talk) 16:38, 29 July 2019 (UTC)
It'd probably be good to amend MWAPI to be able to deal with custom continuations... Smalyshev (WMF) (talk) 23:12, 29 July 2019 (UTC)
@Lucas Werkmeister (WMDE), Smalyshev (WMF): So is the issue that MWAPI needs to deal with continuations? Because when I run this new version of the query with wbsearch, I still hit the 50 record cap when I remove the FILTER and so "Fellowship" doesn't pull in the movie or book in the first 50 results that come back. --Thomas.lumen (talk) 17:11, 31 July 2019 (UTC)
@Thomas.lumen: Hm, I think I didn’t think this through – it looks like wbsearch(entities), when used as a generator, doesn’t support continuation either. I’ve filed T229460 to solve this. --Lucas Werkmeister (WMDE) (talk) 18:13, 31 July 2019 (UTC)