Wikidata:Contact the development team
Regarding the accounts of the Wikidata development team, we have decided on the following rules:
On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2019/05.
I would like to have my name removed from Wikidata. I did not ask for it to be put there.
Changes to "wbgetentities" or "wbgetclaims"
Hi. I am developing a program which will get P31, P279 and P361 for a list of ids. However, neither of the two available API actions seems to fully accomplish what I need.
"wbgetentities" doesn't filter the results enough, e.g. to return only the properties I want. For 50 items, I got a 5 MB JSON response with the least amount of data I could select.
"wbgetclaims" doesn't allow me to choose 3 different properties, so it looks like I would have to make 3 requests per item, or make a request without specifying the property, which would also result in a big download in the end. It also doesn't seem ideal to make a huge number of requests, one per item (even if it were possible to get the 3 properties at once).
"wbgetclaims" example: https://www.wikidata.org/w/api.php?action=wbgetclaims&entity=Q42&property=P31&props=
This results in far more data than I want (and for just one property). It would be nice if it were possible to output only the "id" of each occurrence of each property.
Could the devs change one of these actions, or create a new one, to allow the user to request a list of ids and a list of properties?
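Pending any API change, the unwanted parts of the response can at least be stripped client-side. The sketch below assumes a `wbgetentities` response requested with `props=claims`; the helper name is my own, and the sample payload is abbreviated (real responses also carry qualifiers, references, hashes, etc.).

```python
# Sketch: keep only the target item ids (P31/P279/P361) from a
# wbgetentities entity requested with props=claims.

TARGET_PROPS = ("P31", "P279", "P361")

def extract_ids(entity: dict, props=TARGET_PROPS) -> dict:
    """Return {property: [item ids]} for the wanted properties only."""
    out = {}
    for prop in props:
        ids = []
        for claim in entity.get("claims", {}).get(prop, []):
            snak = claim.get("mainsnak", {})
            value = snak.get("datavalue", {}).get("value", {})
            if isinstance(value, dict) and "id" in value:
                ids.append(value["id"])  # skip novalue/somevalue snaks
        out[prop] = ids
    return out

# Abbreviated sample shaped like a wbgetentities claims payload:
sample = {
    "claims": {
        "P31": [{"mainsnak": {"datavalue": {"value": {"id": "Q5"}}}}],
        "P279": [],
    }
}

print(extract_ids(sample))  # → {'P31': ['Q5'], 'P279': [], 'P361': []}
```

This keeps one request per batch of up to 50 ids while discarding everything except the statement values.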
- @SrBrahma: I think you're asking for two different things here. Turning the property parameter from a single property into a list of properties should be possible and was in fact requested half a year ago already; see T206934. But I'm not sure about the other part, where you say the request results in way more data than you need. Do you mean that it should only return the plain statement values, without qualifiers/references or the details of the main snak? I don't think that matches the rest of the current API very well.
- However, you could also use the Wikidata query service for this – it lets you get items for a set of properties, and also uses a more concise data format by default. https://w.wiki/3H$ or https://w.wiki/3J2 would be two queries that do more or less what you want, I believe. --Lucas Werkmeister (WMDE) (talk) 15:43, 24 April 2019 (UTC)
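To illustrate the query-service route Lucas suggests, here is a sketch that builds such a request offline. The SPARQL text is an assumption in the spirit of the linked examples (it is not the exact content of the short links above), and the item ids are placeholders.

```python
from urllib.parse import urlencode

# Sketch: fetch the P31/P279/P361 values for a batch of items in one
# query-service request instead of many wbgetclaims calls.
query = """
SELECT ?item ?prop ?value WHERE {
  VALUES ?item { wd:Q42 wd:Q64 }
  VALUES ?prop { wdt:P31 wdt:P279 wdt:P361 }
  ?item ?prop ?value .
}
"""

# The endpoint accepts the query as a GET parameter; format=json asks
# for a compact JSON result instead of full entity JSON.
url = "https://query.wikidata.org/sparql?" + urlencode(
    {"query": query, "format": "json"}
)
print(url)
```

The JSON bindings returned this way only contain the matched values, which addresses the payload-size complaint above.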
Increase Wikidata Query Service timeout
- Hello Ayack,
- Could you give us a few examples of queries that timed out recently? Lea Lacroix (WMDE) (talk) 14:16, 10 May 2019 (UTC)
- Hello Lea, this one for example. I have to run it at least 15 times before having the results instead of a timeout. Ayack (talk) 16:01, 10 May 2019 (UTC)
- Here is another pretty simple one (category items without sitelinks). It used to work most of the time with execution times close to 60 seconds until February, and times out practically always since then. The absolute number of category items has not changed significantly since then; it is pretty constant at 4.3M. --MisterSynergy (talk) 16:33, 10 May 2019 (UTC)
Extract semantic terms?
Hello, I'm a student and I'm doing my degree work on Wikipedia. I would like to extract particular Wikipedia data with Wikidata; however, I am completely new to it and it's hard for me to use. I wondered whether Wikidata makes it possible to extract precise semantic terms such as "wife of". I thought I might be able to get some help on your page?
- Hello @Sarajeunie:,
- (we may as well speak French among francophones) This doesn't seem to me to be a question for the Wikidata development team, but allow me to offer my help.
- What exactly is the question? Listing all articles of Wikipedia (the French-language one, I assume) whose text contains the words « femme de »? If so, Wikidata is of no help. Depending on how precise you need to be, you can either do a search with the regular search engine, or, if you want more precise and exhaustive results, you will probably have to analyze the full XML dump (for example with AutoWikiBrowser's database scanner tool). Those are a few leads that I can expand on and explain as needed.
- Regards, VIGNERON (talk) 11:39, 13 May 2019 (UTC)
- Hello @VIGNERON:
- Thank you very much for your answer and your help. The question would indeed be to list all articles of the French-language Wikipedia whose text contains the words « femme de » and to extract the number of « femme de » occurrences from the articles. Thank you very much for your leads. I tried a search with the regular search engine, which gives me the number of results; however, it mixes up « femme de » and « épouse de », which I would also like to extract, but separately from « femme de ». AutoWikiBrowser seems like it would be perfect, but it only runs on Microsoft Windows and I have a Mac. Is there a way to run it some other way?
- Thank you very much for your help. --— Sarajeunie. (talk) 10:00, 15 May 2019 (CET)
- @Sarajeunie: I'm not sure the sequence is clear: Wikidata exists because this is hard to do directly in Wikipedia. Wikidata is partially built with such extractions from Wikipedia. --- Jura 10:01, 14 May 2019 (UTC)
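For the exact-phrase counting discussed above, the regular MediaWiki search API with CirrusSearch's `insource:` filter can distinguish « femme de » from « épouse de », since it matches the literal page source rather than the analyzed text. A sketch of building such a request (the count still has to be read from the JSON response):

```python
from urllib.parse import urlencode

# Build a French Wikipedia search API request for the literal phrase
# "femme de" in page source. srinfo=totalhits includes the total match
# count; srlimit=1 keeps the response small since only the count matters.
params = {
    "action": "query",
    "list": "search",
    "srsearch": 'insource:"femme de"',
    "srinfo": "totalhits",
    "srlimit": 1,
    "format": "json",
}
url = "https://fr.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

Running the same request with `insource:"épouse de"` gives the second count independently, which is what the thread asks for.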
Disconnect Wikidata item option for Special:MovePage (at Wikipedia)
Users at Wikipedia should have the option to disconnect the Wikidata item of a page when moving it. In some wikis it seems acceptable that pages are moved to something only vaguely related, in which case the sitelink on Wikidata shouldn't be updated. Unless the item is disconnected first, I think the sitelink update happens automatically. --- Jura 10:01, 14 May 2019 (UTC)
- This is a change that would happen on Wikipedia's interface. Since the Wikidata development team is not working on Wikipedia's codebase, I'm afraid there is not much we can do about it. Maybe this issue should be raised elsewhere, like on the community wishlist process or via a gadget on Wikipedias. Lea Lacroix (WMDE) (talk) 10:55, 15 May 2019 (UTC)
- Possibly. As it has a negative effect on data quality at Wikidata, maybe it's something we, as the Wikidata community, need to solve ourselves. Sorry for bothering you with this. --- Jura 11:51, 15 May 2019 (UTC)
Outdated constraint still reported
The value requires statement constraint (Q21510864) on item for this sense (P5137) was removed almost a month ago, but I still see a violation for it e. g. on L46038-S1. Is that a bug? —Galaktos (talk) 20:19, 14 May 2019 (UTC)
https:// prefix missing in WQS short url
In WQS, short urls are generated without the https:// prefix, which has to be added by hand every time you want to share the url or save it in Wikidata. Would it be possible to add it automatically, please? Thanks. Ayack (talk) 13:16, 17 May 2019 (UTC)
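Until the service adds the prefix itself, it can be restored client-side; a minimal sketch (the function name is mine, and `w.wiki/3H$` is just the short-url shape from the thread above):

```python
def normalize_short_url(url: str, scheme: str = "https") -> str:
    """Prepend a scheme to a scheme-less short url such as 'w.wiki/3H$'."""
    if "://" in url:
        return url  # already has a scheme; leave it alone
    return f"{scheme}://{url}"

print(normalize_short_url("w.wiki/3H$"))          # → https://w.wiki/3H$
print(normalize_short_url("https://w.wiki/3H$"))  # → https://w.wiki/3H$
```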