Wikidata:Contact the development team/Archive/2021/04

This page is an archive. Please do not modify it. Use the current page, even to continue an old discussion.

Search is very intolerant of mis-spellings

I'm sure there may be an open ticket on this already, but it does seem to me that search is very very intolerant of mis-spellings.

For example, search for "p:constraint" and there are lots of hits for properties relating to constraints. Mis-spell it even only very slightly, eg search "p:constriant" and there is nothing.

For something like looking for properties this is just an annoyance. But where this can get really damaging is when people are looking for items -- because if they don't find what they're searching for (because we have a slight variant spelling) then they may start a new item -- and then we get into the nightmare of there being a duplicate on the system, and all the avoidable manual work that that leads to, having to try to find and chase down such items and fix them.

It would be really good to know that there was a plan for looking into this. Jheald (talk) 11:17, 24 February 2021 (UTC)

@Jheald: There's no ticket directly related to this yet. We'll run it by the WMF's Search Platform team (they are responsible for the various search features) in the coming week and let you know what plans they may have for looking into this. -Mohammed Sadat (WMDE) (talk) 18:59, 25 February 2021 (UTC)
@Jheald:, at the moment search on Wikidata does not have any spelling correction activated, and we don't have plans to enable it at this point. It is also unclear which search you are talking about. TitleSearch (as exposed in the on page search box, top right corner) does not support spelling correction (and it would probably not make much sense in this context). AdvancedSearch could be configured with some kind of spelling correction / fuzzy search, but that would require some amount of analysis to decide what kind of approach would make sense. Feel free to open a phab ticket (tag it with "discovery-search"). --GLederrey (WMF) (talk) 14:32, 3 March 2021 (UTC)
@GLederrey (WMF): Thanks for getting back to me. Sorry that I didn't see your reply until now.
Why is it that you believe that spelling correction in TitleSearch "would probably not make much sense in this context" ?
As a user, am I not just as likely to make a typo typing into TitleSearch as anywhere else?
In fact fuzzy search is more important on Wikidata than on the Wikipedias, because on Wikipedia, even if a (correctly-spelled) set of words does not appear in the title of an article, there is a good chance that it may appear in the body text. In contrast, on Wikidata there is no body text (or very little), so search on Wikidata needs to accommodate more fuzziness to be satisfactory. (Which it currently isn't.) Jheald (talk) 09:35, 17 March 2021 (UTC)
+1 I often have difficulty with this as I search lexemes and items in Swedish and English, and I create duplicate items almost every day because I simply cannot find what I'm looking for. Not seldom I give up on discovery search and search on DuckDuckGo instead, because it's just way more likely to find the item through that and Wikipedia. A fuzzy suggester that looks at all words in Wikipedia and suggests a few relevant items based on that would be a great improvement. We already have the data, let's use it!--So9q (talk) 04:34, 5 April 2021 (UTC)
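To illustrate the kind of tolerance being asked for in this thread, here is a toy sketch of edit-distance matching (plain Levenshtein distance, purely an illustration, not the actual CirrusSearch implementation):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def fuzzy_match(query: str, candidate: str, max_edits: int = 2) -> bool:
    """Accept a candidate whose spelling is within max_edits of the query."""
    return levenshtein(query.lower(), candidate.lower()) <= max_edits

# "constriant" differs from "constraint" only by a transposed letter pair,
# i.e. two substitutions under plain Levenshtein, so it matches at max_edits=2.
```

A real search backend would not scan candidates pairwise like this, but the example shows why even a small edit budget would catch the "p:constriant" case above.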

More than tests ... wikis!

Dear developers  ,
Test wikis also have a home page and other meta pages (rights, deletion, etc.). It's still not possible to link them to a WD Item. Example: [1] is not linked to any element. Warmest regards. —Eihel (talk) 12:07, 26 March 2021 (UTC)

Indeed, test wikis are not linked to Wikidata. This happens for various reasons, both technical (the wikis are not stored in the same clusters) and content/community: test wikis have a lot of test/fake content that randomly evolves. We don't think that it is relevant for readers to see these pages appear as wikilinks, and we are not sure that the Wikidata community wants to store information about test wikis on Wikidata. I hope this helps. Lea Lacroix (WMDE) (talk) 11:03, 29 March 2021 (UTC)
Rather, I think the majority of readers are primarily interested in "content" sitelinks and less in the small "Other sites" frame (meta, wikidata, mediawiki, etc.). But if there are also technical reasons… —Eihel (talk) 14:51, 3 April 2021 (UTC)

QS restrictions removed?

Users are editing with QS while maxlag > 5s, causing the well-known maxlag waves, with availability of maxlag < 5s about half the time and max edit rates > 200/min. --SCIdude (talk) 16:59, 14 March 2021 (UTC)

Please see Help:QuickStatements#Best_practices. It has to do with me and Gnoeee both being administrators. Administrators have noratelimit, which sometimes ends up in QS going haywire in terms of editing speed. --Wiki13 (talk) 17:23, 14 March 2021 (UTC)
That is only a minor aspect. I don't care if you abuse admin rights. Fact is that everyone is now editing while maxlag > 5s. Consequently I no longer feel obliged to restrain my bot. --SCIdude (talk) 17:37, 14 March 2021 (UTC)
@SCIdude: Not my experience. Have you any evidence? --Tagishsimon (talk) 17:56, 14 March 2021 (UTC)
And, though it should not need saying, if there is a problem with QS, that needs to be fixed. The solution is not for you to selfishly declare yourself above the rule and make the lag problem even worse. That's instant block territory. --Tagishsimon (talk) 18:04, 14 March 2021 (UTC)
@Tagishsimon: What do you mean, not my experience? You can see Grafana between 15.00 and 17.30 UTC. In several maxlag > 5s phases I looked at Recent changes and QS was editing happily. The 200/min edits have stopped now. --SCIdude (talk) 18:11, 14 March 2021 (UTC)
I mean that I make 1000s of QS batch edits, and my experience is that those edits respect the maxlag. QS will continue to edit when maxlag >5, but will reduce its rate. You possibly misapprehend the maxlag rules, idk. And, remember, each QS user is editing under their own account name. You are possibly mistaking the cumulative QS volume of many QS users as if that were the rate being achieved by a single user. --Tagishsimon (talk) 18:30, 14 March 2021 (UTC)

So, is the maxlag > 5s editing by the whole QS due to one admin editing? All the more reason to keep this at minimum. --SCIdude (talk) 18:20, 14 March 2021 (UTC)

Indeed. Seems sketchy. But it has always been quis custodiet ipsos custodes? --Tagishsimon (talk) 18:35, 14 March 2021 (UTC)
QuickStatements usually continues to edit, but at a slower rate. I think it targets "manual editing speeds" of the order of 10/min, but this goes wrong sometimes due to the complexity of the tool. This should prevent users from having the impression that their batches got stuck—which is not a concern for bot operation. QS has not been changed in the recent past, as far as I am aware. —MisterSynergy (talk) 19:02, 14 March 2021 (UTC)
If that target is 10/min, can you explain the max user edit rate of 50/min during maxlag > 5s? I checked that there was no bot other than QS having that edit rate. --SCIdude (talk) 08:04, 15 March 2021 (UTC)
Really, something must have changed. Today there is a maxlag > 5s phase from 06.15 to 07.00 UTC and the current phase also already lasts 40 min. My bot does nothing, QS is happily editing. This is unprecedented. --SCIdude (talk) 08:13, 15 March 2021 (UTC)
Yes. You are looking at a cumulative count of edits - the edits of all of the QuickStatements users at any time. Throttling is applied to the individual user. So, a worked example: maxlag=0, 4 users, each 90 edits per min = 360 edits per min show up on Grafana. maxlag=5, 4 users, each now throttled to, say, 45 edits per min = 180 edits per min show up on Grafana. It works exactly like wikibase-cli or pywiki or your bot: each instance of use is throttled. But if you sum all the users of a tool together, you come to the wrong conclusion which you're flogging here. Your assumption seems to be that all QS users should be treated as a single user and that the cumulative edit count for QS should drop to, say, 45 rather than 180 when maxlag=5. That is not how it works and it is not how it should work. You need to find examples of a user account making disproportionate QS edits during maxlag>=5, not merely continue misreading Grafana data. --Tagishsimon (talk) 09:12, 15 March 2021 (UTC)
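The worked example above can be put in a few lines; the point is that throttling applies per user while Grafana shows the sum over users. The halving factor is an illustrative assumption matching the numbers in the comment, not QuickStatements' actual throttle logic:

```python
def throttled_rate(per_user_rate: float, maxlag: float, threshold: float = 5.0) -> float:
    """Per-user edit rate after throttling: halved once maxlag crosses the threshold.

    The factor of two is an assumption for illustration only.
    """
    return per_user_rate / 2 if maxlag >= threshold else per_user_rate

def grafana_total(users: int, per_user_rate: float, maxlag: float) -> float:
    """Cumulative edits/min visible on Grafana: the sum over all active users."""
    return users * throttled_rate(per_user_rate, maxlag)

# 4 users at 90 edits/min: 360/min total at maxlag=0, 180/min total at maxlag=5,
# even though each individual user has halved their rate.
```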
You don't read what I say. I was talking about max user edit rate not cumulative. And you border on ad hominem. Ignored. --SCIdude (talk) 09:28, 15 March 2021 (UTC)
"La la la, can't hear you" does not actually work. You need to drop the mindset that your instance of your use of your bot should get as big a slice of the pie as all QS users taken together. Go back and read what I said again. It responds exactly to your question "If that target is 10/min, can you explain the max user edit rate of 50/min during maxlag > 5s?". I accept that being told you are labouring under a misunderstanding can seem harsh, but there we are: you are labouring under a misunderstanding. --Tagishsimon (talk) 09:33, 15 March 2021 (UTC)
Yes, sometimes individual users edit via QuickStatements way too quickly at times with maxlag>5. This is not new, it is a flaw in the tool, and the reason why I wrote "this goes wrong sometimes due to the complexity of the tool" above. The tool user does not need to worry about maxlag or monitor it, so there is little we can do other than urging Magnus to fix his tool. —MisterSynergy (talk) 10:57, 15 March 2021 (UTC)
@MisterSynergy: Is there an issue for this somewhere? I edit via WikidataIntegrator, which also respects maxlag; in WBI the edits pause and I'm informed that it will pause editing for x seconds and try again. It once took 30-60s for an edit to get through because of maxlag. What I'm seeing above seems like evidence that QS does not respect the x-retry-after headers and keeps editing despite getting warnings about maxlag. This is really not rocket science. QS should always run with maxlag=5 and respect all x-retry-after headers just like WBI. How hard can it be to inform the users in the interface when the editing has been temporarily paused because of maxlag?

See https://m.mediawiki.org/wiki/Manual:Maxlag_parameter, https://phabricator.wikimedia.org/T240442

According to https://m.mediawiki.org/wiki/API:Etiquette, tools like those I write with WBI that are interactive should not use maxlag, because the user is waiting for the edit to complete in order to continue. Is that correct also for Wikidata?--So9q (talk) 04:46, 5 April 2021 (UTC)
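For reference, the pause-and-retry behaviour described in this thread can be sketched roughly like this. The Retry-After header and the maxlag error body follow the Manual:Maxlag parameter page linked in this thread; the default wait of five seconds is an arbitrary choice for illustration, not WBI's or QS's actual logic:

```python
def maxlag_wait(headers, body, default_wait=5.0):
    """Return seconds to wait before retrying, or 0.0 if the edit may proceed.

    MediaWiki signals an exceeded maxlag either via a Retry-After HTTP header
    or via an API error object with code 'maxlag', which reports the current
    replication lag (in seconds) in the JSON body.
    """
    if "Retry-After" in headers:
        return float(headers["Retry-After"])
    error = (body or {}).get("error", {})
    if error.get("code") == "maxlag":
        # Wait at least default_wait, or the reported lag if it is larger.
        return max(float(error.get("lag", default_wait)), default_wait)
    return 0.0
```

A client loop would call this after each API response, sleep for the returned number of seconds, and retry; informing the user of the pause (as WBI does) is then a one-line addition.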
The QuickStatements tool is being developed by User:Magnus Manske. You need to get in contact with him regarding any issues (which is a bit difficult, unfortunately, as he seems to be flooded with requests sometimes).
WBI seems to be a bot framework by itself. Everything which is batched editing via this tool should respect maxlag. If you make individual edits through this tool that require your interaction for some reason, you can ignore maxlag—as long as you do not go quicker than ~10/min. —MisterSynergy (talk) 12:00, 5 April 2021 (UTC)
@MisterSynergy: Ok, then I will use maxlag=1 or something, because LexUse is pretty fast; you can rather easily add 12 in a minute. I think we should document this on a page like Wikidata:Maxlag and make it a policy. WDYT?--So9q (talk) 13:30, 5 April 2021 (UTC)
Go with maxlag=5 and you are fine. IMO there is no need for policy changes. —MisterSynergy (talk) 18:26, 5 April 2021 (UTC)
@So9q: See also this previous discussion and this bug report. Bovlb (talk) 18:44, 5 April 2021 (UTC)

Listeria code generated for SPARQL queries

Let me consider this query (https://w.wiki/3ACr). In this query, there are two variables. The output will be a table including people and their countries. However, when generating the Listeria code for this query in order to embed the output table in a Wikipedia page, I discovered that the columns are label:Article,description,p131:Place,P580,P582,p625,P18 regardless of the output of the query. However, the columns should be ?person, ?country. I propose to the developers of the Wikidata Query Service code generator to substitute the columns in the Listeria code for the queries with what comes after the first SELECT clause. When what comes after the SELECT clause is an asterisk (*), the main WHERE clause can be analyzed to find the variables for the query. --Csisc (talk) 22:44, 5 April 2021 (UTC)

This all seems a little confused to me. First, Listeria is not a WMDE product, but rather a tool developed and maintained by a user. Second, users of Listeria can, within very broad margins, decide what columns to include in the table. Third, by and large, Listeria requires only a single column named ?item to be returned by the SPARQL query, and will fetch values for labels/properties indicated in the columns= parameter without the labels/properties being included in the select. You have, for me, failed to demonstrate that you understand that which you're seeking to have changed, and failed to specify any problem whether or not worth fixing. --Tagishsimon (talk) 23:25, 5 April 2021 (UTC)
@Csisc: the code generated by the query service UI is only supposed to be a starting point for developers – I think it’s generally expected that users will have to adjust it a bit. The default |columns= are meant to be a potentially useful default, but you’re free to change them. I personally don’t think that making the generated Listeria wikitext take more of the query structure into account would be worth the development effort (so far all the code samples are generated from fairly simple templates). --Lucas Werkmeister (WMDE) (talk) 11:03, 6 April 2021 (UTC)
Tagishsimon: Listeria is more than a tool to provide structured information about a set of Wikidata items. It can reproduce the results of a SPARQL query in a Wikipedia page. Any table generated with SPARQL can theoretically be embedded to Wikipedia using Listeria. It is true that a variable called item is required. But, the table can involve other variables too. These have to be featured in a Wikidata list. --Csisc (talk) 11:56, 6 April 2021 (UTC)
Lucas Werkmeister (WMDE): The only difference is that the other code samples (Python, Java, JavaScript...) work even for complex queries. This is not the situation for Listeria. What I propose is a minor edit that copies the variables after the first SELECT into |columns=. You can keep the default value when an asterisk (*) follows the SELECT clause. This does not require much coding. You just have to add the variables after the default value. --Csisc (talk) 11:56, 6 April 2021 (UTC)
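The proposed change amounts to something like the following sketch (a simplification: real SPARQL SELECT clauses can also contain DISTINCT, expressions and AS aliases, and the WHERE keyword is optional, none of which this handles):

```python
import re

def select_vars(sparql: str):
    """Return variable names after the first SELECT, or None for 'SELECT *'.

    None signals that the caller should fall back to the default |columns=.
    """
    m = re.search(r"\bSELECT\b(.*?)\bWHERE\b", sparql, re.IGNORECASE | re.DOTALL)
    if not m:
        return None
    head = m.group(1)
    if "*" in head:
        return None
    return re.findall(r"[?$](\w+)", head)

columns = select_vars("SELECT ?person ?country WHERE { ?person wdt:P27 ?country }")
# → ['person', 'country'], which would become the |columns= value
```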
Sigh. Other than that it is still not a WMDE matter, and so this is not the right forum, copying the variables in the SELECT into the columns= parameter does not improve the tool, and does make the SPARQL more complex: the user would have to add a series of OPTIONAL {?item ?pred ?value} clauses for each variable, something that Listeria does for free. --Tagishsimon (talk) 12:33, 6 April 2021 (UTC)
@Tagishsimon: since this is about the listeria code samples generated by the Wikidata Query UI, which is maintained by WMDE, I do think this is the right place for the discussion. --Lucas Werkmeister (WMDE) (talk) 13:56, 6 April 2021 (UTC)
I stand corrected. Apologies, Csisc. --Tagishsimon (talk) 14:18, 6 April 2021 (UTC)

Some templates need the fallback for language variants

For example, the translation of Wikidata:Contact the development team/Header is not visible to the user when the language setting in the Preferences is a variant like "zh-hans-cn": {{int:lang}} resolves to zh-cn, but translation on this site only allows 'zh'. I also see other templates give messages like: please translate to Chinese (China).--YFdyh000 (talk) 19:15, 23 March 2021 (UTC)

@Amire80, Jon Harald Søby: I'm not sure I fully understand the problem, could you have a look? Thanks! Lea Lacroix (WMDE) (talk) 11:00, 29 March 2021 (UTC)
@Lea Lacroix (WMDE): The issue is two-fold: One, they're saying that translating things (with Special:Translate) into Chinese varieties doesn't work – if you try to translate the mentioned page into any language, you get the following message: [2] "至简体中文的翻译已禁用:Translate in zh please.". Whether that is by design or community consensus, I don't know. Two, even though the page has a Chinese translation, that is not shown when you have a Chinese variety selected as your language. Try either of these, compared to zh. Because of fallbacks, they should all have at least displayed the existing Chinese translation instead of English. Jon Harald Søby (talk) 15:09, 29 March 2021 (UTC)
Kaganer might have some tips on what's the best approach here. Jon Harald Søby (talk) 15:15, 29 March 2021 (UTC)
IMHO, there is a need to change the technology for internationalization, i.e. use a Lua module like commons:Module:Information, which uses mw.language.getFallbacksFor(userLang). Kaganer (talk) 18:19, 29 March 2021 (UTC)
@Nikerabbit: Might know which way is correct on resolving this issue. --Liuxinyu970226 (talk) 00:11, 11 April 2021 (UTC)
It seems this question is similar to https://www.mediawiki.org/wiki/Topic:W6lregpcmu785fze. The simple non-answer is that {{int:lang}} is a community creation which does not support language fallbacks, and that is something for the community to handle. Templates translated with the Translate extension can, since a few months ago, be transcluded using normal transclusion syntax, and that handles language fallbacks. The difference is that the language is determined by the content language of the page, not user language. These two ways could be integrated if MediaWiki had a way to mark a page to be of varying languages (following the user interface language), in which case our current support would just work out of the box. Possibly the only concrete thing here would be to discuss whether to lift the Wikimedia configuration setting that disables translation to Chinese variants in order to support cases like this. I would be very cautious about that. --Nikerabbit (talk) 07:32, 11 April 2021 (UTC)
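For illustration, the fallback handling discussed above boils down to walking a fallback chain until a translation exists. The chains below are assumptions modelled loosely on MediaWiki's zh variant handling, not the actual configuration:

```python
# Hypothetical fallback chains for illustration only.
FALLBACKS = {
    "zh-hans-cn": ["zh-cn", "zh-hans", "zh"],
    "zh-cn": ["zh-hans", "zh"],
}

def pick_translation(translations: dict, user_lang: str, default: str = "en") -> str:
    """Return the first available translation along the user's fallback chain."""
    for lang in [user_lang, *FALLBACKS.get(user_lang, []), default]:
        if lang in translations:
            return translations[lang]
    raise KeyError(user_lang)

page = {"en": "Contact the development team", "zh": "联系开发团队"}
# A zh-hans-cn reader falls through to the existing zh translation
# instead of landing on the English default.
```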

Inaccurate Coordinates for Passabe, East Timor

See Talk:Q2055747 for more details and an opportunity to resolve an inaccuracy. --Apisite (talk) 08:10, 30 March 2021 (UTC)

@Apisite: Feel free to edit the item to fix the inaccuracy. If you want to discuss individual data items or aspects of the Wikidata project policies, please do so at the Project chat. -Mohammed Sadat (WMDE) (talk) 17:09, 11 April 2021 (UTC)

Data type of Foundational Model of Anatomy ID (P1402)

I've noticed that the property is represented as a string, which seemed odd. I asked in the Telegram chat, and @ArthurPSmith: let me know that Foundational Model of Anatomy ID (P1402) was close to the 99% cutoff criterion in 2016 (https://www.wikidata.org/wiki/Wikidata:Identifier_migration/1 ) (thanks!). Now it has, by the counts, 79135 usages in 79094 items, which is (if I am correct) a uniqueness over 99.9%.

Would it be possible to change its data type to ID? Is there something I could do to help?

Thank you for the help!, TiagoLubiana (talk) 02:12, 13 April 2021 (UTC)

@TiagoLubiana: Actually that's not a measure of the unique aspect, but the degree to which it is single-valued. See the constraint violations report for P1402 - there are 64 "Unique value" violations and 41 "Single value" violations as of the time of that report. So yes, it's still good to 99.9% - nevertheless you might want to go through those violations and verify they should be fixed (or are real violations of this aspect of being an identifier). ArthurPSmith (talk) 17:28, 13 April 2021 (UTC)
@ArthurPSmith: Oh, I see, thank you for the clarification. I've checked a couple of them and they seem fixable. They will need a general clean up later (e.g., to account for the species-specificity of FMA and species-neutrality of terms like "brain"), but they do not seem like real violations. TiagoLubiana (talk) 17:43, 13 April 2021 (UTC)
  Support this - to be clear to the developers :) ArthurPSmith (talk) 20:42, 13 April 2021 (UTC)
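For what it's worth, the unique-value versus single-value distinction described above can be checked with a few lines; the sample statements here are invented, not the real P1402 data:

```python
from collections import Counter

def constraint_stats(pairs):
    """pairs: (item, external_id) statements.

    Returns (unique_violations, single_violations): extra items sharing an ID,
    and extra IDs on a single item, respectively.
    """
    by_id = Counter(ext_id for _, ext_id in pairs)    # same ID on several items
    by_item = Counter(item for item, _ in pairs)      # several IDs on one item
    unique_violations = sum(c - 1 for c in by_id.values() if c > 1)
    single_violations = sum(c - 1 for c in by_item.values() if c > 1)
    return unique_violations, single_violations

stats = constraint_stats([("Q1", "fma1"), ("Q2", "fma1"), ("Q3", "fma2"), ("Q3", "fma3")])
# → (1, 1): fma1 is duplicated across Q1 and Q2, and Q3 carries two IDs
```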

Election misinformation vulnerability

My country (Mexico) will hold democratic elections in June. As it is well known, a big problem of campaigning season is misinformation. I recently searched a political candidate's Wikipedia page on my mobile, and found that the description field had fake and harmful statements.

Here's a screenshot of it: https://ibb.co/n8PP49G

FYI: They're calling him a clown, saying he's homosexual (which is not true), and mocking him for singing badly.

I was confused when I saw this, since I know Wikipedia locks politicians' pages from being edited in elections season precisely to prevent this type of vandalism. So I researched and found out in this thread that Wikipedia's mobile version shows a description field which is pulled from Wikidata.

After finding out about this, I immediately logged into Wikidata and corrected the information.

Since Wikidata doesn't have restrictions to prevent election misinformation, anyone can maliciously modify the description field, and consequently, it will show up in Wikipedia's mobile version, which is dangerous considering Wikipedia's widespread use to get informed on elections.

Please take measures against election misinformation; I believe it has been proven worldwide that it can have catastrophic consequences.

In this case, I would suggest that during election season, editing of politicians' data either be blocked, or that the description field no longer be fed to Wikipedia's mobile version.

@Adan lopez alat: This thread does not concern the development team and would have been better raised at the Administrators' noticeboard. Our protection policy does not encourage pre-emptive protection of items, so we generally protect only when there is already a pattern of local vandalism. I do see vandalism from a couple of IPs, so I have given this item two weeks of semi-protection. Thanks for stepping in and fixing the vandalism. Bovlb (talk) 22:04, 19 April 2021 (UTC)

Entity suggestions from allowed qualifiers constraint

Somehow this doesn't seem to work as described at Help:Property constraints portal/Entity suggestions from constraint definitions.

Sample: clicking edit on Q306527#P6720 only suggests the formatter URL, not the others from the allowed-qualifier constraint on Property:P6720#P2302.

Can you double check and fix (the documentation or the feature), preferably the latter. --- Jura 10:02, 31 March 2021 (UTC)

Thanks for reporting it, I created a ticket so we can work on fixing the feature. -Mohammed Sadat (WMDE) (talk) 10:49, 20 April 2021 (UTC)

Restart WDumper (Q83952948) on toolforge.org

The tool seems to be stuck since January.

I left a note at User_talk:Bennofs, who I think wrote it, and there are also reports on GitHub. However, the user doesn't seem to have been active recently.

Can you liaise with whoever can restart it? It's a tool that might be a useful addition to Wikibase. --- Jura 20:21, 4 April 2021 (UTC)

Hello Jura, thanks for taking the effort to reach Bennofs, it seems to me that they have abandoned their tool. There's an Abandoned tool policy that directs how to deal with and restore such tools. Would you like to take this to the community and try and find someone interested in adopting and maintaining it? -Mohammed Sadat (WMDE) (talk) 11:11, 20 April 2021 (UTC)

BlazeGraph Support

I'm new to any of Wikimedia's work (other than being a user), meaning I have never contributed before. With that as the introduction:

I see that the main data store for all data in Wikidata is Blazegraph. I also see that this "was/is" open source and has not been updated since last year. The reason, it looks like, is the Blazegraph team's acquisition by Amazon (Neptune). I saw a "new" initiative of Wikidata (architecture) which is still showing Blazegraph (maybe a prototype, as I saw the team was experimenting). Given all this, are there team members/volunteers within the Wikidata ecosystem who can support the issues within Blazegraph (I mean the issues within Blazegraph itself)?

I'm posting on the development team page as I don't know any better.

Also, when I log in and try to access this page, why does it look different? I cannot share it here, as I would have to be logged in, and if I do log in I don't get this page :)

Thanks,

- Bhavani (BYerrapalli)

The following text was appended to an earlier discussion by the same user. I have moved it into this thread as it may be related. Bovlb (talk) 21:57, 19 April 2021 (UTC)

NEVER MIND THE MESSAGE BELOW, I THINK THIS IS THE USER GROUP OF MediaWiki, WHICH IS WHERE I NEED TO POST THIS MESSAGE. But the issue with the page still stands, and why is my IP address being displayed on the website when not logged in? I have no problem with it myself, though for most users, who probably don't understand that a machine's identity is being thrown out in the open, it could be a possible threat.  – The preceding unsigned comment was added by 171.61.87.236 (talk • contribs).

Blazegraph is the backing store for Wikidata Query Service. Wikidata itself is built on top of MediaWiki, which is backed by MariaDB for storage. All that being said, yes, Blazegraph is still part of the Wikidata ecosystem, and is not well maintained. The Search Platform team (the team at Wikimedia Foundation which is responsible for WDQS) has some limited expertise on Blazegraph. One of the things we need to be working on soon is to investigate potential alternatives to Blazegraph as an RDF data store. --GLederrey (WMF) (talk) 13:52, 20 April 2021 (UTC)

Property suggestions: do not suggest P646/P2671

When doing the monthly update of properties to be suggested, please remove:

* Freebase ID (P646)
* Google Knowledge Graph ID (P2671)

These are mostly added by bot and might just confuse users. The bot doesn't need the feature designed for manual input.

If you consider them helpful and think they should be included, please explain why. --- Jura 10:31, 18 April 2021 (UTC)

Indeed, they should not be showing up. We will look into it.-Mohammed Sadat (WMDE) (talk) 11:56, 21 April 2021 (UTC)

Sitelink not removed when the article is deleted

In Q4058121, the English article was deleted in 2019, but the sitelink is still in the item. On Project Chat it was suggested that this is a glitch, bringing it here in the (unlikely) case this is a systemic problem.--Ymblanter (talk) 18:43, 18 April 2021 (UTC)

@Ymblanter: Thank you for bringing this to our attention. The last time that this happened was several years ago, I started this ticket so we can keep track of similar occurrences and investigate if we notice other examples. -Mohammed Sadat (WMDE) (talk) 13:42, 21 April 2021 (UTC)
Thanks.--Ymblanter (talk) 19:04, 22 April 2021 (UTC)

There is a problem between the constraints depending on where the properties are used.

Hello Lucas Werkmeister (WMDE)  ,
Without my message, I thought it would be fixed after May 2018, step by step, but I haven't seen any post for this issue. subject type constraint (Q21503250) should only report an error when used in Wikibase item (Q29934200) and not in the rest of allowed-entity-types constraint (Q52004125). Obviously, constraint scope (P4680) does not work, because it is completely useless in all cases… Make other constraints (corresponding to Q21503250) for the rest of the values allowed by Q52004125 if you want, but restrict subject type constraint (Q21503250) to Items please. For example, there is also property scope constraint (Q53869507) which is problematic on the rest of the allowed-entity-types constraint (Q52004125) for the same reason. Cordially. —Eihel (talk) 06:33, 15 April 2021 (UTC)

Hi Eihel, I reread your message but your request is not clear to me. Are you saying that type constraints should only be checked on items, but that constraint should then not be checked if the same property is used on a lexeme for example? I'm also not sure which message from May 2018 you're referring to, so if you could link it here that'd be helpful. Thanks! -Mohammed Sadat (WMDE) (talk) 08:18, 26 April 2021 (UTC)

Administrator, I come to tell you that I tried to link the English page devil (Star Trek) with the Spanish page, but I don't know what is going wrong. Can you help me? It is in the article Devil's Due of Star Trek: The Next Generation.

Administrator, I come to tell you that I tried to link the English page devil (Star Trek) with the Spanish page Devil (Q101206419), but I don't know what is going wrong. Can you help me? It is in the article "Devil's Due" of Star Trek: The Next Generation, where it says Monster Devil. Thank you. ( 152.206.236.96 20:14, 22 April 2021 (UTC)).

You had better post your question on WD:PC. This question is not for the developers, but for the community. Mbch331 (talk) 18:56, 23 April 2021 (UTC)