PropertyCreator cannot handle more than 60 requests/min

I tried using the property creator gadget to make a property, but it required more than 60 requests to fill in everything. Since the Wikidata rate limit is 60 requests/min, the script returned an error instead of waiting. RPI2026F1 (talk) 19:52, 28 December 2022 (UTC)

While I'm here I might as well add some other notes:
  • Property creator cannot handle redirects; it will crash.
  • A potential fix for the 60 req/min problem might be to batch the edits so that everything happens in one big edit. Alternatively, all the claims could be added at once, or all the references could be added at once (a rough sketch of the batching idea follows below).
RPI2026F1 (talk) 21:39, 28 December 2022 (UTC)
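(Not from the original thread - a minimal sketch of the batching idea mentioned above, assuming the gadget uses mw.Api. The label, property IDs and values are placeholders, not the gadget's actual code.)

    // Minimal sketch: instead of one wbcreateclaim request per statement,
    // every claim is bundled into a single wbeditentity call that also
    // creates the property itself.
    var api = new mw.Api();

    var data = {
        labels: { en: { language: 'en', value: 'example property' } }, // placeholder label
        datatype: 'wikibase-item',
        claims: [ {
            mainsnak: {
                snaktype: 'value',
                property: 'P31', // placeholder property
                datavalue: {
                    value: { 'entity-type': 'item', 'numeric-id': 4115189 }, // placeholder value
                    type: 'wikibase-entityid'
                }
            },
            type: 'statement',
            rank: 'normal'
            // qualifiers and references can be nested in each claim too,
            // so they land in the same edit
        } ]
    };

    api.postWithEditToken( {
        action: 'wbeditentity',
        'new': 'property',
        data: JSON.stringify( data ),
        summary: 'Create property with all claims in one edit'
    } );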
@RPI2026F1 hmm, I'm not sure how to handle those - I intended the script mostly for myself. If I have time I'll try to figure out batching, but I won't be able to get to it for a while DannyS712 (talk) 05:51, 29 December 2022 (UTC)
I'll take a look at the script and see if I can make some changes. RPI2026F1 (talk) 12:05, 29 December 2022 (UTC)
@DannyS712 here: User:RPI2026F1/PropertyCreator.js. I made modifications to the script so that it combines the claim creation into a single edit, so all qualifiers and references are created in one go. In addition, I added the ability to supply multiple values for the "Instance of" and "Subject type (domain) of this property" fields by separating the values with | (see the sketch below). RPI2026F1 (talk) 20:52, 29 December 2022 (UTC)
Previous comment refers to this version. RPI2026F1 (talk) 03:14, 30 December 2022 (UTC)
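(A rough illustration, not the actual script: how a "|"-separated field could be split into one value per claim; parsePipedValues is a hypothetical helper name.)

    // Hypothetical helper: split a "|"-separated field into individual values,
    // each of which then becomes its own claim (built as in the sketch above).
    function parsePipedValues( input ) {
        return input.split( '|' )
            .map( function ( value ) { return value.trim(); } )
            .filter( function ( value ) { return value !== ''; } );
    }

    // parsePipedValues( 'Q18616576 | Q19847637' ) -> [ 'Q18616576', 'Q19847637' ]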
@DannyS712 I've made further improvements, available at the latest revision of User:RPI2026F1/PropertyCreator.js. This time, the script makes all the claims in the initial creation and only adds the examples afterward, since it needs to know the property ID for those. The examples are still batched, so each example's mainsnak, qualifiers and references are saved in a single edit. Example property with this new version. RPI2026F1 (talk) 04:28, 30 December 2022 (UTC)
My goal was to stop the errors that previously happened when a step failed partway through and left a half-created property. With this change, either everything gets created or nothing gets created. RPI2026F1 (talk) 04:29, 30 December 2022 (UTC)
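(A sketch of the all-or-nothing flow described above, again assuming mw.Api; the data is placeholder only, and the follow-up step for the examples is left as a stub - the thread below discusses how it is actually done.)

    // Sketch of the two-step flow: one edit creates the property with all of
    // its claims, and the example statements are only added afterwards, once
    // the response reveals the new property ID.
    var api = new mw.Api();
    var propertyData = {
        labels: { en: { language: 'en', value: 'example property' } },
        datatype: 'string',
        claims: [] // every non-example claim goes here, so creation is atomic
    };

    // Hypothetical follow-up step: build and save the example statements that
    // refer to propertyId (the thread below discusses using wbsetclaim for this).
    function addExamples( propertyId ) {
        return Promise.resolve( propertyId ); // placeholder
    }

    api.postWithEditToken( {
        action: 'wbeditentity',
        'new': 'property',
        data: JSON.stringify( propertyData )
    } ).then( function ( result ) {
        var propertyId = result.entity.id; // e.g. 'P1234'
        // Only now can the examples, which refer to the new ID, be added.
        return addExamples( propertyId );
    } );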
@RPI2026F1 I looked into the changes (https://www.wikidata.org/wiki/Special:ComparePages?page1=&rev1=1790578891&page2=&rev2=1798372445&action=&unhide=) and I have a few questions:
  • why is a UUID v4 needed manually? Can you please add some documentation?
  • can you change sequentialPromises() to avoid async/await, and instead chain promises?
I'll also try to do some simplifications to the existing code to minimize the diff when I get a chance DannyS712 (talk) 00:39, 6 January 2023 (UTC)
The UUID v4 is needed because of how the Wikidata frontend adds a claim: it generates a new UUID for the claim. For some reason, when you provide a nonexistent UUID, Wikidata will create a new claim using that UUID. Also, sequentialPromises() can be deleted entirely since it's no longer used. RPI2026F1 (talk) 00:52, 6 January 2023 (UTC)
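(A sketch of the GUID behaviour described above, not the script's exact code: a statement GUID is "<entity id>$<UUID v4>", and passing wbsetclaim a GUID that does not exist yet creates a new statement with that GUID. crypto.randomUUID() stands in for the manual UUID generation; the property and value are placeholders.)

    // A statement GUID is '<entity id>$<UUID v4>'. If the GUID passed to
    // wbsetclaim does not exist yet, the API creates a new statement with it.
    var api = new mw.Api();
    var entityId = 'P1234'; // placeholder: the newly created property
    var guid = entityId + '$' + crypto.randomUUID(); // stand-in for manual UUID v4 generation

    var claim = {
        id: guid,
        type: 'statement',
        rank: 'normal',
        mainsnak: {
            snaktype: 'value',
            property: 'P1855', // placeholder property
            datavalue: {
                value: { 'entity-type': 'item', 'numeric-id': 4115189 }, // placeholder value
                type: 'wikibase-entityid'
            }
        }
        // qualifiers and references can be included in the same claim object,
        // so the whole example lands in one edit
    };

    api.postWithEditToken( {
        action: 'wbsetclaim',
        claim: JSON.stringify( claim )
    } );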
@RPI2026F1 I couldn't find much documentation regarding the `wbsetclaim` API action - do you have any documentation you can link to? DannyS712 (talk) 00:54, 6 January 2023 (UTC)
I didn't use any documentation when writing this logic. Instead, I opened my browser console and looked at the body of requests when editing statements on Wikidata Sandbox (Q4115189). RPI2026F1 (talk) 00:57, 6 January 2023 (UTC)
When it came to using wbeditentity, I looked at how Pywikibot was serializing claims. RPI2026F1 (talk) 00:58, 6 January 2023 (UTC)

@RPI2026F1: in that case, can you remove the `sequentialPromises()` function, and also can you link to the Pywikibot serialization code? Thanks, --DannyS712 (talk) 22:09, 12 January 2023 (UTC)

The code can be found at https://github.com/wikimedia/pywikibot/blob/57f208c4eeebb1486dd68ea93b18ebd58985d6c4/pywikibot/page/_wikibase.py#L1619.
    def toJSON(self) -> dict:
        """Create dict suitable for the MediaWiki API."""
        data = {
            'mainsnak': {
                'snaktype': self.snaktype,
                'property': self.getID()
            },
            'type': 'statement'
        }
        if hasattr(self, 'snak') and self.snak is not None:
            data['id'] = self.snak
        if hasattr(self, 'rank') and self.rank is not None:
            data['rank'] = self.rank
        if self.getSnakType() == 'value':
            data['mainsnak']['datatype'] = self.type
            data['mainsnak']['datavalue'] = self._formatDataValue()
        if self.isQualifier or self.isReference:
            data = data['mainsnak']
            if hasattr(self, 'hash') and self.hash is not None:
                data['hash'] = self.hash
        else:
            if self.qualifiers:
                data['qualifiers'] = {}
                data['qualifiers-order'] = list(self.qualifiers.keys())
                for prop, qualifiers in self.qualifiers.items():
                    for qualifier in qualifiers:
                        assert qualifier.isQualifier is True
                    data['qualifiers'][prop] = [
                        qualifier.toJSON() for qualifier in qualifiers]

            if self.sources:
                data['references'] = []
                for collection in self.sources:
                    reference = {
                        'snaks': {}, 'snaks-order': list(collection.keys())}
                    for prop, val in collection.items():
                        reference['snaks'][prop] = []
                        for source in val:
                            assert source.isReference is True
                            src_data = source.toJSON()
                            if 'hash' in src_data:
                                reference.setdefault('hash', src_data['hash'])
                                del src_data['hash']
                            reference['snaks'][prop].append(src_data)
                    data['references'].append(reference)
        return data
RPI2026F1 (talk) 15:17, 13 January 2023 (UTC)
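(For reference, a full statement serialized in that shape looks roughly like the following; the property IDs and values are made up, using string snaks for brevity.)

    // Approximate shape of a full statement in the format toJSON() produces
    // (made-up property IDs and string values):
    var exampleStatement = {
        mainsnak: {
            snaktype: 'value',
            property: 'P1',
            datatype: 'string',
            datavalue: { value: 'main value', type: 'string' }
        },
        type: 'statement',
        rank: 'normal',
        qualifiers: {
            P2: [ {
                snaktype: 'value',
                property: 'P2',
                datatype: 'string',
                datavalue: { value: 'qualifier value', type: 'string' }
            } ]
        },
        'qualifiers-order': [ 'P2' ],
        references: [ {
            snaks: {
                P3: [ {
                    snaktype: 'value',
                    property: 'P3',
                    datatype: 'string',
                    datavalue: { value: 'reference value', type: 'string' }
                } ]
            },
            'snaks-order': [ 'P3' ]
        } ]
    };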
@RPI2026F1 Done, synced your changes with a few modifications DannyS712 (talk) 01:15, 20 January 2023 (UTC)
Thanks! I'll adopt yours as well. Would you be open to further suggestions? I want to add some more fields but I'm not sure how to do that. RPI2026F1 (talk) 02:49, 20 January 2023 (UTC)
@RPI2026F1 not at the moment, because the overhead is pretty high for me, but if you want to make the suggestions I might be able to apply them sometime later DannyS712 (talk) 11:16, 20 January 2023 (UTC)

Request for undeletion of Wikidata Item

Hi DannyS712

Can you undelete item Q115559695 please?

This entry (for Jack Gilmore) is for one of the trustees of the charity Code the City (Q97908064). In fact, Jack is now the only one of all seven trustees going back to 2014 without a WD entry.

We'd just created Jack's item and hadn't had a chance to flesh it out in its entirety. He recently won Open UK's Young Person award 2022, but there isn't yet a property for that.

Code the City is a Wikimedia UK partner and we do editathons, training, hack events etc. with WMUK. All of CTC's events (26 hack weekends, plus the SODU annual unconference, etc.) are in WD along with dates, themes, number of attendees etc. - and we use that data for teaching WD querying.

Having this one gap in the charity data doesn't look good.

Happy to discuss further if you disagree with this request. Best wishes, Ian Watty62 (talk) 15:53, 5 January 2023 (UTC)

@Watty62 Q115559695 has been restored; please flesh out the item to avoid this situation in the future, and please try to catch these at RFD rather than after deletion has occurred --DannyS712 (talk) 20:25, 5 January 2023 (UTC)
I will say RfD needs a complete overhaul because currently it doesn't notify the author when an item they made is up for deletion. RPI2026F1 (talk) 00:59, 6 January 2023 (UTC)
Thanks, both. Yes, RPI2026F1 - I was surprised that I had no notice. Watty62 (talk) 19:12, 6 January 2023 (UTC)