There are three articles on huwiki that haven’t received their badge on Wikidata for days. I downloaded and ran the script, and it fails with ValueError: No JSON object could be decoded, each time in a different place (some runs even succeed). Could you please look into this?
Topic on User talk:DeltaBot
Do you have some sample traceback? There are several places in the script where JSON is parsed and it could be an upstream problem.
Traceback (most recent call last):
  File "./pwb.py", line 257, in <module>
    if not main():
  File "./pwb.py", line 250, in main
    run_python_file(filename, [filename] + args, argvu, file_package)
  File "./pwb.py", line 119, in run_python_file
    main_mod.__dict__)
  File "./scripts/badges.py", line 61, in <module>
    data = r.json()
  File "/home/user/.local/lib/python2.7/site-packages/requests/models.py", line 896, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
CRITICAL: Exiting due to uncaught exception <type 'exceptions.ValueError'>
I changed the script to comment out all the edit-related lines (initializing the site and getting the token at the top, using the token, and sending the edit API request) to make sure no accidental edits could happen, and I also added some debugging output; line 61 in my copy is line 58 in yours. Judging by the debug output, it stopped in the fourth enwiki adding turn (good articles), after trying to remove 346 good article badges from huwiki articles (that is all but one of the good articles on huwiki)… (By the way, don’t you have logs of the bot? Or does it fail silently on your end?)
Since I'm not running the bot, I cannot access its logs. At first glance, the problem could be the Python 2.7 version, but that is up to the maintainer. I also suspect it could be a problem in PetScan, perhaps an internal error or a query timeout.
Traceback (most recent call last):
  File "pwb.py", line 257, in <module>
    if not main():
  File "pwb.py", line 250, in main
    run_python_file(filename, [filename] + args, argvu, file_package)
  File "pwb.py", line 119, in run_python_file
    main_mod.__dict__)
  File "./scripts/badges.py", line 33, in <module>
    data = r.json()
  File "/usr/lib/python3/dist-packages/requests/models.py", line 897, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
CRITICAL: Exiting due to uncaught exception <class 'json.decoder.JSONDecodeError'>
It’s essentially the same; only some file paths and line numbers differ, including the line number in badges.py, which in this run failed in the first loop (badge removal), but, as I mentioned, the failure point changes every time (this time it was the third enwiki turn, featured articles). 3.5.3 is the latest stable version of Python 3 on Debian 9 “stretch”.
This happens when PetScan does not return any results, e.g. if the query times out: the response body is then empty (or an HTML error page), so r.json() fails to decode it.
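A minimal sketch of how the script could guard against this, instead of crashing on r.json(): parse the body defensively and treat an empty or non-JSON response as "no results", so the bot can skip that PetScan turn and retry later. The helper name parse_petscan_response is hypothetical, not part of the actual badges.py.

```python
import json


def parse_petscan_response(status_ok, body):
    """Parse a PetScan response body, tolerating timeouts.

    Returns the decoded JSON object, or None when the request failed,
    the body is empty, or the body is not valid JSON (e.g. an HTML
    error page produced by a query timeout).

    status_ok -- whether the HTTP request succeeded (e.g. r.ok)
    body      -- the raw response text (e.g. r.text)
    """
    if not status_ok or not body.strip():
        return None
    try:
        return json.loads(body)
    except ValueError:  # also catches json.JSONDecodeError on Python 3
        return None
```

The caller (hypothetically, around each `data = r.json()` site) would then check for None and skip or retry that badge run rather than letting the uncaught exception kill the whole bot, which matches the observed symptom of the failure moving between loops from run to run.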