Babel user information
ru-N This user is a native speaker of Russian.
en-3 This user has advanced knowledge of English.
de-1 This user has basic knowledge of German.
fr-1 This user has basic knowledge of French.
Users by language

"Move fast and break things. Unless you are breaking stuff, you are not moving fast enough" (c)


Why am I here?

Most certainly you are here because you've spotted an incorrect edit of mine, like this or this.

Are you a Russian troll?

Contrary to some people's beliefs, I'm not; see the article on CBC News (Q2931014).

What should I do with your mistakes?

Please do not silently remove an incorrect claim. If I am not notified about my mistake, I will repeat it. So please:

If you want to do one action and forget about it, please revert my erroneous edit (or write on my talk page). I'll take it from there: find and correct the cause of the problem, and see if any other items are also affected.
If you want to help more, please do a little research to find the source of the problem; once you find it, please go ahead and correct it.
If you believe that the value of a certain property is unknown, it would be helpful if you specified this explicitly (see Help:Statements#Unknown or no values). I will respect this and will not update it in the future.

Importing unsourced claims from Wikipedia is bad, isn't it?

Some people certainly think so, but let me try to convince you that it's not. In an ideal world, where the majority of Wikidata claims had references to reliable sources, my imports would certainly be considered evil. We are nowhere near that yet: Wikidata is lacking billions of trivial facts. Some of those facts are easily extractable from infoboxes (see Harvest Templates). Others are expressed in the form of categories (and regularly loaded via PetScan).

The only difference in what I'm doing is scale. For every Harvest Templates run, you need to specify a pair of Wikidata property and infobox field on one Wikimedia project. Every time you run PetScan, you have to specify one or a few categories on one Wikimedia project. Every time I run my script, I'm using 300K+ category items that have sitelinks to some of 800+ Wikimedia projects.

As with any mass import, there is a certain amount of error associated with it. As you can see from my talk page and archives, I'm getting dozens of complaints, and I am reverting hundreds (if not thousands) of my erroneous edits. Overall I've made 5M+ edits, and at least 25% of them were performed using category imports.
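As a rough illustration of the category-driven approach described above (the query and its selection logic are my own assumptions, not the actual script), a run might start by selecting Wikimedia category items that have sitelinks:

```python
# Hypothetical sketch: build a SPARQL query that selects Wikimedia category
# items (Q4167836) having at least one sitelink -- the kind of item set a
# category-based import could iterate over. Property/item IDs are real
# Wikidata identifiers; the overall query is an illustrative assumption.
def category_items_query(limit: int = 100) -> str:
    return f"""
SELECT ?cat (COUNT(?sitelink) AS ?links) WHERE {{
  ?cat wdt:P31 wd:Q4167836 .    # instance of: Wikimedia category
  ?sitelink schema:about ?cat . # pages on client wikis linked to this item
}}
GROUP BY ?cat
LIMIT {limit}
"""
```

Such a query returns category items together with how many projects link to them; an import script could then walk each category's members on each linked project.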

How exactly do you create new statements?

Let me show you a real-life example. If

I add the corresponding statement there.

But you also update some statements, don't you?

Yes, I do. For example, if:

I narrow down the corresponding statement.

What tool are you using?

My own quick'n'dirty Python script, which runs paginated queries against the WD:WDQS + mw:MW2SPARQL federation endpoint.
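A minimal sketch of what such pagination might look like (this is my reconstruction, not the author's script; the endpoint URL is the public WDQS one, and the LIMIT/OFFSET paging scheme is an assumption):

```python
# Hedged sketch: paginating a SPARQL query against the public Wikidata
# Query Service using LIMIT/OFFSET, fetching pages until a short page
# signals the end of the result set.
import json
import urllib.parse
import urllib.request

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"  # public WDQS endpoint

def build_page_query(sparql: str, limit: int, offset: int) -> str:
    """Append LIMIT/OFFSET modifiers to fetch one page of results."""
    return f"{sparql}\nLIMIT {limit}\nOFFSET {offset}"

def run_paginated(sparql: str, page_size: int = 1000):
    """Yield JSON result bindings page by page (performs network I/O)."""
    offset = 0
    while True:
        url = WDQS_ENDPOINT + "?" + urllib.parse.urlencode(
            {"query": build_page_query(sparql, page_size, offset),
             "format": "json"})
        with urllib.request.urlopen(url) as resp:
            rows = json.load(resp)["results"]["bindings"]
        yield from rows
        if len(rows) < page_size:  # last (short) page: stop
            break
        offset += page_size
```

A real script would also need throttling and retry logic, since WDQS rate-limits heavy clients.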

I already perfected in WD all items that have statement XXX, why are you still importing them?

Because it helps to spot errors in Wikimedia projects and find candidates for merging. But if you believe it's not applicable to your case, let me know and I'll stop.

You do a lot of bot edits, why not request a bot flag?

Bot edits are considered low-risk by the community and are therefore reviewed less. My edits are by no means safe: I really want the community to scrutinize them, and I appreciate any input. Although I take some precautions not to flood anyone's watchlist, it sometimes happens, and I apologise for that.