Wikidata:Bot requests/Archive/2014/05

Townships in South Dakota merge job

This section was archived on a request by: Multichill (talk) 09:08, 29 May 2014 (UTC)

Looks like most articles in es:Categoría:Municipios de Dakota del Sur and vi:Thể loại:Xã thuộc tiểu bang Nam Dakota are not connected. Multichill (talk) 21:24, 27 May 2014 (UTC)

@Multichill:   Done (21 pages connected, 792 mergers + a few by hand); Fairfax (Q2002333) and Fairfax Township (Q5478318) could conflict. --Ricordisamoa 14:53, 28 May 2014 (UTC)
Nice! Thank you Ricordi. You might find more if you dig around a bit more in the related categories. Multichill (talk) 16:25, 28 May 2014 (UTC)
Nevermind. I clicked around a bit in the subcategories of vi:Thể loại:Xã của Hoa Kỳ and all seem to be connected. Multichill (talk) 16:35, 28 May 2014 (UTC)

Importing images from da.wikipedia

Hi,

not sure if this is possible or not, but there is a category on da.wikipedia with about 5,000 pages that have images where Wikidata does not have an image [1]. If these could be brought into Wikidata, that would be great. Sincerely, Taketa (talk) 21:22, 28 May 2014 (UTC)

Running. Multichill (talk) 09:04, 29 May 2014 (UTC)
  Done @Taketa: leftovers should be checked by hand. Multichill (talk) 10:13, 30 May 2014 (UTC)
This section was archived on a request by: --Pasleim (talk) 21:33, 27 June 2014 (UTC)

Connection of it.wiki categories

There are ~180 categories on it.wiki of the form Categoria:Trasporti nel <year> ("Transport in <year>", from it:Categoria:Trasporti nel 1830 to it:Categoria:Trasporti nel 2014), all without Wikidata items. Each one should be connected to the corresponding en.wiki page Category:<year> in transport (e.g. en:Category:1830 in transport). Would it be possible to do this with a bot? Thanks in advance--Dr Zimbu (talk) 20:29, 31 May 2014 (UTC)

@Dr Zimbu:   Done 185 changes. --Ricordisamoa 00:54, 1 June 2014 (UTC)
This section was archived on a request by: --Pasleim (talk) 21:31, 27 June 2014 (UTC)

Importing coordinates from Spanish Wikipedia

If possible, I would like to ask someone with a bot to import into Wikidata the coordinates from the pages in the categories w:es:Categoría:Wikipedia:Artículos con Ficha de entidad subnacional por trasladar coordenadas a Wikidata and w:es:Categoría:Wikipedia:Artículos por trasladar coordenadas a Wikidata. Thanks. --Agabi10 (talk) 14:22, 27 May 2014 (UTC)

Yes, I can import that. No problem. You probably want to update es:Módulo:Coordenadas to add a category like Category:Coordinates not on Wikidata (Q15181099).
Enabling it isn't that difficult. You just have to get es:Módulo:Coordenadas modified like en:Module:Coordinates (example). It checks whether the coordinates are displayed in the title and then looks for coordinate location (P625) to add the different tracking categories. Multichill (talk) 18:35, 27 May 2014 (UTC)
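For reference, the import side of this kind of request boils down to reading each page's primary coordinates via the GeoData API and copying them to coordinate location (P625) when the item has none. A minimal pywikibot sketch, not the script actually used here; precision and error handling are simplified:

    import pywikibot
    from pywikibot import pagegenerators

    site = pywikibot.Site('es', 'wikipedia')
    repo = site.data_repository()
    cat = pywikibot.Category(site, 'Categoría:Wikipedia:Artículos por trasladar coordenadas a Wikidata')

    for page in pagegenerators.CategorizedPageGenerator(cat):
        coord = page.coordinates(primary_only=True)  # primary coordinates via the GeoData API
        if coord is None:
            continue
        try:
            item = pywikibot.ItemPage.fromPage(page)
            item.get()
        except pywikibot.exceptions.NoPageError:
            continue  # the article has no Wikidata item yet
        if 'P625' in item.claims:
            continue  # Wikidata already has coordinates for this item
        claim = pywikibot.Claim(repo, 'P625')
        claim.setTarget(pywikibot.Coordinate(coord.lat, coord.lon,
                                             precision=0.0001,  # fixed precision for the sketch
                                             site=repo))
        item.addClaim(claim, summary='Importing coordinate location (P625) from eswiki')
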
  Done @Agabi10: a lot of the leftovers have invalid coordinates. Multichill (talk) 09:01, 29 May 2014 (UTC)
@Multichill: And what problem did the bot find that makes them invalid? I tried to add some of them and could do so without any problem. --Agabi10 (talk) 12:28, 29 May 2014 (UTC)
@Agabi10: see for example es:Chavezpamba (parroquia) -> "Coordenadas: segundos de latitud >= 60 {{#coordinates:}}: latitud no válida" ("Coordinates: seconds of latitude >= 60 {{#coordinates:}}: invalid latitude"). Multichill (talk) 12:53, 29 May 2014 (UTC)

@Multichill: OK, thanks for the example. I'll bring it up on the Spanish Wikipedia and, depending on what they say, I'll close this request. Thank you. -- Agabi10 (talk) 14:23, 29 May 2014 (UTC)

@Multichill: They told me they'll try to repair them with a bot where possible. When it's done, I'll let you know on your user talk page so you can run your bot again.
This section was archived on a request by: Agabi10 (talk) 10:26, 1 July 2014 (UTC)

"country of citizenship" <-- "place of birth"

Would it be possible to create a bot to add country of citizenship (P27) based on the value of place of birth (P19)? For example, I added "country of citizenship: Brazil" to an item where "place of birth: <some city for which 'country: Brazil'>". Helder.wiki 13:28, 24 May 2014 (UTC)

I would be very cautious with that. In many countries, you do not automatically get citizenship just because you were born in the country. --Zolo (talk) 14:23, 24 May 2014 (UTC)
Not only do many countries not have jus soli (Q604971), but even countries that have it sometimes have exceptions: For example, a child of a foreign diplomat is often exempt; the child is either not eligible for jus soli, or is presumed not to have local citizenship unless the child or parents assert that the child wants to retain it. A more common problem is that people sometimes renounce citizenship rather than retaining multiple citizenship (Q756296); I suspect this happens pretty often with notable people that have moved to other countries. Some other people lose citizenship against their will: see statelessness (Q223050). --Closeapple (talk) 00:44, 27 May 2014 (UTC)
This section was archived on a request by:
  Not done per Zolo and Closeapple. --Ricordisamoa 16:41, 17 July 2014 (UTC)

Add IMSLP ID property

Dear all,

In the English Wikipedia, there is a nice template, w:Template:IMSLP.

It is used like this:

    {{IMSLP|id=Dvořák, Antonín|cname=Antonín Dvořák}}

or

    {{IMSLP|Dvořák, Antonín}}

In the corresponding Wikidata item it should become

IMSLP ID (P839): Dvořák, Antonín

And it's done!

Thanks

(by User:Xmlizer, 30 March 2014, 22:53)

Note: for IMSLP ID (P839) we would import values like "Category:Dvořák, Antonín" etc., but en:Antonín Dvořák, for example, is outdated. (No bot import is possible in this case.) --Kolja21 (talk) 00:02, 31 March 2014 (UTC)
Indeed, it was a very bad example. Please use all the others: https://en.wikipedia.org/w/index.php?title=Special:WhatLinksHere/Template:IMSLP&limit=1000 Xmlizer (talk) 22:27, 1 April 2014 (UTC)
How can I know that en:Antonín Dvořák was the only outdated article? --Pasleim (talk) 12:09, 6 May 2014 (UTC)
Pasleim: Maybe you can spot them with the TemplateTiger? Also Template:IMSLP2 (Q7532243) should be imported (for works).--Micru (talk) 13:50, 6 May 2014 (UTC)
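One way such a template-to-property import could be sketched, using pywikibot and mwparserfromhell; the handling of the named "id" parameter and the unnamed first parameter follows only the two usage examples above, and per Kolja21's note, harvested values would still need spot-checking against IMSLP before a real run:

    import mwparserfromhell
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    repo = site.data_repository()
    tpl = pywikibot.Page(site, 'Template:IMSLP')

    for page in tpl.embeddedin(namespaces=[0]):
        try:
            item = pywikibot.ItemPage.fromPage(page)
            item.get()
        except pywikibot.exceptions.NoPageError:
            continue  # article without a Wikidata item
        if 'P839' in item.claims:
            continue
        for template in mwparserfromhell.parse(page.text).filter_templates():
            if not template.name.matches('IMSLP'):
                continue
            if template.has('id'):
                value = str(template.get('id').value).strip()
            elif template.has('1'):
                value = str(template.get('1').value).strip()
            else:
                continue
            claim = pywikibot.Claim(repo, 'P839')
            claim.setTarget(value)  # P839 is a string property
            item.addClaim(claim, summary='Importing IMSLP ID (P839) from the enwiki template')
            break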

IMDb

fr:Modèle:Imdb nom and de:Vorlage:IMDb Name could be used for IMDb ID (P345) ----- Jura 03:48, 10 May 2014 (UTC)

Harvest defunct data from Wikipedia

Hello, over at Wikipedia Wikiproject Video games, we need to clean up our article code and remove a hell of a lot of redundant data.

Since the project started, a number of template fields have been added and removed from the main article template. We now have 15 defunct fields that still appear in the code of some article pages, and all of this old data is getting in the way and confusing new users (they copy template code from existing articles only to find that some template fields aren't working after they have populated them with data).

Initially we were just going to delete the data, but a request to save it for Wikidata was made, which is why I'm here.

In order to aid all users (especially new ones) in editing the infobox code, and at the same time preserve the data, we need a bot run by Wikidata to harvest and then remove the defunct data fields.

As we have over 11,000 articles that need this process carried out on them, a bot really is the only way of collecting this data.

We have a tracking category that lists every article that needs editing.

The discussions around this job are at the following:

1

2

I've also put the details in a table to make things easier to read.

It's not going to be easy, as some of the fields contain users' own unique - and sometimes differing - formatting styles, but we know you'll find a way to cope with it. Hope you can help. - X201 (talk) 08:25, 10 April 2014 (UTC)

Sorry to press for a reply, but we need to start deleting these; probably next month. Could someone tell me if this can/will be done, or if collecting from the article edit history is just as easy for you? - X201 (talk) 13:07, 17 April 2014 (UTC)
@X201: I'm importing properties from the following fields: input → input device (P479), license → copyright license (P275), version → software version identifier (P348), 'preceded by' → follows (P155), 'followed by' → followed by (P156), website → official website (P856).--Underlying lk (talk) 06:30, 4 May 2014 (UTC)
Thanks Underlying lk! Release date(s) would also be valuable, as maintaining them locally is a pain. From some past discussions I gather one can use significant event (P793) with appropriate qualifiers, but publication date (P577) may be of use too, I don't know. --Nemo 07:25, 4 May 2014 (UTC)
@Underlying lk: Great, thanks (and indeed, great thanks). Could you ping me to let me know when you're done, please? I can get BotReq on Wikipedia to start the clean-up over there after your run is complete. Does anyone think any of the other defunct fields are of use to Wikidata? - X201 (talk) 09:50, 5 May 2014 (UTC)
@X201:   Done, let me know if more fields need to be imported.--Underlying lk (talk) 06:17, 6 May 2014 (UTC)
@Underlying lk: Thanks. I have no idea what fields need importing as I'm not from round these parts; I'm just here to alert Wikidata to the opportunity to salvage some of it before it's removed from Wikipedia. Personally, the only other field I can think Wikidata might find useful is the Ratings field, but I don't know if anyone would want you to import it. - X201 (talk) 15:46, 7 May 2014 (UTC)
@Nemo_bis: I already imported publication date (P577) from all the video game infoboxes on the English Wikipedia.--Underlying lk (talk) 06:17, 6 May 2014 (UTC)

If anyone wants any of the other fields importing, speak now, because we're going to start removing them from Wikipedia next week. - X201 (talk) 11:48, 12 May 2014 (UTC)

date of birth

For Q1062402, there is the date of birth in de:Template:Personendaten. Maybe others could be imported as well. --- Jura 08:10, 10 May 2014 (UTC)

Place of birth, date of death, and place of death could probably be imported from there as well. ----- Jura 17:02, 10 May 2014 (UTC)

Stagnating fall of 0-statement items

The reduction of 0-statement items has been stagnating in the last months. The latest statistics even show them growing again (http://tools.wmflabs.org/wikidata-todo/stats.php - see "Statements per item"). I was thinking that we should try to add at least one statement to those 5 million items by the end of the year. A translation in one of the major European languages would also be helpful. A good way of accessing 0-statement items is this tool (http://tools.wmflabs.org/wikidata-todo/important_blank_items.php). Some items are fairly exotic, but there would be opportunity for bot edits. Here are some of them:

  • IC 2879 (Q3688473) - A lot of items labelled "IC 1234". Some information could be pulled from e.g. sr-wiki infobox on astronomical objects.
  • Photoisomerase (Q7316766) - It seems that the en-wiki infobox for enzymes has not been copied by bots. Fetching the CAS-number would be helpful so that Chemistry and Molecular-Biology WikiProjects could work on constraint violations.
  • NGC 930 (Q666669) - Many items labelled "NGC 12345" also refer to astronomical objects and have a variety of infoboxes that could be matched against each other.
  • Taxtakəran (Q3674627), Udovičić (Q11185246) - Some countries still have geographic infoboxes that were not acquired yet.
  • Highway H23 (Q1961843) - Some wikis still have road infoboxes that have not been acquired yet.

Adding one or two unsourced statements goes along with the whole Widar trend. I still think it is helpful to do a rough categorization (even with missing sources), because it becomes easier for people to find the items they want to improve with sources. Also, we shouldn't be ignoring a third of our items. Tobias1984 (talk) 13:54, 12 May 2014 (UTC)
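As a concrete illustration, a batch run adding one classifying statement could look like the sketch below; TARGET_QID is a deliberate placeholder to be chosen per batch (e.g. the class matching the harvested infobox), and the example QID is simply taken from the list above:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()
    TARGET_QID = 'Q0'        # placeholder: the class item chosen for the batch
    item_ids = ['Q3688473']  # items to process, e.g. taken from the tool's list

    for qid in item_ids:
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        if item.claims:
            continue  # only touch items that really have zero statements
        claim = pywikibot.Claim(repo, 'P31')
        claim.setTarget(pywikibot.ItemPage(repo, TARGET_QID))
        item.addClaim(claim, summary='Adding a first instance of (P31) statement')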

NLA Persistent Identifier

New property: NLA Trove people ID (P1315).

It would be nice if a bot took a look at items with Libraries Australia ID (P409) and added the new property. --Kolja21 (talk) 18:50, 17 May 2014 (UTC)

Updating category and template labels

Is there a bot operator whose bot would repeatedly process the following task?

When a category/template link is moved, the label should also be moved (i.e. deleted and recreated elsewhere). This isn't done often, though. I want the bot to (1) delete labels which are/should be placed elsewhere (regardless of the capitalisation of the first letter or of the letter after the first colon) and (2) update the labels of items which are linked to categories with changed titles. Matěj Suchánek (talk) 12:28, 30 March 2014 (UTC)
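A possible sketch of part (2) with pywikibot, assuming the affected language is known; part (1), deleting labels that belong elsewhere, needs extra safeguards and is left out:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

    def sync_category_label(qid, lang):
        """Make the item's label in `lang` match its (possibly renamed) category sitelink."""
        site = pywikibot.Site(lang, 'wikipedia')
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        if site.dbName() not in item.sitelinks:
            return
        title = item.getSitelink(site)  # e.g. the new title after a page move
        if item.labels.get(lang) != title:
            item.editLabels({lang: title},
                            summary='Updating label to match the moved category title')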

ping @Akkakk:, your bot is working on labels, could you please take a look on this? Matěj Suchánek (talk) 16:23, 25 May 2014 (UTC)
My bot doesn't watch recent changes, but gets items from the database, so it's more than just a small change to the code. I won't do that at the moment. (The code is published, so maybe someone else wants to take this task.)--Akkakk 17:18, 25 May 2014 (UTC)
My English is very interesting... I mean a regularly running bot, not one watching RC. Matěj Suchánek (talk) 19:32, 25 May 2014 (UTC)

Fix italian born lists

There are a number of items referencing itwiki with names like "Nati nel <number>" ("Born in <number>"). I suggest that the following be done:

Manual example (no merge): [2],[3] GranD (talk) 16:41, 20 May 2014 (UTC)

The relevant categories to get all such lists are w:it:Liste di morti nell'anno‎ (deaths) and w:it:Liste di nati nell'anno‎ (births). --Nemo 15:46, 23 May 2014 (UTC)

Estonian biographical dictionary

I see a lot of et.wiki articles with w:et:Mall:ETBL that lack properties on Wikidata. It's a tag for content from a biographical dictionary, so it seems safe to add instance of (P31) > human (Q5) to all of them. --Nemo 15:42, 23 May 2014 (UTC)
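A rough pywikibot sketch of what such a run could look like, assuming the request as stated (only articles transcluding the template, and only items without an existing P31):

    import pywikibot

    site = pywikibot.Site('et', 'wikipedia')
    repo = site.data_repository()
    template = pywikibot.Page(site, 'Mall:ETBL')

    for page in template.embeddedin(namespaces=[0]):
        try:
            item = pywikibot.ItemPage.fromPage(page)
            item.get()
        except pywikibot.exceptions.NoPageError:
            continue  # no item yet for this article
        if 'P31' in item.claims:
            continue
        claim = pywikibot.Claim(repo, 'P31')
        claim.setTarget(pywikibot.ItemPage(repo, 'Q5'))
        item.addClaim(claim, summary='instance of human (Q5): article tagged with Mall:ETBL')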

Remove sources for categories, templates, portal, etc items

I think it would be a good idea to remove the source for Property:P31 values of category, template, portal, etc. items. Of course, it would be a good idea to check that all links are from the particular namespace. --EugeneZelenko (talk) 15:02, 6 March 2014 (UTC)

This weekend my bot will work on categories to fix an old error. If no one opposes, I can delete the source of instance of (P31) on categories. I can also do it for templates and portals. --ValterVB (talk) 19:33, 6 March 2014 (UTC)
Please note that I asked to delete the source, not the property value itself. --EugeneZelenko (talk) 15:19, 7 March 2014 (UTC)
Of course, I mean deleting the source of instance of (P31) :) --ValterVB (talk) 16:01, 7 March 2014 (UTC)
Why do you think this would be a good idea?
First I get shouted at for not including the source and now you want to remove it? No, let's not do this. @ValterVB: please stop doing this. Multichill (talk) 11:42, 8 March 2014 (UTC)
@Multichill: For now I remove only a wrong source: example (my old error). If there isn't consensus, I don't delete source. --ValterVB (talk) 12:29, 8 March 2014 (UTC)
Support deleting. We don't need sources for categories, templates and project pages etc. It is clear that they've been imported from Wikipedia, so what's the sense of adding a source to say this? --Stryn (talk) 12:23, 8 March 2014 (UTC)
  Strong oppose, sources are useful to detect errors. For example, if a bot adds P31=disambiguation page to an item which is not linked to a disambiguation page on enwiki, the source is useful for working out in which language the page is a disambiguation. And it's also useful to detect this problem (now cleaned).--GZWDer (talk) 12:30, 8 March 2014 (UTC)
We need a dedicated check for disambiguation status in the projects directly. A source will not help. --EugeneZelenko (talk) 15:50, 8 March 2014 (UTC)
  •   Oppose per GZWDer: Sources aren't just for verification of correct data; they help users track down the reasons for bad data. I have trouble understanding why someone would want to remove sources en masse, even if it's a trivial source. Removing wrong sources is OK; removing true sources is not OK. --Closeapple (talk) 03:04, 27 May 2014 (UTC)

Rotten Tomatoes

For Rotten Tomatoes ID (P1258), there is Template:Rotten Tomatoes (Q5615409). ----- Jura 07:47, 10 May 2014 (UTC)

When running a bot for this, remember that sometimes Wikipedia articles contain Rotten Tomatoes references (or other database references) to different movies or people, besides the main topic of the article. --Closeapple (talk) 03:44, 28 May 2014 (UTC)

link enwiki article to items

See User:GZWDer/temp16. Note:

  1. There are some false positives in the "Genus and other" section.
  2. There may also be some homonymous species in the "Species" section.
  3. If there is more than one item after a link, all such items should be merged too.

--GZWDer (talk) 10:54, 29 May 2014 (UTC)

@GZWDer: how reliable are those data? --Ricordisamoa 00:49, 1 June 2014 (UTC)
@Ricordisamoa: These data come only from page titles and labels/aliases. To be safe, please only link items in the "Species" section that have at least one sitelink whose page name is the same as the page name on enwiki. The "Genus and other" section should not be done for the time being.--GZWDer (talk) 08:29, 1 June 2014 (UTC)
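The safety check described above could be sketched like this with pywikibot; how the (enwiki title, item) pairs are read from the list page is left out, and the dbname-to-site lookup is a simplification:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

    def link_if_safe(enwiki_title, qid):
        """Attach the enwiki sitelink only if the item already has a sitelink with the same title."""
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        if 'enwiki' in item.sitelinks:
            return False  # already linked to some enwiki page
        for dbname in item.sitelinks:
            other_site = pywikibot.site.APISite.fromDBName(dbname)
            if item.getSitelink(other_site) == enwiki_title:
                item.setSitelink({'site': 'enwiki', 'title': enwiki_title},
                                 summary='Linking enwiki article with matching page name')
                return True
        return False
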
I can handle the species section, but this has to wait until I have reported all of yesterday's merges. The rest is a little more problematic, but I have some ideas. --Succu (talk) 08:36, 1 June 2014 (UTC)
  Doing… the "Species" section. Skipping Q286251 since it matches "Coloborhynchus araripensis" and "Coloborhynchus clavirostris". --Ricordisamoa 12:17, 1 June 2014 (UTC)
This section was archived on a request by: --Pasleim (talk) 13:14, 11 November 2014 (UTC)

Remove descriptions consisting of language names only

Based on Wikidata:Bot requests/Archive/2014/01#Remove English descriptions consisting of language names only. After three months, we expanded the filter a bit. Is it possible to run a bot again for all descriptions and aliases (for labels there are false positives)? Matěj Suchánek (talk) 10:50, 8 May 2014 (UTC)
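In case it helps a future run, the core of the task can be sketched as follows; the set of language names below is only an illustrative subset, and the actual run used a larger regex:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()
    LANGUAGE_NAMES = {'english', 'deutsch', 'français', 'español', 'italiano'}  # illustrative subset

    def strip_language_name_description(qid, lang):
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        desc = item.descriptions.get(lang, '')
        if desc.strip().lower() in LANGUAGE_NAMES:
            # an empty string removes the description; aliases would be
            # handled analogously with item.editAliases()
            item.editDescriptions({lang: ''},
                                  summary='Removing description consisting of a language name only')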

ping @Bene*:, you had done this task. Matěj Suchánek (talk) 16:18, 25 May 2014 (UTC)

Akkakk, your bot has been working on labels etc. recently. Could you please take a look at the task? Matěj Suchánek (talk) 18:48, 10 June 2014 (UTC)

  Done did it with a slightly modified regex to avoid false positives. --Pasleim (talk) 10:04, 17 November 2014 (UTC)
This section was archived on a request by: --Pasleim (talk) 18:42, 6 December 2014 (UTC)

Importing coordinates from Polish Wikipedia

I've noticed that geographical coordinates have already been imported to Wikidata from most language versions of Wikipedia, but I can hardly see any imported from my own home wiki, which is the Polish Wikipedia. We've got loads of articles with coordinates, so I think it would be really beneficial to bring them all to Wikidata. Thank you in advance. Powerek38 (talk) 08:50, 24 May 2014 (UTC)

Hi Powerek38, I've been importing coordinates from several Wikipedias. Maybe you could set up Wikidata:Coordinates tracking at the Polish Wikipedia? After that I'm more than happy to do the import. Multichill (talk) 09:06, 24 May 2014 (UTC)
Thanks a lot Multichill. I'm not really a very technical user, so I've just set up a discussion in our coordinates wikiproject so that my more able colleagues can check whether we can meet this requirement and how to do it. Feel free to join in if you have any hints; all members of that project understand English with no problems. Powerek38 (talk) 09:28, 24 May 2014 (UTC)
@Powerek38: Enabling it isn't that difficult. You just have to get pl:Moduł:Koordynaty modified like en:Module:Coordinates (example). It checks whether the coordinates are displayed in the title and then looks for coordinate location (P625) to add the different tracking categories. I'll post this at the Polish Wikipedia too. Multichill (talk) 09:38, 25 May 2014 (UTC)
Ok. The category is created and I'm now importing. You'll see an error message on the coordinates. That's already fixed in the code (see bugzilla:62105) and will probably be deployed soon(ish). Multichill (talk) 18:27, 27 May 2014 (UTC)
Thank you very much for your help on this, Multichill! Powerek38 (talk) 16:49, 30 May 2014 (UTC)

We still have the same problem on cawiki. Although Ladsgroup and others did some work on it, there are still over 15,000 articles with coordinates to be imported in ca:Categoria:Articles amb coordenades sense coordenades a Wikidata (a few of them are people with burial places, but most articles in the category are places to be imported). @Multichill:--Pere prlpz (talk) 15:56, 8 September 2014 (UTC)

I wrote and use coordinate_import.py. It's part of pywikibot. Anyone else with Pywikibot can easily do this (pwb.py coordinate_import.py -family:wikipedia -lang:ca -cat:Categoria:Articles_amb_coordenades_sense_coordenades_a_Wikidata). Maybe someone else wants to try this to get the hang of it?
I'm not sure when I get to it myself. Multichill (talk) 19:32, 8 September 2014 (UTC)
@Multichill, Pere prlpz: My bot is doing it now. I hope there aren't too many items in this category which shouldn't have coordinates... — Ayack (talk) 19:45, 8 September 2014 (UTC)
Great Ayack! bugzilla:65430 got fixed, so if you import coordinates on persons or companies, people can move them to be qualifiers of, for example, the place of death, and they won't be imported again. Multichill (talk) 19:49, 8 September 2014 (UTC)

@Ayack: ca:Categoria:Articles amb coordenades sense coordenades a Wikidata has grown again to 852 articles with coordinates in cawiki without coordinates in Wikidata, most of them cultural heritage monuments. Could you import coordinates again?--Pere prlpz (talk) 14:03, 8 October 2014 (UTC)

  Done --Pasleim (talk) 21:24, 5 March 2015 (UTC)
This section was archived on a request by: --Pasleim (talk) 21:24, 5 March 2015 (UTC)

Intel processors

Hello. Can some bot please read Intel's site http://ark.intel.com and gather information from that database for Wikidata? It needs to catch the names of the processors, create an item corresponding to each one, and complete the item with the properties socket supported (P1041), instruction set (P1068), manufacturer (P176) and number of processor cores (P1141).--MisterSanderson (talk) 15:58, 17 May 2014 (UTC)

Why don't you write them to ask that they release that data via a dump/machine readable API under a free license (or rather CC-0)? Even better, they could add it themselves here on Wikidata, to save us some work. --Nemo 17:16, 17 May 2014 (UTC)
I could not find an appropriate e-mail address at http://www.intel.com/content/www/us/en/company-overview/contact-us.html, so there is no way to contact them.--MisterSanderson (talk) 18:45, 17 May 2014 (UTC)
Try any of the first four in "Intel PR Departments" [4] (calling yourself an analyst) and [5], you'll be fine. --Nemo 15:51, 23 May 2014 (UTC)
Ok, I sent them a message.--MisterSanderson (talk) 11:37, 25 May 2014 (UTC)
The contact was closed without response.--MisterSanderson (talk) 16:50, 29 May 2014 (UTC)
So the creation of the items needs to be made by Wikidata robots...--MisterSanderson (talk) 15:32, 4 July 2014 (UTC)

Today I found the link "Export Full Specifications", which generates an XML file with the data. I think this will make it easy to gather the information with bots.--MisterSanderson (talk) 15:06, 2 October 2014 (UTC)

Here, I even extracted the data manually myself and created a table: http://hypervolution.wordpress.com/2014/10/01/soquete-lga771/. I think that now there is no excuse not to include this information on Wikidata.--MisterSanderson (talk) 18:52, 3 October 2014 (UTC)

The table looks good. However, we can't yet add values with a dimension (e.g. Hz, MB, nm), so the only information we can extract now is the number of cores (number of processor cores (P1141)). Are there already items on Wikidata about Intel processors or should a new item be created for every row in the table? --Pasleim (talk) 19:15, 3 October 2014 (UTC)
Not only number of processor cores (P1141); there are other properties too: socket supported (P1041), instruction set (P1068) and manufacturer (P176). I think that maybe there is a "release date" property too, but I could not find it. And there is subclass of (P279): all Celeron models are a subclass of the Celeron family. Some processors already have an item, but on Wikipedia it is more common to create articles about a family of processors rather than about individual models, so I think that each row should become an item.--MisterSanderson (talk) 22:39, 3 October 2014 (UTC)

New table: https://hypervolution.wordpress.com/2014/11/01/soquete-lga775/ .--MisterSanderson (talk) 23:20, 6 December 2014 (UTC)

New table: https://hypervolution.wordpress.com/2015/01/01/soquete-fclga1366/ .--MisterSanderson (talk) 17:02, 1 February 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/02/01/soquete-fclga-1567/ .--MisterSanderson (talk) 22:41, 1 February 2015 (UTC)

You can add the data by yourself using http://tools.wmflabs.org/wikidata-todo/quick_statements.php . However, it is still not possible to add columns 3-6 to Wikidata as there is no support for quantities with units. Adding socket supported (P1041) and instruction set (P1068) would be interesting, but I cannot find these data on your page. --Pasleim (talk) 13:54, 3 February 2015 (UTC)
This tool only adds statements to already existing items, but there are no items for all these processors. That's why I need a robot to create them; I don't want to create them manually, as I already did my part of the job by creating the tables from the ARK. socket supported (P1041) is the title of the posts, and instruction set (P1068) is not available for all the processors. There is too little information for processors released before 2006. --MisterSanderson (talk) 18:11, 2 March 2015 (UTC)
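Once a table exists, item creation can be prepared for the quick_statements tool with a small script along the lines of the sketch below (QuickStatements v1 syntax). The socket QID is a placeholder, the example row is only illustrative (an LGA 771 quad-core Xeon), and quantities with units are left out because they were not supported at the time:

    INTEL = 'Q248'      # Intel
    SOCKET_QID = 'Q0'   # placeholder: the item for the socket (e.g. LGA 771)
    rows = [            # (model name, number of cores) - illustrative row only
        ('Intel Xeon Processor X5482', 4),
    ]

    lines = []
    for name, cores in rows:
        lines.append('CREATE')
        lines.append('LAST\tLen\t"%s"' % name)        # English label
        lines.append('LAST\tP176\t%s' % INTEL)        # manufacturer
        lines.append('LAST\tP1041\t%s' % SOCKET_QID)  # socket supported
        lines.append('LAST\tP1141\t%d' % cores)       # number of processor cores
        # instruction set (P1068) could be added the same way where known

    print('\n'.join(lines))  # paste the output into the quick_statements tool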

New table: https://hypervolution.wordpress.com/2015/03/01/soquete-lga-1156/ .--MisterSanderson (talk) 18:11, 2 March 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/04/01/soquete-lga-1155/ .--MisterSanderson (talk) 21:14, 2 April 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/05/01/soquete-pga-478/ .--MisterSanderson (talk) 00:35, 11 May 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/06/01/ppga-423/ .--MisterSanderson (talk) 03:20, 15 June 2015 (UTC)

What's the point of saying that every month? Sjoerd de Bruin (talk) 07:29, 15 June 2015 (UTC)
I'm not saying the same thing each month, I'm announcing a new table each time.--MisterSanderson (talk) 00:19, 22 June 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/07/01/fclga-1150/ .--MisterSanderson (talk) 01:38, 12 July 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/08/01/soquete-ppga604/ .--MisterSanderson (talk) 02:27, 23 August 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/09/01/fclga-1248/ .--MisterSanderson (talk) 02:26, 31 October 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/10/01/soquete-pga988/ .--MisterSanderson (talk) 01:35, 3 November 2015 (UTC)

New table: https://hypervolution.wordpress.com/2015/11/01/ppga-370/ .--MisterSanderson (talk) 19:12, 3 November 2015 (UTC)


There appears to be no interest among users in doing this job. I'm going to close this now. Sjoerd de Bruin (talk) 12:21, 3 December 2015 (UTC)

This section was archived on a request by: Sjoerd de Bruin (talk) 12:21, 3 December 2015 (UTC)