CPAN Testers is only made possible with the support of our sponsors.
For more information on sponsoring, please visit the CPAN Testers website.

News & Views

To begin with, a huge thank you to Nestoria, who selected CPAN Testers as their module of the month. Although we kind of break the mould of a module, we are exceptionally grateful for the donation via Gratipay, and for the promotion. As mentioned several times over the last few months, CPAN Testers are looking towards companies to help fund the servers we currently run, so donations from companies such as Nestoria are very gratefully received, and hopefully will help us to continue for many years to come. It is also nice to hear how CPAN Testers helped them too, when releasing Number-Format-SouthAsian.

We have been in discussions with a few other people and will have further announcements about sponsors and donations soon, but in the meantime, if you or your company would be willing to make a donation to CPAN Testers, you can do so via Gratipay, The CPAN Testers Fund, and/or by contacting me (Barbie), and we'll make sure you get suitable recognition for your contribution.

During December some initial thoughts were put together in preparation for the QA Hackathon 2015. The event in Berlin brings together several key people within CPAN Testers, as well as the CPAN and Test communities, and will be an ideal opportunity to look at how we process the reports. In the first instance, I want to re-engineer the way the feeds are read and how pages are built. Working with message queues and enterprise service buses recently has highlighted that we can use these technologies to speed up processing. As a consequence, I now have some areas of coding that I plan to start before the hackathon, hopefully giving some key parties a chance to review and improve the designs.

Although it strictly happened in January, I did want to pick up on an outage earlier this month. My thanks to Bytemark for helping to resurrect the server for us. Thankfully, only one index table was knocked out, and after rebuilding, everything appears to be back on track. Another task for the QA Hackathon is to look at a better way to reduce the risk to these indices. We also hope to plan a move to MongoDB for the Metabase, which will stabilise things considerably.

Sadly once again a bit delayed this month, but I do promise to improve in the coming months.

So last month saw the London Perl Workshop take place. Several talks related to testing, and there was plenty of interest in CPAN Testers. Mark Keating and crew were videoing many of the talks. Sadly the microphone position meant the volume was quite low in places, but they are still worth watching. You can see testing talks from myself (slides) and DrForr, as well as my lightning talk (slides). If you missed the event, there are plenty of great talks you can now catch on the Shadowcat channel.

Also during last month there have been several fixes to the Reports Builder. Following some conversations at LPW, I came away with a few ideas to improve the performance of the Feed and Builder components, so that we can get almost real-time production of reports. Over the coming months, and likely during the 2015 QA Hackathon, I aim to work more on this and hopefully have a working solution around the hackathon. Karen Etheridge also helped to spot a fault with the Admin site: if a tester hasn't allocated a contact email address, any report that has been flagged isn't allocated to them, but to the Admin user. It does mean there is a bit of extra work for me at the moment, but ultimately I need to rework the way testers are notified, particularly in the event of the system using an old email address. If you've flagged any reports and haven't seen them removed as yet, please give me a nudge and I'll approve their removal.

One person I would like to single out for making a great effort to promote testing in general is Sinan Unur, who has been writing some great blog posts: TDD is all well and good, but who's testing the tests?, You've gotta quotemeta! and Tests should not fail due to EOL differences across platforms, to mention a few. He has some great in-depth Perl posts, and all are well worth reading.

A colleague of mine, Matt Wheatley, recently had a blog post published about how he works in development. Except the post isn't so much about how he works, but more about the value of a QA department. Our QA team is awesome, and we perhaps rely on them a bit too much sometimes, but I imagine there are many awesome QA teams out there, and this is just one tribute to them all. I especially liked the reference to Bill Sempf's tweet.

We had another Metabase overflow last month, which means we've filled another bucket in AWS. David Golden has created a new one, but it does mean we need to think about moving to our newer architecture sooner rather than later. David, Neil Bowers and I have been in discussions recently, and I hope we will get some movement towards this in the coming year. There will be a lot of changes needed, both to code and routing, so it won't be a simple change. However, it will be a change that will help us grow more productively in the future.

More news coming next month. And my New Year's Resolution is to get these summaries out a bit more on time!

So the 2014 QA Hackathon has drawn to a close, but it is far from the end of the work, particularly for CPAN Testers. You can read several blog posts detailing many aspects of the QA and testing community work done during the hackathon, as well as several aspects of the toolchain, including PAUSE (which saw a lot of collaboration). It does get said often, and it bears repeating, that the QA Hackathons are a valuable part of the Perl community, and help to drive many projects. Without them it is likely that key elements of the infrastructure we have come to rely on (PAUSE, CPAN, BACKPAN, MetaCPAN, CPAN Testers, CPANTS) would be a long way from being the resourceful, stable and continually developing components we have come to accept. Next year's hackathon will be in Berlin, and I for one am very much looking forward to it.

Aside from the work on the database for CPAN Testers during the hackathon, I did get to make a few releases, though several elements I started during the hackathon were only completed during the following weeks. One of these, CPAN-Testers-WWW-Reports-Query-Report, now enables you to retrieve the Metabase Fact object, JSON or hash representation of a specific report. For those looking at analysing the similarities and differences between reports, this may make things a little easier, particularly when we start introducing more element facts into the report fact. Currently this only works for reports stored in the Metabase, so those early reports are not retrievable just yet.

I discussed a new command line tool to submit reports with H.Merijn "Tux" Brand, who is keen to run reports in standalone instances. I see this working similarly to Garu's cpanm-reporter, and with the common client that Garu has been working on, this could be a nice addition to the submission options. Liz Mattijsen, Tobias Leich (who easily has the coolest gravatar on CPAN) and I talked about how Perl6 distributions could be incorporated into CPAN Testers. There are some subtle differences, but there are also many common factors. It was interesting to read the Rakudo blog post about Perl 6 and CPAN, as overcoming some of the hurdles potentially facing Perl6 developers is likely to help make CPAN a better place for all of us. The currently proposed solution is a similar approach to how different namespaces are stored in the Metabase. For the time being, though, Perl6 distributions are excluded from CPAN Testers, but once we have a Perl6 smoker there is no reason not to include them. I'm not sure how soon that will be, but watch this space.

Andreas König and I once again looked at the way the reports are stored in the Metabase. Andreas had already highlighted that the updated date, which is meant to be the date a report entered the Metabase, was in actual fact the created date (the date on the tester's platform). Along with David Golden, we looked at the code used by the Metabase, but failed to find anything wrong with it. It's hopefully something we can take more time over in the future; however, the next priority for the Metabase is getting it moved onto MongoDB and away from SimpleDB. In the meantime, the Generation code, due to time constraints, had been running using the 2010 version of the Metabase/AWS interface. During the hackathon and the following weeks, I finally upgraded it to use the revamped version released in 2012. Although it is still troublesome to find all the reports, the search interface has been much improved, and we now have a much more reliable feed from the Metabase. This is also in part due to a rewrite of the internals of the Generation code to be more proactive in finding unusual gaps between reports.

I spoke with Neil Bowers during the hackathon too. Neil had suggested some ideas that I'd wanted to include in the CPAN Testers Statistics site. We discussed others during the hackathon, and although I have several notes to work from, it will be a little while yet before I can put aside some time to implement them. Neil has no end of ideas to help improve CPAN, and I hope he'll be a good sounding board for ideas to incorporate into the site in the future. On the first day of the hackathon he posted about how quickly CPAN Testers test a distribution after it has been uploaded to PAUSE. He was surprised to see some reports posted almost instantly. This is largely thanks to Andreas' work to bring the Tier 1 mirrors up to date within 10 seconds of a distribution being successfully uploaded to PAUSE. People such as Chris and Andreas use their own T1 mirrors to feed their smokers, so as a consequence the reports can appear on the Reports site within one hour of the distribution hitting PAUSE!

I had intended to launch the CPAN Testers Admin site during the hackathon, but didn't get the chance to prepare a launch statement. I did wonder how a launch on 1st April would go down, but elected to wait a little longer. So expect some news on that front very soon.

Just before the hackathon we reached 40 million report submissions. An impressive number by any standard, and it makes a compelling argument that Perl & CPAN are still being actively maintained and developed.

John Scoles made a post related to CPAN Testers, highlighting how CPAN Testers showed why a particular test needed a little more thought. Thanks to the wide variety of perls and platforms tested by the CPAN Testers, it provided John with some intriguing failures, and made for an informative look at how hashes in scalar context are handled differently between versions of Perl and platforms. Prior to the QA Hackathon Neil Bowers posted about his new idea for An author's CPAN dashboard, which he continued to develop during the hackathon too. We even discussed, together with Olaf Alders, whether this was something that should feature in MetaCPAN. For now it won't, but hopefully it can develop further and we'll see what people make of it once more APIs are fed into the dashboard.

Another month wrapped up, and another QA Hackathon over. Lots to do and plenty of fresh ideas coming. Happy testing.

The CPAN Testers Admin site has been expanding its user test base, and had some further feedback. Expect an official launch very soon.

In blog news, Randy J Ray has posted two articles relating to CPAN Testers. In his first, CPAN Testers and RPC-XML: Well, Crap, Randy highlights a problem authors often see with CPAN Testers reports, where the distribution tests fine in environments they have access to, but fails in tester environments. It's one of the reasons we have CPAN Testers: there are thousands of environments we can't reasonably test in before we make a release. In his follow-up, CPAN Testers Follow-Up: Progress Is Made, Randy explained how he was eventually able to recreate the failures, using perlbrew, and found part of the cause to be a pre-requisite module and the way in which he had used it. It often takes a bit of effort to track down problems like this, and it can be frustrating to discover it's not necessarily a problem with your code. However, it's great that Randy has documented the processes he followed and highlighted the changes he was able to make. If you've had similar experiences diagnosing problems following receipt of a tester report, it would be great to see a blog post about it.

Gabor Szabo recently looked at the number of people visiting CPAN/Perl sites. In his post, CPAN - Number of visits, he highlights that there are roughly 1 million visitors to the two main CPAN search sites each month. CPAN Testers Reports gets considerably fewer, around 6,000 visits (4,000 visitors) per month, but then we are more of a niche side of CPAN, mostly used by authors. We also push results out to users, or sites pull details from us (both search.cpan and MetaCPAN included). It'll be interesting to see whether the CPAN Testers Search site attracts more or fewer users to the site.

Neil Bowers released CPAN::Testers::Reports::Counts, which builds on the CPAN Testers Statistics site, and provides a slightly different view of the statistics of CPAN Testers. At the moment the distribution looks at the overall submissions, and doesn't drill down to specific distributions or releases, but I'm hoping to work with Neil this coming week at the QA Hackathon in Lyon to see whether we can produce something of that nature, similar to his CPAN::ReleaseHistory module, which you can read about in Neil's What's your CPAN release history? post.

Speaking of the QA Hackathon, this will be happening from 13th-16th March. The list of projects planned is quite impressive, and if we get through only half of them, it will be a great achievement. There are plenty of CPAN Testers related plans I have, and I hope to speak with several people over the 4 days to see what else we can plan for the rest of the year. One of the great things about the hackathon is the chance for many key individuals to get together and discuss projects face to face. Discussions that can take months via email take minutes in person, and we end up resolving complex problems and finding solutions simply by bouncing ideas off each other. Although the QA Hackathon attendance is sorted now, there is no reason why people cannot help remotely, and even organise satellite hackathons. Join the regular IRC channels when you can, and see what you can help with.

Lastly, I'd like to mention Gittip. There have already been several posts about it, and I think Ovid and David Golden have probably said all you need to know about why you should join. I joined after reading Ovid's post, and aside from the Perl community, I have also created the CPAN Testers community. I expect someone may want to create a CPAN Authors community too. If you are a tester and on Gittip, please add yourself to the CPAN Testers community. Even if you're not into giving or receiving, as Ovid notes, just being there helps to increase Perl's profile, and if the same can be said of the CPAN Testers' profile, so much the better. If you want to contribute to CPAN Testers more directly, there is always the CPAN Testers Fund. We are always delighted to receive financial donations, as it all helps to keep the CPAN Testers service going for many more years.

First off this month, some progress on the CPAN Testers Admin site. My thanks to David Golden and Andreas König, as they spotted some initial problems with the mailing side of things, which I hadn't noticed. It led me to spot another issue with the way additional addresses were confirmed and associated with a single account. In order to get more eyes on the site, I will be opening the net slightly wider this month. Hopefully, if there are no further problems or bugs spotted, it will get a more formal launch soon.

Neil Bowers recently contacted me about some of the APIs to the CPAN Testers, as he wanted to build on his CPAN Report 2013, with the aim of using CPAN Testers in a future report. In addition, he was also looking at ways to improve his Adoption list. As a result he has released CPAN::Testers::Reports::Counts to CPAN. The module uses the raw data files of the CPAN Testers Statistics site, which allows the module to filter based on month and/or year. The distribution made me think about how better to provide APIs to the CPAN Testers data, for anyone who wanted to play with the numbers like this. I hope to work on some of these at the QA Hackathon next month.

Speaking of the QA Hackathon, this year's event will be in Lyon. Although attendee spaces are pretty much filled now, if you have suggestions for projects people can work on, or are able to create a remote event (like the Tokyo satellite event last year), or will have time during the event to work on projects from home, please get involved. There are plenty of projects already planned to be worked on, but as many key people will be in one room together, it is an ideal opportunity to discuss specifics, or move designs on, more than we would over email. For the past 6 years the QA Hackathon has helped produce a vast variety of projects, code and ideas, and I'm sure this year will be no different.

Although not specifically related to CPAN Testers, I did want to mention something that Neil Bowers has been looking at. Over the years there have been many discussions about modules/distributions on CPAN that have little merit and/or have long since been abandoned. In some cases the code in question may only need someone new to take pity on it and breathe new life into it via the Adoption list. However, several are half-formed ideas, have been deprecated in favour of other, better distributions, or were never used as they were too badly broken. At the end of last year, Neil posted about his plans to review these older modules and distributions and see whether they are worth deleting from CPAN. They wouldn't be lost forever, as they would still reside on BACKPAN, but it would help to make CPAN a slightly leaner and even more relevant repository of code. Although caution has been urged, which Neil was already planning, the idea has generally been well received. The modules Neil has started looking at are from 1996, and have not seen a further release in the last 17 years. His first candidate, Win32::FUtils, has now been removed. You can read more in his Curating CPAN one dist at a time post.

Every so often, Gabor Szabo looks at the popularity of Perl sites compared to other languages. In a post about the changes throughout 2013, Gabor notes there have been some improvements in the Alexa rankings. While not a definitive indication of popularity, it is encouraging to see most of the sites with much improved rankings from the previous year. I was particularly amused to see cpantesters.org jump from 553,281 to 265,792, which I think is due to me allowing the various web crawlers and spiders back onto the CPAN Testers Reports site. Hopefully continued promotion of Perl and associated projects will help keep these Perl sites and others moving up the rankings. I look forward to seeing Gabor's post next year.