
News & Views

Once again, apologies for the delayed summary this month. The day job has been taking up a lot of my time for the past few months, and I've had a few other projects that have needed my attention. Hopefully next month I'll fare better.

First off for this month, I'd like to advertise the 2013 QA Hackathon, taking place in Lancaster, UK from 12th April to 14th April. This event will be the 6th QA Hackathon, and is looking to be as successful as previous years. There are plenty of projects to work on, and plenty of developers willing to pitch in and help out. Plans for CPAN Testers include preparing for the move from AWS. We have an opportunity to clean up the problems of data storage, or rather data search, by moving to a new Metabase, which David Golden has been working on. There will be plenty of other QA projects that will be worth watching too, so look out for the various blog posts and code releases during and after the event.

David Oswald ended January on the mailing list with a question about tests for Bytes::Random::Secure on 5.6.*. It highlighted the usefulness of CPAN Dependencies, providing the data to show that prerequisites were the problem, rather than the distribution itself. It also highlighted why referencing cp5.6.2an at cpxxxan.barnyard.co.uk could help out those users who need reliable installs of particular distributions for their version of Perl.

The discussion regarding NFS continued into the month, as detailed in my delayed summary from last month. Buddy Burden asked about comparing test reports. Although not quite the same, it does follow on from other similar requests about getting at data within reports. There are two problems at the moment: the first is reliably getting at the metadata of a particular report. Although I could open up an API to this data, it doesn't help with the second problem, which is the ability to compare structured data. Currently, reports are compiled mostly as a single piece of text. To be able to properly compare reports, it would be better to use structured data. At the last QA Hackathon, making this more of a reality was discussed, and some code was even released to help push it forward. However, there is still some way to go, and hopefully at the 2013 QA Hackathon we may see some more movement on this. In the meantime, the CPAN Testers Analysis site may well provide some of the comparisons you may be looking for.

Matthew Musgrove asked if he could use the same distroprefs with multiple smokers. For those unfamiliar with the distroprefs files, these are the files used by testers to help filter out distributions that are problematic when running automated tests; they are also referred to as the ignore lists. These files are typically used with individual smokers and are not shared between them. However, there are a few ways this could be handled. Some use a source code repository (sometimes on GitHub) to sync between their smokers. David Cantrell told us he uses the more traditional method of a shell script and rsync. However you share your distroprefs between smokers, bear in mind that you'll be excluding distributions that may not be a problem on some of your smokers. As such, it may be worth retrying some distributions over time to see whether old issues have been resolved.
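For anyone who hasn't seen one, a distroprefs entry is just a short YAML document. A minimal sketch that disables a troublesome distribution might look like this (the author ID and distribution name here are placeholders):

    ---
    comment: "hangs under automated testing on this box"
    match:
      distribution: "^AUTHORID/Some-Problem-Dist"
    disabled: 1

As these are plain files in the prefs directory, they sync between machines with rsync or a shared repository just as easily as any other configuration.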

David Oswald topped and tailed the month with another post asking for help understanding a FAIL for Bytes::Random::Secure. David had identified that the failure was due to Crypt::Random::Seed not being installed, even though it was explicitly listed in the prerequisites. David Cantrell pointed to the very long @INC, which some smokers use to avoid installing distributions, instead referencing the installer's build directories. This can be problematic for two reasons. Firstly, long-running smokers can blow the length of @INC, such that paths added to the end simply get ignored. Secondly, if the installer has a limit for the amount of disk space used, it may remove older distributions before running tests. To avoid this, many testers will automatically stop and restart their smokers to keep @INC at a manageable length.
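As a rough illustration of the first problem, a long-running smoker could keep an eye on the size of its @INC with something like the following sketch (the threshold is an invented number, not something any smoker actually uses):

    use strict;
    use warnings;

    # a minimal sketch: warn when @INC grows suspiciously large,
    # before entries near the end start being silently ignored
    my $inc_len = length join ':', @INC;
    warn sprintf "\@INC has %d entries (%d bytes)\n", scalar(@INC), $inc_len
        if $inc_len > 100_000;    # arbitrary threshold for illustration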

We passed 28 million reports at the beginning of February (congratulations to Peter John Acklam for his PASS submission of Compress-Zlib-Perl), and passed 29 million at the beginning of March (even more congratulations to Peter John Acklam, this time for his UNKNOWN submission of Regexp-Assemble). I suspect we may well hit 30 million around the time of the 2013 QA Hackathon, which would be rather nice to celebrate while many of us are all together. In February, Thomas Peters (WETERS) became the holder of the 6000th unique PAUSE account to upload a distribution. It's interesting to see that the number of authors currently submitting to CPAN has stayed pretty constant over the past few years, and we're also gaining around 30-40 new active authors every month.

That's all for now. Hope to see you in Lancaster next month, and I'll do my best to get the next summary out before then too!

MDK is the man! A big thank you to Mark Keating for a great post about how to donate to the CPAN Testers Fund. I have received some feedback about how to make the fund and the donation process a little more prominent on the websites, and I plan to address that in the coming months. I have also been very encouraged by some of the feedback, and hopefully we shall see more donations and sponsorship in the coming years too. Aside from asking your company if they can donate, or writing on your blog about how CPAN Testers have helped you, if you're so inclined, you might want to add a note to your README or POD to tell users how they can donate. There are plenty of other funds you might want to advertise too, so don't feel restricted to the CPAN Testers Fund. CPAN Testers can also benefit indirectly from the other funds, so it's all good.

A post I forgot to mention last month was the news that CPAN Dependencies now accommodates META.json files in distributions. With the move to use META.json files for version 2 of the Meta Specification, a number of distributions have started adding the file. Some have included both a META.json and a META.yml, but some are now only releasing distributions with a META.json. This change to CPAN Dependencies now means that these distributions can once again be included in the calculations. Thanks to Dave Cantrell for the update.
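For the curious, the CPAN::Meta module makes supporting both files straightforward; a sketch along these lines (not necessarily how CPAN Dependencies itself does it) will load whichever metadata file a distribution ships and pull out the runtime prerequisites:

    use strict;
    use warnings;
    use CPAN::Meta;

    # load_file() handles both META.json and META.yml transparently,
    # so the consumer doesn't care which one the distribution includes
    my ($file) = grep { -e } qw( META.json META.yml );
    die "no META file found\n" unless $file;
    my $meta = CPAN::Meta->load_file($file);

    # extract the runtime prerequisites as a module => version hash
    my $prereqs = $meta->effective_prereqs
                       ->requirements_for( runtime => 'requires' )
                       ->as_string_hash;

    printf "%-30s %s\n", $_, $prereqs->{$_} for sort keys %$prereqs;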

Another post I should have mentioned last month, but only became aware of after it was posted, was from Vyacheslav Matjukhin. Although his post wasn't particularly related to CPAN Testers, it does make an interesting discussion point regarding how automated testing isn't the only avenue for testing. Automated tests provided in your distribution are only part of the whole picture. Documentation is important, and we can test that it is correctly formatted, but we can't test whether it's understandable. Nor can we test whether the usage and results of the code make sense to another user. You can write tests that make sure you understand what you expect, but if a user of your distribution struggles to understand the arguments and results, it doesn't help them. Without a doubt Validation::Class has benefited from the feedback, and while the author and reviewer might not agree on all points, it was a very worthwhile exercise. It's not always easy to get such feedback, but if you can, it is well worth taking advantage of.

On the mailing list, Nigel Horne highlighted to me an issue with the Leaderboard: his numbers didn't appear to be adding up correctly. This was in part due to the trawl through the Metabase to find the missing reports Amazon had failed to send through, but also due to an issue with the way the leaderboard is generated. The code has now been rewritten, and a new database table created, to better manage all the counts. As of now, the leaderboard numbers should be much more accurate, and should increment as expected. During this process I took the opportunity to map some addresses, and got through 105 mappings, of which 44 were brand new names to the system.

Next month sees YAPC::Europe 2012 taking place in Frankfurt. There are a few testing related talks planned, but I'll cover those in next month's summary. The deadline for talk submissions is 15th July, so there are still a few days to complete your submissions. Hope to see you there.

Posted by Barbie
on 12th March 2012

Due to the sheer volume of CPAN Testers reports, David Cantrell has had to move the CPAN Dependencies site to a new server. In addition, the cpXXXan service that David provides has also moved.

Both CPAN and CPAN Testers have grown at such a rate that providing these services on virtual hosts is no longer manageable. As such, both have now moved to their own dedicated servers. While there should be no visible difference to the sites, and all domains should resolve to the new servers, please let David know if you experience any problems at all.

Thanks as always to David for providing these services to the Perl community.

Posted by Barbie
on 8th February 2010

Not long after publishing the summary for last month, we had a couple of posts detailing progress with parts of the CPAN Testers eco-system.

The first, from David Golden, was his 'CPAN Testers 2.0 end-January update', detailing further work on the Metabase.

The second set of progress updates was from David Cantrell, with 'More CPANdeps jibber-jabber' and 'Even more CPANdeps stuff' explaining some of the changes within the CPAN Testers Dependencies site.

Last month marked quite a momentous occasion, as David Cantrell submitted the one millionth report. I think Chris was after that accolade, but he'll just have to make do with being the highest test report submitter :) Thanks to the increased interest in bulk testing from our current top testers, we reached this point much sooner than I would have expected a few years ago. Thanks to these guys, a large portion of CPAN has already been tested on 5.10, and we now have many reports across nearly all versions of Perl 5, certainly those known to be available in production environments.

Slaven Rezić's CPAN Testers Matrix is now accessible from each CPAN distribution page, as is David Cantrell's CPAN Dependencies site, although with the expected increase in traffic, Dave is currently moving the site to a new box. Using the database generated for the stats site, Andy Armstrong has created a slightly different presentation of the success/failure graph that's on the stats site. I currently create all the graphs statically, which is fine when I'm updating once a month. However, if I get the site updating more frequently, it would be useful to let you click on the report you would like to see and generate a more professional-looking graph or chart on the fly. I may well investigate manyeyes in more depth later.

Interest in CPAN Testing is increasing, even if some authors still appear to be completely unaware of who and what CPAN Testers are. One thing to note for anyone confused by these reports: they are generated by computer, and often unattended. As such, instructions in the README file don't get read. If you have a prerequisite that is outside of the MakeMaker or Module::Build prerequisite framework, try to detect it using Devel::CheckLib. The Notes For CPAN Authors wiki page was specifically written for authors who need pointers to achieve a clean install. It should be noted that while CPAN Testers aren't necessarily average users, they are trying to recreate environments that a brand new user would likely experience. The reports aren't there to harangue or berate you; they are trying to highlight problems that occurred during the build and testing processes. In some cases this may highlight problems in the automated test environment, which testers try very hard to resolve as soon as possible, but in the bigger picture we are making CPAN a more reliable and enviable code repository.
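As an illustration, a Makefile.PL can probe for a C library before the build begins. In this sketch the library name 'foo' and the module name are placeholders; exiting early like this generally means an automated tester grades the run NA or UNKNOWN rather than FAIL:

    use strict;
    use warnings;
    use ExtUtils::MakeMaker;
    use Devel::CheckLib;

    # bail out cleanly if the C library isn't available, rather than
    # letting the build die later with a confusing compiler error
    check_lib_or_exit( lib => 'foo', header => 'foo.h' );   # 'foo' is a placeholder

    WriteMakefile(
        NAME         => 'Your::Module',            # placeholder module name
        VERSION_FROM => 'lib/Your/Module.pm',
    );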

17 more address mappings, including 8 new testers. Welcome aboard folks.