

News & Views

So last month saw the London Perl Workshop take place. There were several talks related to testing, and plenty of interest in CPAN Testers. Mark Keating and crew were videoing many of the talks. Sadly the microphone position meant the volume is quite low in places, but they are still worth watching. You can see testing talks from myself (slides) and DrForr, as well as my lightning talk (slides). If you missed the event, there are plenty of great talks you can now catch on the Shadowcat channel.

Also during last month there have been several fixes to the Reports Builder. Following some conversations at LPW, I came away with a few ideas to improve the performance of the Feed and Builder components, so that we can get almost real-time production of reports. Over the coming months, and likely during the 2015 QA Hackathon, I aim to work more on this, and hopefully have a solution working around the time of the hackathon. Karen Etheridge also helped to spot a fault with the Admin site: if a tester hasn't allocated a contact email address, any report that has been flagged isn't allocated to them, but to the Admin user. It does mean there is a bit of extra work for me at the moment, but ultimately I need to rework the way testers are notified, particularly in the event of the system using an old email address. If you've flagged any reports and haven't seen them removed as yet, please give me a nudge and I'll approve their removal.

One person I would like to single out for making a great effort to promote testing in general is Sinan Unur, who has been writing some great blog posts: TDD is all well and good, but who's testing the tests?, You've gotta quotemeta! and Tests should not fail due to EOL differences across platforms, to mention a few. He has some great in-depth Perl posts, and all are well worth reading.

A colleague of mine, Matt Wheatley, recently had a blog post published about how he works in development. Except the post isn't so much about how he works, but more about the value of a QA department. Our QA team is awesome, and we perhaps rely on them a bit too much sometimes, but I imagine there are many awesome QA teams out there, and this is just one tribute to them all. I especially liked the reference to Bill Sempf's tweet.

We had another Metabase overflow last month, which means we've filled another bucket in AWS. David Golden has created a new one, but it does mean we need to think about moving to our newer architecture sooner rather than later. David, Neil Bowers and I have been in discussions recently, and I hope we will get some movement towards this in the coming year. There will be a lot of changes needed, both to code and routing, so it won't be a simple change. However, it will be a change that will help us grow more productively in the future.

More news coming next month. And my New Year's resolution is to get these summaries out a bit more on time!

A belated update this month. December proved a busy month preparing the new CPAN Testers Admin site. The site is still being tested, but its full launch will be coming soon. A new testers database is now in place, and the updates to the CPAN Testers Statistics site have been made much more dynamic.

A further update to the Statistics site comes thanks to a suggestion from David Golden. On the Statistics of CPAN page you can now see the current size of both CPAN and BACKPAN; current sizes are 17GB and 34GB respectively. I've also added the current size of the static reports site too, 51GB as of now. Meanwhile, the complete set of databases comprises 621GB of disk space. Thankfully we still have 1.6TB left on the hard disk, so there's plenty of room to grow.

My thanks to Andreas König over the past month, as he has been able to monitor the performance of the databases and highlight a few issues. Since moving to the new server, I'm still having to make a few tweaks to the configs. Please let me know if you notice any part of the websites that appears to have stalled. Sadly the stalls have mostly been due to the way SimpleDB returns its data. As I mentioned in my London Perl Workshop talk, I am not a fan.

Speaking of the London Perl Workshop, it was great to meet up with folk again, including Andreas König and Neil Bowers. I also had some discussions with a few people about the proposed search site, which Ben Bullock will hopefully be looking at this year. My talk, The Future of CPAN Testers, seemed to go down well, and we'll keep you posted on the progress of the ongoing projects throughout the year.

David Yingling posted an interesting blog entry, You Can't Test Everything, But At Least Test What's Important, which briefly mentions how CPAN Testers helped to kick-start a series of bugfixes for his Fetchware distribution. If CPAN Testers have helped you, please blog about it, and let me know so I can feature it in a summary.

Coming soon is the 2014 QA Hackathon. This will be happening in Lyon, France during March, so expect lots of testing-related projects and news before, during and after the event.

Until next time, happy testing.

Last month had a bit of a focus on the London Perl Workshop. Aside from my own talk, there were a few other testing or CPAN related talks that were well worth attending. Neil Bowers gave us some background to his Adopt a CPAN Module list, and some of the plans to include more metrics. He also detailed some of the changes in his blog post Including CPAN Testers results in the adoption list. It was good to catch up with Neil, and several other more recently recruited CPAN Testers.

For my own talk, 'The Future of CPAN Testers', I looked at some of the work that has been done over the past year, and at the projects planned for 2014 and beyond. Some of the projects are already in progress, and some we hope to have released early in 2014. The future is looking good for CPAN Testers, and there are lots of ideas about how to make it better. There are several tweaks and improvements planned for the existing sites, and hopefully we'll have many more suggestions from you folks over the coming year. I spoke with Andreas König at LPW and we discussed promoting areas of CPAN Testers better. I'm going to look at that over the next year, so expect a slightly different focus each month, as we encourage others to get involved, or make better use of the testing resources we have.

One interesting aspect, included in my talk, was the fact that now we have the new server and are able to serve the regular website from a separate disk, the response times have improved dramatically. As a result, I've enabled all the crawlers and spiders. I didn't announce it, as I wanted to see first how quickly the spiders would notice, and second whether we could cope. The results were impressive. Google noticed almost immediately, although some of their bots had been allowed back in for a little while already, and then Microsoft noticed, followed by plenty of others. It was noticeable when a particular crawler bot spotted they could hit the site, as can be seen from the spikes in the requests. However, what is perhaps even more impressive is that the builder simply shrugged and dealt with it, resolving the requests within a few hours.

Splitting the disks for web and backend files has had such an effect that we are now frequently less than an hour behind. The only time we aren't is when Amazon can't get their searches right. Thankfully, it looks like the code I've written to compensate for this is working well, and over the past few months the check against the tail log hasn't missed any reports. Once we move to the new Metabase servers, we should be able to rid ourselves of much of this checking code, and have even more reliability.

The biggest project over the last month for CPAN Testers has still been the Admin site. The new testers database has been created, and the scripts are all in place updating the various tables, including the leaderboard table. The code to use the new leaderboard isn't live yet, but should be going live soon. The site is close to completion, and is undergoing some last-minute testing to ensure I have everything covered. Over the Christmas period I hope to invite a few people to test the site and feed back any bugs they spot. All being well, I'd like to launch the site (finally) in the first week of January 2014.

The mailing list has been a little quiet last month, although Gabor did post to promote my talk video from last year's YAPC::Europe in Frankfurt. David Wheeler asked about the RSS feeds that are produced by the reports site. As some authors/distributions can have many thousands of reports, is there a cut-off that should be employed? Perhaps the latest 1000 reports, or only those posted in the last 6 months or in the last year? Do you use the RSS feeds (PASS and no-PASS)? If so, what would your preference be? You can post on the list or email me, and I'll review the best limit to impose in the new year, although provisionally I'll be looking at the last year's worth of reports.

The Statistics site will be getting a few extra metrics soon. David Golden suggested some metrics regarding CPAN and BACKPAN that would be interesting to see. They're on my TODO list, and I hope to address them over the next month. If you have any ideas for more metrics, whether for CPAN or CPAN Testers, that you'd like to see on the Statistics site, please let me know.

That's all for the moment, but expect to see plenty of activity in the New Year. Happy holidays, all!

The server upgrade is now complete. The final piece of the puzzle was to implement the mailers for reports. After a lot of head scratching, Robert figured out the problem was the filtering for mails coming from the new server. Many thanks to Robert and Ask for helping out here, even if Robert thinks I should be using Postfix ;) Mails are flowing now, and many thanks for everyone's patience while we got them sorted. I'm also seeing all the bouncebacks again, which means I'll need to update the preferences for those authors soon. If you've been missing your mails from cpantesters, please check the address you have set up in PAUSE. If it is old and no longer valid (including those that are hidden from public view), then the perl.org mail server is going to issue a bounceback for them. If you've changed jobs, please make sure that you update your PAUSE email if necessary too, as there is at least one person who is unwittingly sending "no longer works for this company" emails from their PAUSE address.

New SSL certificates are now in place for both the Preferences site and the forthcoming Admin site. Our last provider may have provided them for free, but the hassle in getting them really wasn't worth it. Thankfully the new ones are valid for 5 years at a very nice discount.

At the Birmingham Perl Mongers October Technical Meeting, I gave a presentation entitled 'The Future of CPAN Testers'. The talk went down well, and I received some good feedback. As such, I have submitted it for the London Perl Workshop on 30th November. The talk will cover some of the changes that have happened this year, and look at some of the projects planned for the near and far future. If you're interested in getting involved in CPAN Testers as a developer, rather than a tester, this talk will be an ideal insight.

Speaking of CPAN Testers talks, many thanks to Gabor Szabo: firstly for setting up Perl TV, and secondly for promoting the talk I gave at YAPC::Europe 2012 in Frankfurt. 'The Eco-System of CPAN Testers' was my attempt to explain how all the different processes used by CPAN Testers fit together, detailing the path from report creation to appearing on the Reports website and beyond. Hopefully my latest CPAN Testers talk will be a suitable follow-on companion to last year's talk.

And speaking of the future of CPAN Testers, the Admin site is steaming ahead again. After getting my head round all the email stores, a new script to populate the new tables is off and running. It'll take a while to get through the 36 million reports, but most of the codebase is now ready to go. There are some further changes to the Statistics site and the address maintenance scripts to do, but I'm hoping to have a release date in the next summary.

Hopefully some readers will be at the London Perl Workshop at the end of the month, so please say hello if you're a tester, or want to get more involved in CPAN Testers.

November turned out to be a very eventful and productive month. Aside from various code updates to some CPAN-Testers distributions, including porting many of the tests to Test::Database, and discovering the usefulness of Test::Trap for testing some of the scripts, we also got a handle on the missing reports. For the past few months, questions about missing reports have increased. Back in August I started to look at a more thorough catch-up. After some suggestions and ideas from David and Andreas, I also added some code to collect data in a similar way to the tail log. As a consequence of the tail parsing, the improved catch-up code and the rewritten generate code, not only have we caught up, but we now have a much more robust mechanism in place to ensure we're not missing any reports.
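
For anyone who hasn't come across it, here is a minimal sketch of the kind of check Test::Trap makes easy: capturing a script's output and exit status in one trap block. The run_report() routine is just a hypothetical stand-in for one of the scripts mentioned above, not actual CPAN Testers code.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use Test::More tests => 3;
    use Test::Trap;

    # Hypothetical stand-in for one of the scripts under test.
    sub run_report { print "processed 10 reports\n"; exit 0 }

    # trap captures STDOUT, STDERR and how the block was left,
    # turning a bare exit() into something we can assert against.
    my @result = trap { run_report() };

    is(   $trap->exit,   0,                         'script exited with status 0' );
    like( $trap->stdout, qr/processed \d+ reports/, 'progress written to STDOUT'  );
    is(   $trap->stderr, '',                        'nothing written to STDERR'   );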

During November, the builder was under heavy load to compile all the pages with new reports. With 3 processes running simultaneously, a high volume of requests was being made, but thankfully it held its own, and is managing requests very well. The oldest request is less than 2 days old at the moment, with the latest report lag being about 15 hours. As such, if you don't see one of your reports, or your distribution doesn't seem to be quite up to date, please wait a couple of days to make sure it's not just waiting to be processed.

Last month also saw some significant limits pushed a little higher. First and foremost, 1,033,056 reports were processed in a single month. Although we've had more than a million reports submitted in a single month before, it's never been this high. Chris Williams became the first tester to pass 10 million report submissions, not that we doubted it was coming soon. We also now have 6 testers who have submitted over 1 million reports each, and last month saw a notable increase in the number of testers currently producing reports. Hopefully this is a trend that will continue. November also saw the biggest variety of platforms we've ever had tested, and with 102 different platforms it's the first time we've managed to cover more than 100 in a month. With all these new highs, it perhaps isn't too surprising to hear that, at 67, we've also had the highest number of perl versions being used in testing, with 63 so far this month already. And finally the number of reports: last month we passed 25 million reports submitted (although last week we passed 26 million). As always, many thanks to everyone who has submitted reports over the past 13 years.

On the mailing list Shmuel Fomberg began a lengthy discussion about report grades that would highlight distributions with dependencies that failed to install. While I understand why many users might find this useful, for authors it would be a blight on their distributions, and may not be truly representative of the state of the distribution. For example, a distribution may fail with one version of a dependency, which then gets fixed, but the report remains tagged against the latest distribution, even though the new latest version of the dependency works. It gives a false impression of the distribution. Without extensive metadata analysis the CPAN Testers systems wouldn't know that the first report is no longer necessarily relevant. Or is it? What if the user can't install the latest version? Either way an author is tarred with a fault beyond their control, and that isn't what CPAN Testers is about. This isn't the first time this discussion has come up, and I doubt it'll be the last.

JT Smith asked about the best practice for testing network access. As yet there isn't a one-size-fits-all answer: the LWP::Online module can check for a connection, some distributions use ping (although this isn't available on all platforms), and others test whether they get an HTTP 200 response from a known URL. If you know of a more suitable method, please add it to the wiki. Kirk Kimmel asked how long it took to test the whole of CPAN, which didn't get a definitive answer, but was guessed to be several days, though less than a week. Nathan Goodman questioned why he saw no output in a PASS report for his distribution, and after some investigation, Chris Williams discovered a bug in CPANPLUS (specifically CPANPLUS::Dist::Build), proving yet again how useful CPAN Testers can be ... even with PASSing reports.
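
As a rough illustration of that last approach, here's a minimal sketch of a test file that probes a known URL before planning any network-dependent tests. The choice of www.cpan.org and the 10-second timeout are arbitrary examples for illustration, not a CPAN Testers convention.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use Test::More;
    use LWP::UserAgent;

    # Probe a known-good URL; if we can't get a successful response back,
    # skip the whole test file rather than report spurious failures.
    my $ua  = LWP::UserAgent->new( timeout => 10 );
    my $res = $ua->get('http://www.cpan.org/');

    plan skip_all => 'no network access detected' unless $res->is_success;
    plan tests    => 1;

    # ... network-dependent tests would go here ...
    is( $res->code, 200, 'known URL returned HTTP 200' );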

With all the improvements and great support from the community, I'm very pleased to see the CPAN Testers project is in a very healthy state again. I'd also like to thank everyone who has contributed to the CPAN Testers Fund, and particularly those who attended the London Perl Workshop at the end of last month and poured their spare cash into the fund buckets, including the one for the CPAN Testers Fund. Projects like CPAN Testers can only survive with volunteer contributions, both in time and cash.

Although we still can't say too much just yet, we do have some great news that the Metabase will be moving to a new home in the new year, and we'll be moving off SimpleDB, meaning we might have even better response times in future. My personal thanks to David Golden for following this up, as well as initial introductions and negotiations by Karen Pauley and Ricardo Signes. 2013 is looking very bright for CPAN Testers and long may it continue.