

News & Views

Posted by Barbie
on 21st June 2011

Recently Gabor Szabo has been tweeting about his climb up the CPAN Testers Leaderboard. It was something that Damian Learns Perl also picked up on in his recent CPAN Testers post, and it is exactly the kind of healthy competition I had in mind to encourage people to become CPAN Testers. However, is there anything more we can do, not only to attract testers in the first place, but also to keep them submitting reports? Do we need to raise the profile of CPAN Testers?

On the CPAN Testers Discussion mailing list, Gabor has been suggesting some ways to promote CPAN Testers and get more regular testers involved. While I have no problem with encouraging anyone to get more involved with CPAN Testers, I still want occasional and even one-time testers to feel welcome to submit as many reports as they are comfortable with, which some have seen as a problem. Here is why I believe it's not a problem.

Firstly, occasional and one-time testers often only submit reports when they install modules for their current working or development environment, which may well be quite infrequent. This does not mean that their contributions are any less significant though. They are testing on real machines in real-life environments, and can often run into problems that automated testers may not experience. This is particularly true for distributions which rely on libraries not installed on automated smoker environments.

Secondly, the statistics only give a picture based on the user profile. The leaderboard doesn't make any distinction between individual testers and automated smoker networks. As a consequence, someone like Chris Williams, who has several dedicated automated smoker environments, distorts the weighting. The trend would likely be a little more even if we were able to calculate the number of reports from each unique smoker environment. Likewise, many of the testers who appear to have submitted only one report have in fact submitted others. Although I have put in a lot of effort to consolidate multiple email addresses into a single profile, there are many email addresses which are no longer used, and which I am unable to pair up with an existing profile.

Thirdly, the leaderboard as it stands covers the whole of CPAN Testers history. It took nearly 10 years to reach 1 million reports, while now we have over 500,000 reports submitted each month. The distribution of tester contributions over the last few years is very different from what it was in the early years.

However, while it's not a problem, it would be much nicer to be able to encourage more of these low-volume testers to submit more reports more frequently, as their contributions are valuable. This primarily means trying to understand why some testers have only posted a few reports and then chosen not to continue. Is there some reason for this that we can resolve? Has the change to HTTP submissions confused people, so that they've not continued?

I'd also like to make it as easy as possible to install the required software, both for infrequent testing as well as for more automated environments. To a large degree we already have this covered, with various instructions on the Wiki, although I'd be delighted if we could encourage contributions to explain more scenarios.

Gabor also asked how we can encourage companies to run smoke testing on their machines, particularly during any out-of-hours downtime. I'm not sure of the best way to do this, as many companies have usage policies that prevent the use of machines for this kind of purpose. For those that do want to contribute and want to support CPAN Testers, what is the best way we can help them?

There has also been the suggestion to create logos and banners to promote CPAN Testers, which can be used both by companies and individuals on their websites and blogs. As a starting point we have the smoking onion, which I would like to keep as a brand for CPAN Testers, but if you have graphic skills, I would be very interested for anyone to create some logos and banners that build on this. I'd like to be able to create a page with a variety of images which people can then use to link back to us and show their support for CPAN Testers.

So if you have any ideas for promoting CPAN Testers, or have 133t graphics skills to create some images, please get in touch via the CPAN Testers Discussion mailing list and share your ideas with the CPAN Testers community.

Posted by Barbie
on 20th June 2011

Following a couple of issues raised recently, I have updated the CPAN Testers Wiki and the CPAN Testers Blog sites.

Primarily the updates for the Wiki site concentrated on user management, and the ability to register as a new user. There is still a problem with sending UTF8 mail, but mail itself should be working for anyone who has forgotten their password. The most obvious change though is that we have now changed to use Gravatars for the images used in user profiles. This falls in line with many other Perl & CPAN sites, which have standardised on using the Gravatar service.

For the Blog site, although user management has been improved, the updates were mostly for image management. For regular readers, it's unlikely you will notice much change, but pages should render a little quicker now.

Both sites have also seen some upgrades and fixes with regards to W3C Guidelines compliance, and the introduction of the "EU Cookie Directive". You can now see the Terms & Conditions applicable to each site via a link at the bottom of each page. A full Privacy Policy has now been created that can be used for any site, and although CPAN Testers is an entity in name only, it might provide some reassurance that we don't intend to misuse any data we hold or use it for purposes it was never meant for. We now have to list all the cookies we use, so that any users who wish to block them can do so. I won't add to the debate and confusion regarding this here, but suffice to say it will mostly be business as usual.

Some of the updates for these two sites are part of a larger initiative to meet as high a level as possible of the WAI WCAG v2.0 Guidelines. My current aim is to attain at least Level A compliance, with as many pages as possible attaining Level AA. Further updates are planned for these two sites, as well as other CPAN Testers sites, as improvements are applied.

Posted by Barbie
on 13th June 2011

It's taken a little while, but the CPAN Testers Statistics site is now back online. Due to an unfortunate corruption of the cached data file, it has taken considerable time to rebuild.

With over 12 million reports to get through, the memory consumption of trying to record everything all at once is quite demanding, which is why it has been rebuilt a chunk at a time. At some point very soon, I will be revisiting the way the data is collated and stored in memory, and how the statistics are calculated and processed, to make it all more efficient. I also plan to make it much easier to process the sections, so that processing can continue from where it left off if any part fails in a single run.
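The chunk-at-a-time approach with resumable sections can be sketched as follows. This is only a minimal illustration in Python, not the actual Perl tooling behind the Statistics site; the checkpoint file name, chunk size, and report record format are all hypothetical:

```python
import json
import os
from collections import Counter

CHECKPOINT = "stats_checkpoint.json"  # hypothetical checkpoint file
CHUNK_SIZE = 100_000                  # records processed per chunk

def load_checkpoint():
    """Resume from the last completed chunk, if a checkpoint exists."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as fh:
            state = json.load(fh)
        return state["offset"], Counter(state["counts"])
    return 0, Counter()

def save_checkpoint(offset, counts):
    """Persist progress so a failed run can continue where it left off."""
    with open(CHECKPOINT, "w") as fh:
        json.dump({"offset": offset, "counts": dict(counts)}, fh)

def tally_reports(reports):
    """Aggregate report grades one chunk at a time to bound memory use."""
    offset, counts = load_checkpoint()
    for start in range(offset, len(reports), CHUNK_SIZE):
        chunk = reports[start:start + CHUNK_SIZE]
        counts.update(r["grade"] for r in chunk)
        save_checkpoint(start + len(chunk), counts)
    return counts
```

Because a checkpoint is written after each chunk, only one chunk needs to be in memory at a time, and an interrupted run picks up from the last saved offset rather than starting the whole multi-million-report pass again.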

One thing to note, and something that may be more noticeable with the launch of the Admin site, is that the report counter has been adjusted. I recently reported that Chris had posted the 12 millionth report, which was true at the time. However, it now reads that Slaven holds that honour. The reason is the bogus reports that are now hidden from the statistics. This will likely happen from time to time, although hopefully not too frequently, as we have got a lot better at catching bad smoke runs and stopping them before they get too carried away.

I'm always looking for suggestions for improvements and additions to the statistics we currently present, and we've had one from Gabor, which I hope to work into the site soon. If you have any ideas for metrics you'd like to see, please let me know and I'll see what I can do.

May has been a bit of a quiet month, although a few things have been happening in the background. The continuing work of packaging the websites is ongoing, and the plans for the Admin site are coming together. Having said that, we did pass the 12 million reports mark in the middle of May too :)

We had a few reports that the testers themselves acknowledged were inaccurate. Although the new Admin site will eventually allow testers and authors to tag reports, the site isn't quite ready for release yet. As such, I now have a script, which uses the guts of the forthcoming Admin site, to hide broken reports. The reports are not deleted, as they may be of use in the future, but hidden from the statistics calculations. If your smoker does send a selection of bad reports, please let me know, and I'll see if I can hide them. As mentioned, the Admin site will eventually take on this role more officially.

Unfortunately we had a bit of a problem with backups recently, which has resulted in one of the files used to cache the statistics becoming corrupted. The cache file is currently being rebuilt, and hopefully the Statistics site will be back online within the next few days.

The backups resulted in the hard disc filling up, as they now take up a notable amount of space. As a result, the offline backup process will be taking place a little more frequently. I'm currently reviewing the server options available to us, with a view to increasing processing power and disk space to allow us to expand over the next few years. Our current server plan still has several months to run, so we have time to migrate to a new machine. We'll warn you before any switch-over, although as long as applications are using the URLs, there should be no noticeable change.

This month David Golden hopes to review the current Metabase. We've had problems with Amazon, not just because of the recent outage; as David notes, with the number of reports submitted into the CPAN Testers database doubling in the last year, we need to look at ways to keep the processing and testing stable. For the Metabase this will involve investigating the best storage mechanism for our needs.

At the Birmingham Perl Mongers May technical meeting, another hackathon session took place. This time they looked at integrating their Devel-Platform-Info distribution into the CPAN Testers framework. The first fruits are the release of CPAN-Testers-Fact-PlatformInfo, with patches to clients and transports in progress. The work will continue in the June hackathon session, with plans to complete a write up, so that anyone wishing to add more metadata to reports can follow a guide. If your user group plans to have a hacking session on some of the CPAN Testers software, please let us know and we'll feature it in a future summary.

Conference season is fast approaching, and this month YAPC::NA 2011 takes place in Asheville, NC in the US, June 27-29. There is no specific CPAN Testers talk, but there are talks on uploading CPAN modules and testing in general. If you are planning to give a CPAN Testers talk at a conference, workshop or local technical user group, please let us know.

That's all for this month. Happy testing.
