CPAN Testers is only made possible with the support of our sponsors.
For more information on sponsoring, please visit the CPAN Testers website.

News & Views

A very late summary, despite one of my aims for this year being to get better at posting these on time :( As a consequence, if anyone is willing to help out with posting summaries, please get in touch.

For this post, I want to concentrate on promotion of CPAN Testers. Mark Keating wrote a very eloquent post about how testers themselves can promote CPAN Testers, Smoke me an Onion baby. A few years ago, BinGOs and I toyed with the idea of getting a T-shirt sorted to give out at YAPCs, promoting CPAN Testers. Unfortunately, we couldn't think of a suitable tag line and we're not graphic artists, so nothing ever came of it. In Mark's post he came up with some wonderful graphics to be used as badges for testers to promote their own involvement. If you like Mark's creations and wish to use them on your website, please get in touch and he'll provide copies for you to use. In the meantime, watch out for stickers and badges at various future events.

Over the past few months, a few companies have expressed a desire to support CPAN Testers, both via monetary funding and via spare capacity on servers. We are still in discussion with these companies, so hopefully we will have some new sponsors in the near future. We are always willing to have more companies support us; although individual funding is very much appreciated, we would like to see more corporate sponsorship to help raise our profile. If your company uses Perl, makes use of CPAN Testers either directly or indirectly, and would be willing to help support the long-term future of CPAN Testers, please get in touch.

Individual funding, as mentioned, is always very much appreciated, and if you would like to contribute in any way, you can make donations via the CPAN Testers Fund or Gratipay. All proceeds are managed for us by the Enlightened Perl Organisation, a not-for-profit organisation.

The push for promotion, and the requests for sponsorship, have come about because the Birmingham Perl Mongers' funding of the CPAN Testers servers comes to an end in September 2015, when we'll need to find other sources of funding for the current server. On top of this we have the Metabase server, which also requires funding. Sadly, dedicated servers don't come cheap these days; it costs roughly £4200 (US$6280 or €5800) per year. Over the next six months we'll be doing as much as we can to raise the funding, but if you can help promote this to your employers that would be wonderful.

If you have any ideas to help promote CPAN Testers in other ways, please let me know and we'll see what we can do.

September featured two hot-topic discussions regarding CPAN Testers. 

The first was started by brian d foy, in a post about the effects of the development version of Test::More, which, due to ongoing comment issues, was followed up separately. I see both sides of the argument, but I tend to side with brian. Testers need to take some responsibility when they are testing, particularly when it means testing development releases, which may be experimental. However, it's tricky, as there is no guaranteed way of spotting the dependency that is at fault. It could be the development release, but equally it could be another regular release. In fact, this question has cropped up before where no development release was involved. To understand what is at fault requires human intervention. Programmatically we could try to guess, but we would likely get it wrong just as often as we do now. As such, I am looking at changing the Admin site to allow authors and testers to reallocate reports to another distribution.

During the discussion ribasushi suggested expanding the report grades to handle these sorts of scenarios. However, this doesn't get away from the fact that reports are attributed to the wrong distribution, and it confuses the issue for users and authors. The current system is very well defined. It may be wrong on occasion, but that's something we can look at fixing. Expanding the grades is more likely to detract from the value of CPAN Testers. If we can improve the current analysis, that is a much better result for all.

The second discussion, which started from a thread in August, was prompted by Philippe 'BooK' Bruhat, around making machines available for CPAN Testers to test distributions on less popular OSes, so that we can ensure a decent amount of coverage for them. The problem is that the CPAN Testers Fund doesn't currently have the funds available to support this. It's a great idea, and possibly something we can think about for the future, should funds ever be sufficient, but not right now.

If you have some free cycles on a less popular OS and are willing to run a CPAN Testers client, please do so. As mentioned in the last post, there are several OSes we would love to see more test reports for. The OSes themselves are still widely used, and if you have any problems setting up a client, please join the mailing list and ask for help and advice.
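If you're not sure where to start, CPAN::Reporter is one common way to submit reports from the standard CPAN shell. As a rough sketch only, a minimal ~/.cpanreporter/config.ini might look something like the following; the email address and file path are placeholders, and you should check the CPAN::Reporter::Config documentation and the CPAN Testers wiki for the current options and Metabase setup:

```ini
; ~/.cpanreporter/config.ini -- minimal CPAN::Reporter sketch (placeholder values)
edit_report = default:no
send_report = default:yes

; replace with your own address
email_from = your.name@example.com

; send reports to the Metabase; the id_file is the profile you create
; beforehand (see the CPAN Testers wiki for how to generate it)
transport = Metabase uri https://metabase.cpantesters.org/api/v1/ id_file metabase_id.json
```

With that in place, running installs via the cpan client should generate and submit reports automatically.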

It was quite timely for BooK to mention the CPAN Testers Fund, as this year is the last year Birmingham Perl Mongers can fund the CPAN Testers server. After 8 years, Birmingham Perl Mongers no longer have funds to cover the yearly costs of the reports server, and we will be completely reliant on the CPAN Testers Fund, or gracious support from hosting companies, to continue the service after September 2015.

I would like to take this moment to say thank you to the Birmingham Perl Mongers for allowing me to support CPAN Testers for this long with their donations. Without their support, CPAN Testers would not have grown into the dynamic resource it is today. Thank you.

However, what happens next? As it currently stands, CPAN Testers will not be able to pay for the server next year. We could try to raise funds via crowdfunding sources, such as Gratipay, but this is very time consuming, and realistically, in the longer term, we need the support of major companies. Does your company use Perl, CPAN and/or CPAN Testers? Would they be willing to fund, in part or in full, the servers used to run the systems? If so, please get in touch. If you are a hosting company and would be willing to provide a server of our current spec or better, please get in touch and I can give you the specs for the current server and/or the current costs to maintain the existing servers.

It would be a great shame to lose CPAN Testers, so if your company could help, please speak to your marketing department, owners or directors to see whether they are willing to donate. As a thank you, we provide sponsors' logos and links back to their websites on all the primary CPAN Testers sites, and they will get thank-yous in press releases and on social media. Please get in touch if you have any questions or suggestions, and hopefully we can keep CPAN Testers alive for many more years to come.

Apologies for posting this late; my laptop has sadly blown up, and I've had to rewrite my original post. More news next month.

Happy CPAN Day. If you weren't aware, last month Neil Bowers took on the promotion of the first CPAN Day, a day to celebrate the birth of CPAN. Although the concept of CPAN started before 16th August 1995, that was the day the first true upload occurred. As such, Neil spent several days writing blog posts to help identify ways to improve your distributions. While the module code within might be great, it is just as important to write clear documentation and make the most of your package files, particularly the META files, so that PAUSE, MetaCPAN and others can link to all the right places.
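As an illustration of the sort of thing Neil was getting at, the resources section of a v2 META.json is where sites like MetaCPAN pick up your repository and bug-tracker links. The distribution name and URLs below are purely placeholders; see the CPAN::Meta::Spec documentation for the full details:

```json
{
   "resources" : {
      "homepage"   : "https://example.org/my-dist",
      "bugtracker" : { "web" : "https://github.com/example/My-Dist/issues" },
      "repository" : {
         "type" : "git",
         "url"  : "https://github.com/example/My-Dist.git",
         "web"  : "https://github.com/example/My-Dist"
      }
   }
}
```

Most build tools will generate this for you from your Makefile.PL or dist.ini, so it's usually just a few extra lines in your build configuration.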

The day itself was a success in terms of numbers. Prior to the event, working with Neil, I created a new CPAN Uploads page on the Statistics site, which helped everyone monitor the day's events in almost real time. It's proved so successful that the page has continued. With significant thanks to INGY, we blew away the previous record of 150 releases in a single day with 775. We also had 107 different authors submitting releases, with 41 brand new distributions finding their way on to CPAN. It was quite an achievement, and one that might take a while to beat, but as Neil noted after the event, perhaps next year we should think about something other than numbers. I'm not sure what that might be, but hopefully we'll think of something suitable to celebrate 20 years of CPAN!

During CPAN Day, unsurprisingly, there were quite a number of CPAN Testers reports (37764 in one day, which, although 10k higher than the average, wasn't the highest we've seen in a day; that was 47929 back in January). One aspect I wanted to look at was how many different OSes were being tested. The subject also cropped up on the mailing list, thanks to Dean Hamstead, as to which OSes are undertested and which we would really love to see more reports for. Based on the testing done over the last few months, the following popular OSes could use a bit more attention. So if you have a spare box and are looking to get involved with CPAN Testers, one of the following would be very gratefully appreciated:

  • Solaris
  • Irix
  • Windows (Cygwin)
  • NetBSD
  • FreeBSD
  • OpenBSD

Solaris is an often requested platform, but as most earlier versions are only supported on SPARC hardware, it is a tricky one to test with. However, later versions do support x86 systems. Irix, although no longer supported by Silicon Graphics, is still used, so if you're able to set up a test environment with Irix, there are several authors who would be very grateful. The other four are a little more popular and easier to install in a test environment, but they are just as much in need of testing as the higher profile platforms.

The thread started by Dean also raised a point from BooK, about whether the CPAN Testers Fund could be used to rent boxes. Unfortunately, the Fund itself isn't substantial enough to do this, as funding the existing main servers is already quite expensive. If, as Mark Keating has suggested on many occasions, several major companies using Perl were to contribute a regular, even small, amount to the Fund each month, we could think about renting some servers or VMs to regularly test otherwise less-tested systems. We already have a Gittip/Gratipay account, and there has been the suggestion of creating an Amazon Smile account too. These are all great ways for individuals to contribute, but realistically, for us to grow, it does need more help from bigger companies. If your company can help financially in any way, please suggest the CPAN Testers Fund to them.

Gabor Szabo asked why the Perl special variables list also includes a reference to the MSWin32 variables. The reason is simple: it's how David Golden wrote it :) Longer term this will be a little easier to manage, and hopefully a little clearer, once Garu implements the common reporting client. Consolidating the reports with consistent data about the platform, the perl and the installed modules is a goal, so that test reports can be better analysed, and so we can more easily see differences between installers as well as other environment criteria.

On the London.pm mailing list, Tom Hukins took the time to explain the differences between CPANTS and CPAN Testers. These two projects are frequently thought of as the same thing, and I'm not sure how to make it any clearer that they are very different projects. Paul Johnson's Devel Coverage Reports beta site is gaining fans, and I hope that it is not confused with CPANTS and CPAN Testers either, as again it's a very different project: one that complements the other two, but still has a different goal.

I shall be attending various HackDay, Workshop and Perl Monger tech meet events in the coming months, and hope to promote CPAN Testers in some fashion. If you're planning talks involving testing at future events, please get in touch and I'll promote them here too. Happy testing.

Posted by Barbie
on 26th May 2014

If you haven't already been aware of Gittip, it started as a way to tip your hat to those whose work you admire, with the idea of buying someone a beer to say thank you for their efforts. It's also become another way to generate funds for Open Source projects.

It was suggested a while ago that CPAN Testers should get on the site with a CPAN Testers Team identity, to help add funds to the CPAN Testers Fund. While our main funding is through donations to the CPAN Testers Fund, managed on our behalf by the Enlightened Perl Organisation, having other ways for people to contribute, no matter how big or small, is certainly something worth considering. At the moment Gittip is proving to be a popular way to say thank you, so it makes sense to have a profile.

One of the problems that first faced us was getting a suitable account from one of the code/social sites that Gittip uses to enable anyone to create an account. We overcame that recently, when Daisuke Murase kindly gave us the keys to the @cpantesters account on Twitter. Although many projects seem to just have communities, the CPAN Testers Team account will hopefully give a more direct benefit to the CPAN Testers project, rather than to individual members of the CPAN Testers community. However, if you want to thank specific testers, that's cool too.

David Golden and myself are the CPAN Testers Team account managers, so we'll make sure all contributions go into the CPAN Testers Fund. In the meantime, feel free to add yourself to the CPAN Testers Community too.

The CPAN Testers Admin site has been expanding its user test base, and had some further feedback. Expect an official launch very soon.

In blog news, Randy J Ray has posted two articles relating to CPAN Testers. In his first, CPAN Testers and RPC-XML: Well, Crap, Randy highlights a problem authors often see with CPAN Testers reports, where the distribution tests fine in environments they have access to, but fails in tester environments. It's one of the reasons why we have CPAN Testers, because there are thousands of environments we can't reasonably test before we make a release. In his follow up, CPAN Testers Follow-Up: Progress Is Made, Randy explained how he was able to eventually recreate the failures, using perlbrew, and found part of the cause to be a pre-requisite module and the way in which he had used it. It often takes a bit of effort to track down problems like this, and can be frustrating to discover it's not necessarily a problem with your code. However, it's great that Randy has documented the processes he followed and highlighted the changes he was able to make. If you've had similar experiences diagnosing problems following receipt of a tester report, it would be great to see a blog post about it.

Gabor Szabo recently looked at the number of people visiting CPAN/Perl sites. In his post, CPAN - Number of visits, he highlights that there are roughly 1 million visitors to the two main CPAN search sites each month. CPAN Testers Reports gets considerably fewer, around 6,000 visits (4,000 visitors) per month, but then we are more of a niche side of CPAN, mostly used by authors. We also push results out to users, and sites pull details from us (search.cpan and MetaCPAN included). It'll be interesting to see whether the CPAN Testers Search site attracts more or fewer users.

Neil Bowers released CPAN::Testers::Reports::Counts, which builds on the CPAN Testers Statistics site, and provides a slightly different view of the statistics of CPAN Testers. At the moment the distribution looks at the overall submissions, and doesn't drill down to specific distributions or releases, but I'm hoping to work with Neil this coming week at the QA Hackathon in Lyon to see whether we can produce something of that nature, similar to his CPAN::ReleaseHistory module, which you can read about in Neil's What's your CPAN release history? post.

Speaking of the QA Hackathon, this will be happening from 13th-16th March. The list of projects planned is quite impressive and if we get through only half of them, it will be a great achievement. There are plenty of CPAN Testers related plans I have, and hope to speak with several people over the 4 days to see what else we can plan for the rest of the year. One of the great things about the hackathon is a chance for many key individuals to get together and discuss projects face to face. Discussions that can take months via email, take minutes in person and we end up resolving complex problems and finding solutions simply by bouncing ideas off each other. Although the QA Hackathon attendance is sorted now, there is no reason why people cannot help remotely, and even organise satellite hackathons. Join the regular IRC channels when you can, and see what you can help with.

Lastly, I'd like to mention Gittip. There have already been several posts about it, and I think Ovid and David Golden have probably said all you need to know about why you should join. I joined after reading Ovid's post, and aside from the Perl community, I have also created the CPAN Testers community. I expect someone may want to create a CPAN Authors community too. If you are a tester and on Gittip, please add yourself to the CPAN Testers community. Even if you're not into giving or receiving, as Ovid notes, just being there helps to increase Perl's profile, and if the same can be said of CPAN Testers' profile, so much the better. If you want to contribute to CPAN Testers more directly, there is always the CPAN Testers Fund. We are always delighted to receive financial donations, as it all helps to keep the CPAN Testers service going for many more years.