CPAN Testers is only made possible with the support of our sponsors.
For more information on sponsoring, please visit the CPAN Testers website.

Upgrade Notice

The CPAN Testers Blog site has been upgraded since you last accessed the site. Please press the F5 key or CTRL-R to refresh your browser cache, so that you use the latest JavaScript and CSS files.

News & Views

CPAN Testers Reports had a quiet month in April. Much of the work to upgrade the Generator code has brought the Metabase feed back into line, and the site builder has been coping much better. The Generator changes included an upgrade to the latest Metabase API, which has improved throughput, and the gaps between search times have been reduced, meaning that the blocks we get back from SimpleDB are much more likely to be the ones we want. The logic to track back and find gaps has also been improved. All in all, it's meant we haven't been missing reports.

There were two discussions on the mailing list that highlighted two problems CPAN Testers currently has. In the first, Sisyphus asked about a problem he'd spotted with failure reports being created where prerequisites were unmet. The first cause for confusion is when two versions of the same distribution (including the version number) are uploaded to CPAN. This can only happen when the authors are different. Sisyphus, who maintains the official version of the module, had uploaded a development version, unaware that an unauthorised version had been uploaded. Technically the unauthorised version shouldn't really be tested, but there isn't anything in any of the CPAN Testers clients to prevent this, or at least issue a warning. As a consequence it was initially unclear which distribution the reports related to. As it turned out they were for the official version, which helped to further understand the problem originally raised, which was eventually resolved by uploading a fixed META.yml file.

In the second discussion, Karen Etheridge asked about unmet 'configure_requires' prerequisites. The problem was due to old installer tools being used, which raised the interesting question of what should be an acceptable minimum version of the smoker tools used to smoke CPAN. This is a complex question to answer, as we should consider not only the perl version but also the platform version. I have suggested we capture this on the CPAN Testers Wiki, so we can improve it over time. Ron Savage made an interesting suggestion of a new meta key of 'smoking_requires'. Although a nice idea, it would mean a significant change to all the installers, and I suspect that is unlikely to happen soon, if at all, as it would still require upgrading the installers to a minimum version first. Sometimes it can be useful to see how older installers cope with installing a recent upload, and in many cases there may be no problem at all. However, having a set of suggested minimum requirements for smoking would at least highlight what versions to upgrade to, should smokers hit problems where they frequently submit erroneous reports. It will be something we will be looking at more in the future, but if anyone has some suggestions as a starting point, please feel free to start a wiki page or two.
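As a purely hypothetical sketch of Ron Savage's idea, such a key might be injected from a Makefile.PL via ExtUtils::MakeMaker's standard META_ADD parameter; note that 'smoking_requires' is not part of the META spec, and the module names and versions below are illustrative guesses only:

use ExtUtils::MakeMaker;

WriteMakefile(
    NAME     => 'My::Module',            # placeholder distribution
    META_ADD => {
        smoking_requires => {            # hypothetical key, unsupported today
            'CPAN::Reporter' => '1.20',  # illustrative versions only
            'Test::Harness'  => '3.25',
        },
    },
);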

This question also highlighted why it can be very useful to look at the CPAN Testers Analysis site, run by Andreas, which analyses a number of reports to see where the differences and similarities lie. Looking at a single report may not be enough to pin-point the fault, but understanding a broader range of reports, particularly across perls and platforms, may give a better picture to help dig deeper.

A. Sinan Unur posted about a slight change to the tests he'd made for Crypt::SSLeay. It highlighted that potentially some smokers had machines running smoke tests that could be vulnerable to the Heartbleed Bug. Although it was acknowledged that the tests were incomplete, it did reiterate the need for smokers to ensure their machines were appropriately upgraded if they were using OpenSSL. As a side note, all the CPAN Testers servers were upgraded within a day or two of the original bug being announced.

Bruce from NebCon Inc posted about a situation I'd already flagged up with Chris Williams. One of Chris' smokers was reporting a strange fault that produced 'Argument "2.07_02" isn't numeric in subroutine entry'. It was an odd one because the error was related to File::Temp, which requested a minimum version of File::Path, but one that wasn't a developer version, '2.06'. David Golden suspected the problem may lie in the way the version is evaluated in UNIVERSAL::VERSION, which could be cured by installing a more recent version.pm. A newer version of File::Path is available, so in the short term it may be just as simple to require that.
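To illustrate the class of warning involved, here is a minimal sketch (with a made-up package name) of how an underscored developer version string can trip a numeric comparison on an older perl or version.pm:

use warnings;

package Faux::Path;
our $VERSION = '2.07_02';    # a developer release version string

package main;
# requesting a minimum version forces a numeric evaluation, which on older
# toolchains may warn: Argument "2.07_02" isn't numeric in subroutine entry
Faux::Path->VERSION('2.06');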

David Golden has asked everyone to upgrade CPAN::Reporter, if you smoke with CPAN+CPAN::Reporter. CPAN 2.05 now reports optional prerequisites, and David has updated CPAN::Reporter to be able to deal with these. This in turn raised a note from Reini Urban, who had noticed differences between Task and Bundle installs. His particular use case does show a difference in the way installers handle Bundles, as opposed to Tasks, but for the particular Task::CPAN::Reporter suggested for this upgrade, it shouldn't be an issue.
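If you'd rather upgrade from the command line than the cpan shell, a one-liner along these lines should do it, using CPAN.pm's exported install() function:

perl -MCPAN -e 'install "Task::CPAN::Reporter"'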

We do have some more news in the pipeline, and a few things to announce before the end of May, so look out for those.

Posted by Barbie
on 8th May 2014

As part of the QA Hackathon in Lyon, one of my aims was to produce an API to the CPAN Testers reports. More specifically, provide the ability to retrieve a copy of a report as it was stored in the Metabase.

Although reports are available on the Reports site, they are a reconstruction of the report, as if it were an email. This was how reports were originally sent, but since 2010 the Metabase receives each report as a JSON object, which it then saves with further references to other facts. When the Generator retrieves the JSON object, it stores a copy before parsing the contents for the cpanstats database.

The reconstructed report pages may provide enough information to figure out the immediate problem, but sometimes more long-term analysis is required, which is where the Analysis site comes in. Currently Andreas scrapes the report pages, which mostly works, but in the future, when we start to include much more metadata, this isn't going to be suitable. As such, returning a hash or JSON object is needed.

During the QA Hackathon I did manage to write the server side software to provide the API needed. However, I didn't get to finish the client side software. Over the following weeks I cleaned up the code, added some tests and finally CPAN-Testers-WWW-Reports-Query-Report was released to the world.

The module does a little more than I first planned, but that is mainly because I wanted people to be able to use the report returned in any way they see fit. So you can retrieve a Metabase::Fact object (the default), a Perl hash representation or the original JSON object. All that is required is the report's Metabase GUID or the cpanstats ID. So how does it work?

The Options

You don't need to provide any options, as the module will use some defaults. Below is an example of the options, which incidentally happen to be the default options:

my %options = (
    as_json => 0,
    as_hash => 0,
    host    => 'http://cpantesters.org'
);

I allow you to define the host, just in case we supply a similar API from the Metabase directly, or a new mirror (possibly even a local one) supports the same API.

The Object

Now we establish the object. If an options hash isn't provided, the defaults will be used.

my $query = CPAN::Testers::WWW::Reports::Query::Report->new( %options );

The Report

# get by cpanstats ID
my $result = $query->report( report => 40000000 );
# get by Metabase GUID
$result = $query->report( report => '0b3fd09a-7e50-11e3-9609-5744ee331862' );

The report method also allows you to override the as_json and as_hash attributes, should you need to retrieve the report slightly differently in some cases.

# force return as JSON
my $result = $query->report( report => 40000000, as_json => 1 );

The Last Error

In the event something goes wrong, you can hopefully get some information about what the problem was via the error() method. I can't guarantee it will be the whole picture, but hopefully it will give you some indication of what the problem might be.

my $error = $query->error;
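Putting the pieces together, a minimal end-to-end sketch might look like the following; the GUID is the example used above, and as_hash is enabled so the result can be dumped directly:

use CPAN::Testers::WWW::Reports::Query::Report;
use Data::Dumper;

my $query = CPAN::Testers::WWW::Reports::Query::Report->new( as_hash => 1 );

# retrieve by Metabase GUID (a cpanstats ID works equally well)
my $result = $query->report( report => '0b3fd09a-7e50-11e3-9609-5744ee331862' );

if ($result) {
    print Dumper($result);
} else {
    warn "query failed: " . $query->error . "\n";
}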

The Conclusion

If you have suggestions for improvements, or spot problems in the current code, please let me know. If you do use the code, whether for mash-ups or for personal analysis, it would be nice to hear you've found the module useful. You could even blog about it :)

Once again, apologies for the delayed summary this month. The day job has rather taken up a lot of my time for the past few months, and I've had a few other projects that have needed my attention. Hopefully next month I'll fare better.

First off for this month, I'd like to advertise the 2013 QA Hackathon, taking place in Lancaster, UK from 12th April to 14th April. This event will be the 6th QA Hackathon, and is looking to be as successful as previous years. There are plenty of projects to work on, and plenty of developers willing to pitch in and help out. Plans for CPAN Testers include preparing for the move from AWS. We have an opportunity to clean up the problems of data storage, or rather data search, by moving to a new Metabase, which David Golden has been working on. There will be plenty of other QA projects worth watching too, so look out for the various blog posts and code releases during and after the event.

David Oswald ended January on the mailing list with a question about tests for Bytes::Random::Secure on 5.6.*. It highlighted the usefulness of CPAN Dependencies, providing the data to show that prerequisites were the problem, rather than the distribution itself. It also highlighted why referencing cp5.6.2an at cpxxxan.barnyard.co.uk could help out those users who need reliable installs of particular distributions for their version of Perl.

The discussion regarding NFS continued into the month, as detailed in my delayed summary from last month. Buddy Burden asked about comparing test reports. Although not quite the same, it does follow on from other similar requests about getting at data within reports. There are two problems at the moment: the first is reliably getting at the metadata of a particular report. Although I could open up an API to this data, it doesn't help with the second problem, which is the ability to compare structured data. Currently reports are compiled mostly as a single piece of text. To be able to properly compare reports, it would be better to use structured data. At the last QA Hackathon, making this more of a reality was discussed and some code was even released to help push it forward. However, there is still some way to go, and hopefully at the 2013 QA Hackathon we may see some more movement on this. In the meantime, the CPAN Testers Analysis site may well provide some of the comparisons you are looking for.

Matthew Musgrove asked if he could use the same distroprefs with multiple smokers. For those unfamiliar with the distroprefs files, these are the files used by testers to help filter out distributions that are problematic when running automated tests. They are also referred to as the ignore lists. These files are typically used with individual smokers and are not shared between them. However, there are a few ways this could be handled. Some use a source code repository (sometimes on GitHub) to sync between their smokers. David Cantrell told us he uses the more traditional method of a shell script and rsync. However you share your distroprefs between smokers, bear in mind that you'll be excluding distributions that may not be a problem on some of your smokers. As such, it may be worth retrying some distributions over time to see whether old issues have been resolved.
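As one possible sketch, each smoker could point CPAN.pm's standard prefs_dir option at a directory that is kept in sync between machines with git or rsync; the path below is purely illustrative:

use CPAN;
use CPAN::HandleConfig;

# point CPAN.pm at the shared distroprefs directory (path is illustrative);
# the directory itself is synced between smokers with git or rsync
CPAN::HandleConfig->load;
$CPAN::Config->{prefs_dir} = '/home/smoker/shared/distroprefs';
CPAN::HandleConfig->commit;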

David Oswald topped and tailed the month with another post asking for help understanding a FAIL for Bytes::Random::Secure. David had identified that the failure was due to Crypt::Random::Seed not being installed, even though it was explicitly listed in the prerequisites. David Cantrell pointed to the very long @INC, which some smokers use to avoid installing distributions, instead referencing the installer's build directory. This can be problematic for two reasons. Firstly, long-running smokers can blow the length of @INC, such that paths added to the end simply get ignored. Secondly, if the installer has a limit on the amount of disk space used, it may remove older distributions before running tests. To avoid this, many testers automatically stop and restart their smokers to keep @INC at a manageable length.
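A quick, rough diagnostic for the first problem is to check how large @INC has grown on a long-running smoker; the threshold below is an arbitrary illustration, not a hard limit:

# report the number of @INC entries and their combined length
my $joined = join ':', @INC;
printf "%d entries, %d chars\n", scalar @INC, length $joined;
warn "\@INC looks suspiciously long\n" if length($joined) > 4096;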

We passed 28 million reports at the beginning of February (congratulations to Peter John Acklam for his PASS submission of Compress-Zlib-Perl), and passed 29 million (even more congratulations to Peter John Acklam for his UNKNOWN submission of Regexp-Assemble) at the beginning of March. I suspect we may well hit 30 million around the time of the 2013 QA Hackathon, which would be rather nice to celebrate while many of us are together. In February, Thomas Peters (WETERS) became the 6000th unique PAUSE account to upload a distribution. It's interesting to see that the number of authors currently submitting to CPAN has stayed pretty constant over the past few years. We're also gaining around 30-40 new active authors every month too.

That's all for now. Hope to see you in Lancaster next month, and I'll do my best to get the next summary out before then too!

With 20 million reports, CPAN Testers is very definitely one of the biggest online repositories of test reports for any programming language or software application. While other languages and applications may have larger communities than CPAN Testers, the Perl community's commitment to testing has uniquely enabled CPAN Testers to build on the benefits from many areas of the testing community. The TAP protocol has now been incorporated into several other language and application testing frameworks, and stand-alone applications, such as Smolder, have been able to harness its output to present results in a way which best highlights problem areas. CPAN Testers too has been striving to improve the way we present and provide analysis of reports.

With regards to the analysis CPAN Testers provides, a notable example has been the work by Andreas König with the CPAN Testers Analysis site. By taking a selection of test reports, his tools have the ability to find common areas which help authors to pin-point unusual problems. One aspect this has recently helped with is identifying a problem with development releases, which in themselves don't necessarily fail their tests (possibly because they don't include tests which cover all scenarios), but when used within other modules may highlight faults in the prerequisite. The problem with this is that the report is associated with the calling distribution, not the prerequisite which has the fault. Andreas' tools were recently able to identify a prerequisite that was causing concern for a number of distributions, even though the fault lay elsewhere. This in turn led to a lengthy discussion on the CPAN Testers Discussion List about how we could best approach this. The difficulty, though, is writing parsers that would automatically determine what was a fault. With many of the test reports it requires a human to determine this for single reports, while Andreas' tools can only highlight where further human investigation is needed, and they need many reports to determine whether there is a cause worth investigating.

In the short term it is possible to re-associate reports with other distributions by sending a request to me, but in the longer term it would be better to enable an automated process. This is where the Admin site comes in again. Although that's been on hold for a while, I think this is another feature which could simplify the process of correcting mis-filed reports. As such, I hope to continue on that site soon.

The discussion on the mailing list asked whether we could more appropriately filter these types of reports, but didn't lead to a conclusive answer. However, it will be a discussion topic for the forthcoming QA Hackathon. At the end of the month, a number of testing hackers will be assembling in Paris to further Perl's testing infrastructure, processes and code. It will be a good opportunity to have a number of different views discussed in one room with all those likely to be tasked with writing the solutions. This QA Hackathon looks to be the biggest so far, with roughly 40 people attending. There is a diverse set of projects being planned for the 3 days, so expect lots of output afterwards.

Gabor Szabo highlighted an issue running CPAN-Reporter with Net-HTTPS-6.02 last month on the mailing list. Thanks to Manoj Kumar, we know that if you experience a "fact submission failed" error, you can try rolling back to Net-HTTPS-6.01. Tim Bunce also asked about testing on Windows. Although Windows is still a regularly tested platform, the volumes are considerably less than for other, Unix-based OSes. David Golden put in a lot of work getting CPAN-Reporter and Strawberry Perl working on Windows, so if you have the spare capacity and want to get involved with CPAN Testers, Windows is certainly an ideal platform on which to make a difference. And if you're running Windows, Cygwin would welcome some more regular reporting too.
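If you need to roll back, one way is to install the specific older release by its full distribution path; the author directory below is a placeholder, not verified:

# AUTHOR stands in for the actual PAUSE directory of the release
perl -MCPAN -e 'install "AUTHOR/Net-HTTPS-6.01.tar.gz"'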

Lastly, for now, the problems with the SQLite database have surfaced again. Although I can't definitively say why there is a problem, it would seem to be related to the sheer size of the data CPAN Testers stores. The uncompressed version is now over 8GB. To reduce the bandwidth, and to better provide data in a way that consumers can store and use, an API is being written to supply a record set rather than the full data set. Expect further work on this during the QA Hackathon.

Last month CPAN Testers was finally given a deadline to complete the move away from SMTP to HTTP submissions for reports. Or perhaps more accurately, to move away from the perl.org servers, as the volume of report submissions has been affecting the support of other services in the Perl eco-system. The deadline is 1st March 2010, which leaves just under 2 months for us to move to the CPAN Testers 2.0 infrastructure. Not very long.

David Golden has now put together a plan of action, which is being rapidly consumed and worked on. The first fruit of this has been an update to the CPAN Testers Reports site. The ID previously visible on the site, referring to a specific report, is now being hidden away. The reason for this is that the current ID refers to the NNTP ID that is used on the perl.org NNTP archive for the cpan-testers mailing list. This ID is specific to the SMTP submissions and includes many posts which are not valid reports. As such, we will be moving to a GUID as supplied by the Metabase framework, with existing valid SMTP-submitted reports being imported into the Metabase. The NNTP ID will eventually be completely replaced by the Metabase GUID across all parts of the CPAN Testers eco-system, including all the databases and websites. As such, you will start to see a transition over the next few weeks.
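For any code that has to cope with both identifier schemes during the transition, a simple heuristic is enough to tell them apart; a minimal sketch, with the regex reflecting the standard layout of Metabase GUIDs:

# distinguish a legacy numeric NNTP ID from a Metabase GUID
sub id_type {
    my $id = shift;
    return 'nntp' if $id =~ /^\d+$/;
    return 'guid' if $id =~ /^[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}$/i;
    return 'unknown';
}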

The second change, which has now been implemented, is to present the reports via the CPAN Testers Reports site and not the NNTP archive on the perl.org servers. Currently the presentation of a report (e.g. this report for App-Maisha) is accessed via the reports pages for a distribution or an author, but it will also be accessible in a similar manner across all the CPAN Testers websites. There is a large batch of early reports currently missing from the database, but these are being updated now, and will hopefully be complete within the next few days. If you have any issues with the way the reports are presented, including any broken or missing links from other parts of the site, please let me know.

In all this change, there is one aspect that may worry a few people, and that is the "Find A Tester" application. For the next few months it will still exist, but the plan is to make the Reports site better able to provide tester contact information. In addition, the testers themselves will soon have the ability to update their own profiles. Initially this will be used to link email addresses to reports and then map those email addresses to a profile held within the Metabase, but in the longer term it will be used to help us manage report submissions better.

David Golden is concentrating on the Client and Metabase parts of the action plan, and I am working on porting the websites and the 'cpanstats' database. If you have any free time and would like to help out, please review the action plan, join the cpan-testers-discuss mailing list, and let us know where you'd like to help. There is a lot of work to be done, and the more people involved, the better the spread of knowledge in the longer term.

After David announced the deadline last month, all the testers have throttled back their smoke bots. This saw a dramatic reduction in the number of reports and pages being processed, and enabled the Reports Page Builder to catch up with itself, to the point where it frequently had fewer than 1,000 requests waiting. That changed yesterday with the changes to the website, as every page now needs to be updated. It typically takes about 5 days to build the complete site, so this quiet period will allow the Builder to rebuild the site without adversely affecting the current level of report submissions. Expect the site to reach a more manageable level of processing some time next week. To help monitor the progress of the builder, a new part of the Reports site, the Status Page, now checks the status of all outstanding requests every 15 minutes, providing a 24-hour perspective and a week-long perspective.

A new addition to the family was also launched recently: the CPAN Testers Analysis site, which Andreas König has been working on to help authors identify failure trends from reports for their distributions. Read more on Andreas' blog.

Last month we had a total of 168 tester addresses submitting reports. The mappings this month included 22 total addresses mapped, of which 2 were for newly identified testers. Another low mapping month, due to work being done on CPAN Testers as a whole.

My thanks this month go to David Golden for finding the time to write an action plan, and to his wife for allowing him the time to write it, as well as for working on all the other areas involving CPAN Testers and the Metabase :)