Wednesday, July 09, 2008
Re: Regression test framework
3.0t, released in December 2007, has turned out to be extremely stable, and I'm busy preparing the first non-beta release candidate. The problem with being stable is the risk of regression, so I've been nervously putting together a new regression testing framework to help exercise obscure bugs.
The most critical bug reported since 3.0t is a problem specific to the non-blocking API. Certain bugs affect only the API, not the xdelta3 command-line application, because the command-line application uses blocking I/O even though the API supports non-blocking I/O. Issue 70 reported an infinite loop processing a certain pair of files. The reporter graciously included sample inputs to reproduce the problem, and I fixed it in SVN 239.
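The distinction is easy to see in code. Here is a minimal sketch of the non-blocking encode loop, written against the public API in xdelta3.h: xd3_encode_input() performs no I/O itself and instead returns a code telling the caller what it needs next. The pump() helper and its FILE* argument are my own invention for illustration; the command-line application gets its blocking behavior by wrapping a loop like this around ordinary blocking reads and writes.

#include <stdio.h>
#include "xdelta3.h"

/* Sketch: drive the non-blocking encoder until it needs caller action.
 * Error handling and source-block reads are elided. */
static int pump(xd3_stream *stream, FILE *delta_out)
{
    for (;;) {
        switch (xd3_encode_input(stream)) {
        case XD3_INPUT:      /* caller: supply more data via xd3_avail_input() */
            return XD3_INPUT;
        case XD3_OUTPUT:     /* delta bytes are ready in the stream */
            fwrite(stream->next_out, 1, stream->avail_out, delta_out);
            xd3_consume_output(stream);
            break;
        case XD3_GETSRCBLK:  /* caller: load the requested source block */
            return XD3_GETSRCBLK;
        case XD3_GOTHEADER:  /* informational codes: keep pumping */
        case XD3_WINSTART:
        case XD3_WINFINISH:
            break;
        default:
            return -1;       /* an error such as XD3_INTERNAL */
        }
    }
}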
But I wasn't happy with the fix until now, thanks to the new regression testing framework. In roughly 50 lines of code, the test creates an in-memory file descriptor, then derives two slight variations intended to trigger the bug. SVN 256 contains the new test.
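For a flavor of what such a test can look like, here is a rough sketch along the same lines: two slightly different in-memory buffers stand in for files, the encoder is driven through the non-blocking API in small windows, and a step bound turns a regression of the issue-70 hang into an assertion failure rather than a stuck test run. The sizes, the single-byte mutation, and the helper name are all invented for illustration; this is not the actual test committed in SVN 256.

#include <assert.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>
#include "xdelta3.h"

#define FILE_SIZE  (1U << 20)  /* hypothetical in-memory "file" size */
#define BLOCK_SIZE (1U << 12)  /* feed input in small non-blocking windows */
#define MAX_STEPS  (1U << 22)  /* reaching this bound means the encoder is looping */

static void test_issue70_regression(void)
{
    uint8_t *source = malloc(FILE_SIZE);
    uint8_t *target = malloc(FILE_SIZE);
    size_t i, offset = 0, steps = 0;
    assert(source != NULL && target != NULL);

    /* Create one pseudo-random "file" and a slight variation of it. */
    srand(42);
    for (i = 0; i < FILE_SIZE; i++) { source[i] = rand() & 0xff; }
    memcpy(target, source, FILE_SIZE);
    target[FILE_SIZE / 2] ^= 1;

    xd3_stream stream;
    xd3_config config;
    xd3_source src;
    memset(&stream, 0, sizeof(stream));
    memset(&src, 0, sizeof(src));
    xd3_init_config(&config, 0);
    config.winsize = BLOCK_SIZE;
    assert(xd3_config_stream(&stream, &config) == 0);

    src.blksize  = BLOCK_SIZE;
    src.curblkno = 0;
    src.onblk    = BLOCK_SIZE;
    src.curblk   = source;
    assert(xd3_set_source(&stream, &src) == 0);

    /* Drive the encoder, counting steps so that a regression of the
     * issue-70 infinite loop fails the test instead of hanging it. */
    int ret = XD3_INPUT;
    for (;;) {
        assert(++steps < MAX_STEPS && "encoder appears to be looping");
        if (ret == XD3_INPUT) {
            if (offset == FILE_SIZE) { break; }  /* all input consumed */
            if (offset + BLOCK_SIZE == FILE_SIZE) {
                xd3_set_flags(&stream, stream.flags | XD3_FLUSH);
            }
            xd3_avail_input(&stream, target + offset, BLOCK_SIZE);
            offset += BLOCK_SIZE;
        }
        ret = xd3_encode_input(&stream);
        switch (ret) {
        case XD3_INPUT:
            break;                        /* provide the next window */
        case XD3_OUTPUT:
            xd3_consume_output(&stream);  /* discard the delta; we only
                                             care that encoding terminates */
            break;
        case XD3_GETSRCBLK:               /* point at the requested block */
            src.curblkno = src.getblkno;
            src.curblk   = source + src.getblkno * BLOCK_SIZE;
            src.onblk    = BLOCK_SIZE;    /* FILE_SIZE is block-aligned */
            break;
        case XD3_GOTHEADER:
        case XD3_WINSTART:
        case XD3_WINFINISH:
            break;                        /* informational; keep going */
        default:
            assert(!"unexpected xd3_encode_input() result");
        }
    }

    assert(xd3_close_stream(&stream) == 0);
    xd3_free_stream(&stream);
    free(source);
    free(target);
}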
So just what is it that makes me feel 3.x is so stable? It's e-mails like this from Alex White at intralan.co.uk:
Hi Josh,
FYI:-
Been using XDelta for a while now, been working flawlessly (I wish all software could be this stable), been patching up to 1tb of data per day (across many servers), largest single file to date 70gb!!!
Did some performance testing and with standard SATAII drives with both sources and the patch being created on the same drive the processing was around 300mb per minute (creating patches), setup a dual drive configuration where one of the sources was on a second drive and then the processing was around 1gb per minute (creating patches).
Best large file patch to original file size ratio:
Original file size: 56,147,853,312
Patch file size: 299,687,049
Patch file reduction over original file size: 99.47%
This file took 2 hours and 21 minutes to patch in a real world setting.