The database shows most of them at the Multi-Pass stage, except for the ones that were aborted. My worker at home went down, so only two processes were left working on multi-pass images. I didn't drum up as much interest as I thought I would, so I haven't done much work on the project and only check on it once in a while.
And you're right, if you're willing to use the command line or write your own scripts (or have paid for the GUI version), this tool really doesn't compare in terms of speed. The idea was an "upload and check back later" type system. The API was supposed to let webmasters re-compress their images over time without using their own server's processing time (which is generally frowned upon with shared hosting). However, I got so little interest in the tool that I never found the motivation to finish the API docs.
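For the curious, the workflow I had in mind looked roughly like this. This is just a sketch, not the actual API: the endpoint names, the api.example.com host, and the JSON fields are all made up for illustration, since the real API was never documented:

    import time
    import requests

    API_BASE = "https://api.example.com/v1"  # hypothetical host; real endpoints were never published
    API_KEY = "your-api-key"                 # placeholder credential

    # Submit an image for recompression. The service queues it and does the
    # multi-pass work on its own machines, not the webmaster's server.
    resp = requests.post(
        f"{API_BASE}/jobs",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"image_url": "https://example.com/images/header.jpg"},
    )
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # "Check back later": poll until the job finishes, then grab the result.
    while True:
        status = requests.get(
            f"{API_BASE}/jobs/{job_id}",
            headers={"Authorization": f"Bearer {API_KEY}"},
        ).json()
        if status["stage"] == "done":
            with open("header.min.jpg", "wb") as f:
                f.write(requests.get(status["result_url"]).content)
            break
        time.sleep(60)

In practice a webmaster would run something like this from a daily cron job rather than a polling loop, which is where the "over time" part comes in.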
Maybe I went about it the wrong way. I've thought about axing the web tool and just using the API to spider websites for webmasters who want it. I don't know, any ideas?