Having been made suspicious by the large number of “5-star” ratings his software got from download sites, Andy Brice of Successful Software decided to run a little experiment. He took a text file with these words:
This software does nothing.
It doesn’t even run.
It was created as an experiment to see how many shareware awards it would get.
See the results of the experiment at:
He gave the file an .exe extension and the asking-to-be-caught name “awardmestars”. He also included a PAD file — that’s “Portable Application Description”, a standard format for describing software in the shareware industry — that clearly indicated that the software did nothing at all.
In spite of all the warnings he provided, plus the fact that it was a non-functional non-application, he still managed to rack up these 16 awards:
As regular readers of this blog know, I work for Tucows, whose original business was being a place that reviewed and hosted downloadable shareware.
An award from Tucows is not given lightly. In fact, just to make it onto our site, a software title needs to maintain a minimum three-cow rating, and it needs to generate downloads. Titles that do not maintain an appropriate level of popularity are removed from our library.
We offer a truly “best of” collection of software. One of our team members reviews every single piece of software that is submitted; in fact, over 70% of submissions to Tucows are rejected because they fail to meet our stringent ratings criteria. In a nutshell, for Windows applications (we have different rating scales for Mac, Linux, games and so on), Tucows uses a 56-point rating scale: usability accounts for the largest share at 21 points, Help, Documentation and Support for up to 14 points, program enhancements for 10 points, and the reviewer’s opinion for 11 points. The Tucows rating guide is so standardized that a third-party site provides a “Tucows Rating Calculator” where software authors can analyze their title to get an idea of how it would rate on Tucows.
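To make the arithmetic of that scale concrete, here is a minimal sketch of the point breakdown as described above. The category names, the scoring function, and the fraction-of-maximum output are all illustrative assumptions, not Tucows’ actual calculator or its mapping to cow ratings:

```python
# Hypothetical sketch of the 56-point Windows rating scale described
# above. Category names and the scoring function are illustrative,
# not Tucows' real implementation.

MAX_POINTS = {
    "usability": 21,                     # largest share of the rating
    "help_documentation_support": 14,
    "program_enhancements": 10,
    "reviewer_opinion": 11,
}

TOTAL = sum(MAX_POINTS.values())         # the full 56-point scale


def score_fraction(awarded: dict) -> float:
    """Return a title's score as a fraction of the 56-point maximum.

    Points awarded in each category are capped at that category's
    maximum, mirroring the per-category limits described in the post.
    """
    total = 0
    for category, points in awarded.items():
        total += min(points, MAX_POINTS[category])
    return total / TOTAL


# A perfect score in every category reaches the full 56 points.
perfect = score_fraction(dict(MAX_POINTS))
```

Note how usability alone (21 of 56 points) carries more weight than any other single category, which matches the emphasis described above.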