
Would Tucows Have Failed Andy Brice’s “Software Awards” Test?

Five squishy cows turning their backs to a pile of $20 bills.

Suspicious of the large number of “5-star” ratings his software got from download sites, Andy Brice of Successful Software decided to run a little experiment. He took a text file containing these words:

This software does nothing.

It doesn’t even run.

It was created as an experiment to see how many shareware awards it would get.

See the results of the experiment at:

www.successfulsoftware.net

He gave the file an .exe extension and the asking-to-be-caught name “awardmestars”. He also included a PAD file — that’s “Portable Application Description”, a standard for describing software in the shareware industry — that clearly indicated the software did nothing at all.

In spite of all the warnings he provided, plus the fact that it was a non-functional non-application, he still managed to rack up these 16 awards:

The 16 awards Andy Brice got for his non-functional non-application.

As regular readers of this blog know, I work for Tucows, whose original business was reviewing and hosting downloadable shareware.

Would we have given Andy Brice’s non-application an award? No. Why?

Silhouette of 5 cows: “These 5 cows don’t come easy”

An award from Tucows is not given lightly. In fact, just to make it onto our site, a software title needs to maintain a minimum three-cow rating, and it needs to generate downloads. Titles that do not maintain an appropriate level of popularity are removed from our library.

We offer a truly “best of” collection of software. One of our team members reviews every single piece of software that is submitted; in fact, over 70% of submissions to Tucows are rejected because they fail to meet our stringent ratings criteria. In a nutshell, for Windows applications (we have different rating scales for Mac, Linux, games, etc.), Tucows uses a 56-point rating scale: usability carries the largest share at 21 points, Help, Documentation and Support account for up to 14 points, program enhancements for 10 points, and the reviewer’s opinion for the remaining 11 points. The Tucows rating guide is so standardized that a third-party site provides a “Tucows Rating Calculator”, where software authors can analyze their title to get an idea of how it would rate on Tucows.
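To make the arithmetic concrete, here is a minimal Python sketch of how a calculator in the spirit of that third-party “Tucows Rating Calculator” might total a Windows submission’s score. The four category maximums (21, 14, 10 and 11 points, totalling 56) come from the paragraph above; everything else — the function name, the category keys, the example subscores — is an illustrative assumption, not the actual tool.

```python
# Category caps for Windows titles, as described above (21 + 14 + 10 + 11 = 56).
CATEGORY_MAX = {
    "usability": 21,
    "help_docs_support": 14,
    "program_enhancements": 10,
    "reviewer_opinion": 11,
}

def total_score(subscores):
    """Clamp each category score to its cap and return the total out of 56."""
    total = 0
    for category, cap in CATEGORY_MAX.items():
        awarded = subscores.get(category, 0)
        total += max(0, min(awarded, cap))
    return total

# Example: a strong title that loses a few points on documentation.
print(total_score({
    "usability": 19,
    "help_docs_support": 11,
    "program_enhancements": 9,
    "reviewer_opinion": 10,
}))  # prints 49 (out of a possible 56)
```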

Cross-posted to the Tucows Developer Blog

4 replies on “Would Tucows Have Failed Andy Brice’s “Software Awards” Test?”

Coincidentally, I have just received a rejection email from Tucows. But it was too late for the article.

“Unfortunately, we are unable to accept your program at this time for the
following reason(s):
This is not a real program. Nice try though :)
I hope your experiment goes well. ”

So there obviously still is a human in the loop at Tucows.

My real software (www.perfecttableplan.com) has a 5-cow rating.

@Andy Brice: There’s a whole department devoted to the task, headed by Greg Weir, a fine gentleman with a deep knowledge of the shareware industry and a Barry White-like basso profundo voice.
