After a thorough feedback session from readers on my PCLOS review and my Five Way Comparative Test, I have come to the uncomfortable conclusion that my approach to reviews needs improvement. I spend quite a bit of time on every OS I review, but that lends itself to subjective responses.

A little subjective opinion is needed in every review, for sure, but perhaps some objective criteria, a measurable scoring system, is needed too.

One thing I noted was that PCLOS performed a lot better in my comparative test than my review of it suggested it should. Mint performed slightly worse than I expected. There must be a way to use my five OS ratings as a standard against which to measure new operating systems. Below I will throw some ideas out there, and maybe come up with a rating system that not only removes some of the subjective “like” factor from my reviews, but also makes sense.

Usability is Important

As far as OS reviews go, everyday usability is paramount in my opinion. That was the whole point of my reviews in the first place. The problem with my approach was that I had no metric for rating feedback from myself and the people I ask to try whatever OS I give them to play with. My wife uses Ubuntu and Xubuntu, so her opinion is skewed (she is very vocal in her criticism of the new window button layout in Lucid, btw), and my mother still uses Intrepid.

So, using my Five OS Comparative Test as a base, I am thinking of starting with the six tests from it, namely:

  • Join a wireless network.
  • Join a wired network.
  • Set up System Proxy.
  • Get Network/Connection Information.
  • Change Desktop Background.
  • Change Desktop Theme.

I can score these on the number-of-clicks test I have used so far. I expect the method to be refined over the course of a few reviews.

To these six I need to add a few more. As per reader feedback I will use:

  • Setting up a USB printer.
  • Using a Network Printer/Shared printer.

Now those two will be ease-of-use tests, but in terms of printer support they will be very hardware dependent. I have had no problems printing to a Samsung network printer from Ubuntu in our office, but users with Win7, especially the 64-bit version, have had some hassles. Don’t even get me started on HP/Win7 support problems.

Also per feedback I will add these:

  • Setting up a network share.
  • Using a network shared drive.

That brings the usability tests up to ten.

System Information Tests

Well here things get tricky. As mentioned in previous entries, I simply do not have access to enough hardware to do proper compatibility tests. I can do some basic stuff, like:

  • How easy it is to obtain hardware information (CPU type/clock speed, amount of RAM, disk size, OS info).
  • How easy it is to obtain usage information (network download speed, CPU usage, RAM usage).
  • “Weight” of the OS (how much RAM/CPU it uses when idle, with no programs running).
  • Boot time.
  • Architecture support (64-bit/32-bit, etc.).
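As a rough way to take the “weight” reading above, a small script like the following could report idle RAM usage. This is my own sketch, Linux-specific, reading the standard field names from /proc/meminfo; it is not part of the proposed test suite itself:

```python
def idle_ram_used_mb():
    """Rough idle-RAM figure read from /proc/meminfo (Linux only).

    Counts MemTotal minus free, buffer and cache memory, roughly
    the way tools like `free` report "used" memory.
    """
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # values are reported in kB
    used_kb = (info["MemTotal"] - info["MemFree"]
               - info.get("Buffers", 0) - info.get("Cached", 0))
    return used_kb // 1024

print(idle_ram_used_mb(), "MB used at idle")
```

Taking a few readings a minute or so after boot, with no programs open, would give a comparable “weight” number across distros.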


Here I can also focus on more subjective things: look, feel, ease of menu use, packaging, and so on. Help dialogs can be added here as a value to be scored.

Desktop Readiness

I need to include this as a separate class. It will focus purely on the desktop-ready tools that are provided. Does it have an office suite? What kind of browser? Productivity tools? And so on.


Having all these tests is nothing without a scoring baseline. What I will do, over the weekend, is take the five operating systems I have and expand the test to include the tests not done so far. Since there will be scoring information for all five, they will provide the base against which all newcomers are compared. The same scoring system I have used will suffice: if an OS equals the best OS in a test, it scores 5. The score drops depending on how it places against the previously tested OSes. If it betters the top-scoring OS in a test, it scores 6.

Say, for instance, we test Xubuntu (my next tested OS, btw…) and it betters PCLOS in setting the network proxy; it then scores 6.
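To make the rule concrete, here is a rough sketch in Python of how one test could be scored against the five baseline results. The click counts are made-up numbers, and the "drop one point per baseline that beats you" detail is my own reading of the rule, assuming fewer clicks is better:

```python
def score_test(new_clicks, baseline_clicks):
    """Rank-based score for one test, per the proposed rule (a sketch).

    Fewer clicks is better. Better than the best baseline: 6.
    Equal to the best baseline: 5. Otherwise, drop one point for
    each distinct baseline result that beats the newcomer.
    """
    best = min(baseline_clicks)
    if new_clicks < best:
        return 6
    if new_clicks == best:
        return 5
    beaten_by = sum(1 for c in set(baseline_clicks) if c < new_clicks)
    return max(5 - beaten_by, 0)

# Made-up baseline click counts for the five already-tested OSes:
baselines = [4, 6, 7, 7, 9]
print(score_test(3, baselines))  # betters the best baseline -> 6
print(score_test(4, baselines))  # ties the best baseline    -> 5
print(score_test(8, baselines))  # beaten by three distinct results -> 2
```

The exact drop-off per placing is the part I expect to tweak; the point is only that the comparison is mechanical once the baseline numbers exist.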

I am sure I will refine the scoring over the weekend, especially as some tests cannot be scored like that. PCLOS having only 32-bit support will need a sane score applied to that omission. Maybe 1 point for supporting both 32-bit and 64-bit architectures, 0 for supporting only one of the two, and then another point for supporting more than the normal PC architectures, like PowerPC for instance. PCLOS would then score 0, out of a possible maximum of 2.
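That architecture rule is simple enough to jot down as code. A minimal sketch of it, using illustrative Debian-style architecture labels of my own choosing rather than anything official:

```python
def score_architecture(archs):
    """Architecture-support score per the proposed rule (a sketch).

    1 point for supporting both 32-bit and 64-bit x86, plus
    1 point for anything beyond the normal PC architectures
    (PowerPC, for instance). Maximum: 2.
    """
    score = 0
    if "i386" in archs and "amd64" in archs:
        score += 1  # both common x86 flavours supported
    if any(a not in ("i386", "amd64") for a in archs):
        score += 1  # something beyond the normal PC architectures
    return score

print(score_architecture(["i386"]))                      # 32-bit only, PCLOS-style -> 0
print(score_architecture(["i386", "amd64"]))             # both x86 flavours -> 1
print(score_architecture(["i386", "amd64", "powerpc"]))  # plus PowerPC -> 2
```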


That should about cover it, don’t you think? I will have a look at expanding the Five Way Test over the weekend, and make refinements to my scoring system in order to make future reviews less biased and more fair. That said, expect a fair helping of opinion thrown in for good measure. What would a blogger be without his soapbox?

What do you guys think, did I miss anything?
