Talking 'Bout A Review Revolution
by: dkpatriarch

Ed Intro: "I asked dkpatriarch (David Hilton) from the Aust-Xbox-Forums if he'd like to contribute some pieces to the XboxOZ360-Blog, as I often find his news items and points of view very interesting and well worth reading. So here's his first piece, regarding the latest kerfuffle over GameSpot and Eidos's "preview/review" of one of their games, which saw a high-ranking GameSpot staffer "removed" from his position, with various "accusations" being passed back and forth over the net. - Welcome aboard, mate."

So Gamespot.com, one of the most popular gaming sites in the universe, got caught out doing a "cash for comment" deal with Eidos. The well-publicised story of the GameSpot editor and reviewer who was sacked for giving Eidos' game Kane & Lynch a low score of 6, while GameSpot had a lucrative advertising deal with Eidos, has been doing the rounds on forums and in emails all over the place. GameSpot denies it, of course, but there are too many coincidences to truly buy the PR attempts.

But I'm not interested at the moment in the arguments that have been raised about online gaming journalism's integrity; that's being dealt with elsewhere. I'm interested in the power of a number. It was the '6' that caused the commotion; the relatively low overall score given to the game. This scandal reveals just how important review score numbers are; so important, perhaps, that publishers like Eidos and others will invest in a website expecting a better overall review score to sell more games for them.

With the rise of online gaming magazines and websites, the game review score now reaches even more people than it did in the trusty gaming mag we had to pay for. These online game websites generally have no subscription income and so rely on advertising dollars. That opens them up to pressure from advertisers. Advertisers know how popular and how important online review scores are, or they wouldn't use the website.
How many of us, myself included, visit a site like GameSpot, or even better, Gamerankings or Metacritic, to see how a game 'scores' overall? One single number and we know whether the game is gold or rubbish. We don't even need to actually read a review.
The problem, besides the possible conflict-of-interest issue, is that these powerful review scores do not really sum up a game properly. They certainly do not represent the tastes of all gamers. Nor are they consistent.
For example, the latest 360 Magazine scores Assassin’s Creed two stars out of five. That’s on par with this month’s review scores for Beowulf and Kengo Zero, and higher than Cars: Mater-National and Viva Pinata: Party Animals. However, it is lower than Scene It, a game they say “fails to beat Buzz! in almost every respect”, and Tomb Raider Anniversary, which it describes as treading “a very fine line in that perhaps its very existence (on 360) is questionable when you consider what you’re really getting here is just a hi-res PS2 game”. So a glorified DVD game apparently worse than Buzz!, and a hi-res PS2 game, are better than Assassin’s Creed. Bollocks. Whatever its failings, it does not deserve that kind of comparison. Assassin’s Creed has also been getting high review scores from other reviewers, and without a doubt it displays amazing technical achievements.
So here is the problem in microcosm: review scores as they stand do not really work. A ‘Big Brain Academy’ (which GamePro gave a 9) cannot really compare to the latest Zelda or Mario Galaxy, despite the similarity of the scores given by game sites. They may all be fun, but which is the greater technical achievement? Nor can you compare a score for COD4 or Halo 3 to one for Viva Pinata or Assassin’s Creed. They simply aren’t the same kind of game. Some gamers want high-level technical polish like graphics and tight controls; others are more impressed by the ‘fun factor’ of a game, or its replayability. All gamers want to know that they are comparing apples to apples, and a straight comparison of overall scores does not contextualise the game. So, short of forcing every gamer to actually read all the reviews, or eliminating the overall score altogether, what can be done to improve review scoring?
Do you trust reviews/previews . . . ?
I believe the overall game review score needs to be split into at least three parts. Category 1 would be an overall score for technical aspects like sound, graphics, control scheme, frame rates, clipping, design, and so on. Category 2 would be an overall score for gameplay, or “fun factor”, and replayability. Category 3 would be a “better than” or “worse than” comparison with similar games in the genre, or a “new game genre” tag if it is a new type. In this way a review of Big Brain Academy could say that technically it is rather simple, but that it is a lot of fun and a new genre in videogaming, without its score making it seem like the same experience as Mario Galaxy or Zelda. The reviewer of Assassin’s Creed mentioned above would be able to show that technically Assassin’s Creed is a real achievement, but that in his opinion the ‘fun factor’ was lacking, without giving a ludicrously low two stars.
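To make the proposal concrete, here is a minimal sketch of that three-category report card in Python. The class name, the 10-point scales, and the example scores are all illustrative assumptions of mine, not anything from an actual review system:

```python
from dataclasses import dataclass

@dataclass
class ReviewScore:
    """Hypothetical three-category review, per the model described above."""
    technical: float   # Category 1: sound, graphics, controls, frame rate, design
    fun: float         # Category 2: gameplay / "fun factor" and replayability
    comparison: str    # Category 3: "better than"/"worse than" a genre peer,
                       # or "new genre" if nothing comparable exists

    def report_card(self) -> str:
        # One line, three separate judgements -- no single collapsed number.
        return (f"Technical: {self.technical}/10 | "
                f"Fun: {self.fun}/10 | vs genre: {self.comparison}")

# Illustrative (made-up) scores: a simple but fun party title can now be
# praised without its number implying it rivals a technical showcase.
big_brain = ReviewScore(technical=4.0, fun=8.5, comparison="new genre")
print(big_brain.report_card())
```

The point of the structure is that there is deliberately no method that averages the three categories back into one overall number; the reader compares each axis on its own terms.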
Sure, some review scoring systems incorporate elements of this model, but most rely on gamers reading the full review to get the clearer picture, and still lean on the overall score to mark the game as good or bad. I think reviewers need to move away from having one overall number represent everything a game offers, and gamers should not limit themselves to a ‘Metacritic’ average of these single overall scores. These numbers make or break a game, and it really is a shame for the many titles that might have enjoyed greater support had they not been distilled to a single puny representation of their merit. It is also a shame when titles end up purchased by gamers who are impressed by the high score, only to find they wanted more fun, or better graphics, or a different genre altogether.
At the very least, if the review you skipped to the end of used the three-category system, it might cost Eidos and their ilk three times as much to ensure positive scores… or it would be much more obvious if they and a pressured reviewer tried to cheat on the game’s report card.
How about you? Have you got some ideas for a game review revolution? Share them in the comments section below!