
Monday, 26 January 2009

Top 10 NZ Christian Blogs December 08

Two New Zealand blog sites, Tumeke and Half Done, run monthly stats ranking the top New Zealand blogs on public discourse. Each uses a different formula, so the rankings don't always come out the same.

Previously I have treated Tumeke's stats as the official ones and Half Done's as the shadow report, comparing Half Done's stats, which come out a lot earlier, with Tumeke's from the previous month. This practice seems a bit silly: it makes more sense to compare stats from the same month, and as bloggers do not agree on which formula is better, I figured we should rethink how we present the top 10 Christian stats we run monthly.

Going forward we will publish:

Christian Blog Rankings for Month Year – Tumeke
Christian Blog Rankings for Month Year – Halfdone

And once we have both sets of stats from each:

Top 10 NZ Christian Blogs Month Year

The top 10 will average the rankings from each and will form the official MandM ranking.

The following ranks the Top 10 NZ Christian Blogs (public discourse) for December 08 by taking each blog's position in Half Done's December 08 stats and in Tumeke's December 08 stats and averaging the two to obtain an overall ranking.
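
For the curious, here is a minimal sketch in Python of how the averaging works, using the December 08 rank pairs listed below. Where a blog appears in only one source's list, the single available rank is used on its own.

```python
# Combined December 08 ranking: average of each blog's Half Done and
# Tumeke ranks; None marks a blog that appears in only one list.
rankings = {
    "NZ Conservative": (10, 22),
    "Something Should Go Here, Maybe Later": (17, 24),
    "The Briefing Room": (27, 27),
    "MandM": (25, 50),
    "Kiwi Polemicist": (61, 71),
    "Say Hello to my Little Friend": (65, 94),
    "Put up Thy Sword": (46, 118),
    "Samuel Dennis": (51, 129),
    "Contra Celsum": (117, 91),
    "Star Studded Super Step": (None, 116),
}

def combined_rank(half_done, tumeke):
    """Average the available ranks (lower is better)."""
    ranks = [r for r in (half_done, tumeke) if r is not None]
    return sum(ranks) / len(ranks)

# Print the blogs in order of their combined rank.
for blog, (hd, tk) in sorted(rankings.items(),
                             key=lambda item: combined_rank(*item[1])):
    hd_str = "-" if hd is None else hd
    print(f"{blog}: {combined_rank(hd, tk):g} ({hd_str}, {tk})")
```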

[Rank. Blog: Average (Half Done, Tumeke)]

1. NZ Conservative 16 (10, 22)
2. Something Should Go Here, Maybe Later 20.5 (17, 24)
3. The Briefing Room 27 (27, 27)
4. MandM 37.5 (25, 50)
5. Kiwi Polemicist 66 (61, 71)
6. Say Hello to my Little Friend 79.5 (65, 94)
7. Put up Thy Sword 82 (46, 118)
8. Samuel Dennis 90 (51, 129)
9. Contra Celsum 104 (117, 91)
10. Star Studded Super Step 116 (-, 116)

Other Christian blogs featuring in the top 200 NZ blogs on public discourse for December 08:

Gavin Knight 120 (109, 131)
Blessed Economist 137 (-, 137)
Section 59 Blog 156 (137, 175)
Definitive 164 (165, 163)
NZ Debate 176.5 (159, 194)
The Voice of Reason NZ 199 (-, 199)

The difference between Half Done's and Tumeke's stats is in some instances quite remarkable given that both measure the same month; Put up Thy Sword and Samuel Dennis each differ by more than 70 places between the two lists, yet The Briefing Room scored the same on both.

You can go to each site and read how their respective formulae work, but essentially both base their formulae on unique visits and incoming links; Half Done leaves it there, while Tumeke also factors in the average number of blog posts and comments.
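
As a rough illustration of the difference between the two approaches, here is a sketch in Python. Neither site's exact scaling or weighting is spelled out here, so the straight addition below is an assumption made purely for illustration, and the input figures are made up.

```python
# Illustrative sketch only -- not the actual Tumeke or Half Done formulae.
# Both sites start from unique visits and incoming links; Tumeke also
# counts average posts and comments. Equal, unscaled weighting is assumed.

def half_done_style_score(unique_visits, incoming_links):
    # Half Done: unique visits and incoming links only.
    return unique_visits + incoming_links

def tumeke_style_score(unique_visits, incoming_links, avg_posts, avg_comments):
    # Tumeke: the same two inputs plus average posts and comments.
    return unique_visits + incoming_links + avg_posts + avg_comments

# Made-up figures for a single blog over one month:
print(half_done_style_score(120, 35))        # 155
print(tumeke_style_score(120, 35, 20, 60))   # 235
```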

Some say you shouldn't be rewarded for the volume of posts and comments; quality is what should be measured, and quality is best measured by visits and links to entries. The other argument is that comments are indicative of interest, and that hits alone can be unfairly influenced simply by the type of group a blog is associated with or where it is advertised; such blogs might get disproportionate hits even if they have posted very little all month.

Although I tend to lean towards Half Done's formula (for a range of reasons - nothing whatsoever to do with the fact that we score considerably better on it), I figure averaging both formulae gives you the best of both worlds.

Note: This list does not include Christians who blog but whose blogs are not identifiably Christian and is based on Tumeke's classification methods.

If you think your blog should be on the rankings click here.

9 comments:

  1. I'm actually musing on some sort of change to the formula to stop it producing such ridiculously large numbers, and also to penalise blogs that don't get a NZ ranking less heavily.

    You might have seen my post where I showed that, with a sample of 30, three different ways of using the same three data points produced different results!

  2. I had a look at Tumeke's formula; it didn't seem to make much sense at the time. It wasn't the maths, it was (what I perceived as) a lack of clear explanation. Half Done seemed much simpler to follow.

    I think that comments should be included as well as incoming links and unique page loads. The question is whether posts should count.

    The other question is how they are weighted: 25% from each category, or more for page loads and less for links and comments?

    I have tried claiming my blog at Technorati. It has previously been flagged but no reason is given. I contacted them over a month ago but have had no response. Also, Technorati probably updates the most popular blogs regularly but seems to take a long time between reviews of other blogs.

  3. The comments issue is interesting. Sometimes I think yes, they should be factored in, as they are a definite sign of interest, but then they are too easy to manipulate - scam commenting, not deleting spam, being deliberately provocative, etc.

    I like the idea of weighting them appropriately though rather than every factor being equal.

  4. Hello. You might be able to deduce Tumeke's system by looking at the stats I count, and noting which ones contribute to the Tumeke score: An example here

    Regarding counting comments - next time around I'm going to count up the number of unique commenter names we have over a month (we don't accept "anonymous" commenters, only those with valid Blogger IDs, which includes nom-de-plumes such as yours truly).

  5. The differences in stats seem to result from data differences and not from the different formulae. Tumeke and HalfDone have very different Alexa scores for Samuel Dennis (one is twice the other). For Put up Thy Sword, Tumeke uses a Sitemeter reading that is totally inconsistent with the Alexa score used by HalfDone and with Tumeke's own Alexa reading. Their Alexa readings for The Briefing Room are very similar.

    HalfDone has a much lower NZ Alexa score for MandM than Tumeke does, which explains why you do better in his list.

    I do not think that comment and post numbers make much difference. The absolute values of the unique visitor and link numbers are much higher than the other two indicators, so they carry a greater implicit weighting when the numbers are simply added together (as the sketch below illustrates). For example, 97 out of your 118 score comes from visitors and links. For most other blogs that share is even greater.

    One issue that has not been considered is that, as far as I can tell, Alexa only picks up visits to the blog site. It does not count people who read the blog using an RSS reader. You would have to go to Feedburner or something similar to count the number of RSS readers. So the number of RSS readers should really be included in the statistics too.
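
    A rough sketch of this implicit-weighting point, with illustrative figures only (loosely echoing the 97-out-of-118 split above): when raw counts are simply added, the largest components dominate the score.

    ```python
    # Illustrative figures only: share of a raw additive score contributed
    # by each component when the counts are simply summed.
    components = {"visitors": 70, "links": 27, "posts": 9, "comments": 12}

    total = sum(components.values())  # 118
    for name, value in components.items():
        print(f"{name}: {value} ({value / total:.0%} of the total score)")

    # Visitors and links account for 97 of the 118 points here, so posts
    # and comments barely move the overall ranking.
    ```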

  6. I'm looking forward to January's stats; it's been a bumper month (I monitor my stats with a cool WordPress plugin). But I have to say I like Half Done's ranking a lot more. :)

  7. Hm, interesting point about the Alexa scores.

    I try (last time was an exception) to put up numbers as fresh as possible so people can go to Alexa themselves and see whether or not I'm making them up.

    Unfortunately, as a ranking they do get a bit meaningless for the lower-ranked blogs, as small differences in traffic can make large differences in ranking.

  8. I would think that the difference in the Alexa scores RonMck mentions comes down to the fact that Scrubbone gathers his pretty much the minute the month he is analysing ends, whereas Tumeke tends to do his stats much later, around the 20th of the following month. The two statisticians are taking their scores for the month at different points in time and, of course, traffic continually changes.

    Further, when I try to calculate Tumeke's score (the one at the end that they all seem to be ranked in order of), he seems not to use the Alexa and NZ scores at all; it is as if they are just there for decoration - or am I wrong? (I have definitely noted this on ours in the past but am too sore sitting here to go over there and check other sites' scores.)

  9. I'm not sure how well the rankings measure actual consumption of blog posts, at least in terms of people who regularly read blogs. My posting rate was pretty appalling over December, and last month, yet my stats for December have hardly changed. Sitemeter shows most of my referrals coming from mostly irrelevant Google searches!

    I often wonder if these ranking systems take into account being on people's blogrolls, and how that would affect things.

    Still, interesting reading, keep up the good work.

