Why can’t we measure online traffic?

3 minute read

Over the past decade I have investigated various methods of measuring online traffic for the different web sites I’ve managed, and no single measurement seems to bear up to scrutiny. In fact, most do a pretty good job of contradicting each other. Google Analytics? Smarter Stats? Nielsen? Hitwise? Alexa? Take your pick, and think of a number.

This problem was highlighted recently by the publication of the Internet Advertising Bureau’s report (1) showing that online ad spending in Australia has just topped the $2bn mark (doubling over the past four years). Another recent report (2) predicted internet advertising will nearly double again over the next four years, leapfrogging newspapers and TV to become the number one advertising medium in Australia.

The $2bn breaks down roughly as half in search/directories (so, mainly Google AdWords), a quarter in online classifieds (mainly real estate, cars and jobs), and the rest in general online display ads (skyscraper banners and the like).

The reports bemoaned the fact that although you can measure the dollars spent on online advertising, measuring the traffic to and from the ads is a guesstimate at best.

With this growing dominance of internet advertising, it would seem important that we can actually measure the precise number of visitors visiting our web sites (and which pages they view), something the internet was supposed to deliver. All businesses, even the smallest real estate agency office, should know how many people are coming to their online shop window.

Every time a visitor clicks onto a web site, their visit should be recorded in the log files on the server (unless there’s some serious caching going on). Server-installed software (such as ‘Smarter Stats’) can interpret these logs and draw some pretty charts in real time, and send you daily/weekly reports by email so you can keep an eye on things.
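To make that concrete, here is a minimal sketch of how log-analysis software counts hits from an Apache-style ‘combined’ access log. This is an illustration only, not how Smarter Stats or any other product actually works; the log lines and the tiny crawler list are invented for the example.

```python
import re

# Invented sample lines in Apache "combined" log format, for illustration only.
SAMPLE_LOG = """\
203.0.113.7 - - [13/Sep/2010:09:11:02 +0800] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 6.1)"
66.249.66.1 - - [13/Sep/2010:09:11:05 +0800] "GET /listing/42 HTTP/1.1" 200 8231 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
203.0.113.9 - - [13/Sep/2010:09:12:44 +0800] "GET /contact HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (Macintosh)"
"""

# One line of the "combined" format: IP, identity, user, timestamp,
# request, status, bytes, referrer, user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# A deliberately tiny, incomplete crawler list; real filters track hundreds
# of user-agent signatures and keep them up to date.
CRAWLER_HINTS = ("bot", "crawler", "spider", "slurp")

def count_hits(log_text):
    """Return (all_hits, human_hits) for the given raw log text."""
    all_hits = human_hits = 0
    for line in log_text.splitlines():
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # malformed line: skip rather than guess
        all_hits += 1
        agent = m.group("agent").lower()
        if not any(hint in agent for hint in CRAWLER_HINTS):
            human_hits += 1
    return all_hits, human_hits

if __name__ == "__main__":
    total, humans = count_hits(SAMPLE_LOG)
    print(f"raw hits: {total}, after crawler filter: {humans}")  # 3 vs 2 here
```

Even in this toy case the “traffic” figure depends entirely on the crawler filter you choose, which is one reason different packages report different numbers for the same site.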

Alternatively (and maybe additionally, as a double check), embed the Google Analytics code in your web pages and let Google measure the traffic, as most people do. For those with high-traffic sites chasing third-party advertising dollars, Nielsen tags are supposed to do the same thing. Or sign up to Hitwise.

For those that can only really afford Google Analytics (it’s free to install), you may have compared the click-throughs reported by, say, your e-newsletter with the referrals from that same e-newsletter to your web site (as measured by Analytics).

Or those that have paid for advertising on Google AdWords, or elsewhere, may have tried to compare the reported click-throughs with the referral traffic measured from the other direction. Shouldn’t every click counted one way equal the referral recorded the other way?

Of course, none of these measurements tally up. Not even close. I find the figures vary by a factor of two or three across all of the above.

So who’s right? Are they all wrong? I’ve asked many online marketers and ebusiness people and no one seems to have the answer.

End Notes

1. IAB Australia’s Online Advertising Expenditure Report, compiled by PricewaterhouseCoopers, Aug 2010

2. “Internet Advertising set to Dominate”, Lara Sinclair, The Australian, 2nd Aug 2010

Photo – Gizmodo (Google StreetView guys)


19 Comments

  • Craig
    Posted September 13, 2010 at 9:11 am

    Measuring UVs is not that hard, but because there are so many other metrics that can also be measured, companies like to ‘manipulate’ the stats to suit their own agenda and only publicly release those stats.

  • Nick
    Posted September 13, 2010 at 9:20 am

    “There are three kinds of lies: lies, damned lies, and statistics.”

    Unless you know and fully understand exactly how your statistics package collects its statistics, the numbers it puts out might as well be randomly generated.

    E.g. if you analyse your web logs but do not understand the implications of that, you’ll get wonderful big numbers that show your website is doing really well. Remember that web logs include ALL traffic, including web crawlers such as Yahoo’s and Google’s, which don’t really count as traffic. Something like AWStats works well at separating the crawlers out in its reports.

    Google Analytics and the other JavaScript-based ones are quite good at counting actual people, but you have to remember that they rely on JavaScript working; otherwise the visitor will be invisible to you. Extensions like NoScript make that more likely than you’d think.

    So basically, web log analysers will overshoot your actual visitor numbers and the JavaScript-based ones will undershoot them; hence the variance.

    Craig, UVs have their own pitfalls, especially if they are calculated from the logs. Google has 100+ crawling IPs. Should each of those count as a user every time?
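Nick’s overshoot/undershoot point can be put in rough numbers with a toy model. Every figure below is invented purely for illustration; the point is only that one audience yields three different counts depending on the method.

```python
# Toy model: one "true" audience, three different answers depending on
# how you measure it. All numbers are invented for illustration.

visits = (
    [("crawler", False)] * 300    # crawler hits: land in the logs, never run JS
    + [("human", True)] * 900     # ordinary visitors with JavaScript enabled
    + [("human", False)] * 100    # humans with JS blocked (e.g. NoScript users)
)

log_count = len(visits)                                        # raw log analyser sees everything
true_humans = sum(1 for kind, _ in visits if kind == "human")  # what we actually want to know
js_tag_count = sum(1 for kind, js in visits                    # GA-style tag: humans with JS only
                   if kind == "human" and js)

# Logs overshoot the truth, JS tags undershoot it:
print(log_count, true_humans, js_tag_count)  # 1300 1000 900
```

With these made-up proportions the log-based figure and the tag-based figure already differ by over 40%, without either tool being “wrong” on its own terms.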

  • Craig
    Posted September 13, 2010 at 9:26 am

    Nick, keeping on top of crawlers is one of the biggest issues in analytics. I have used Smarter Stats in the past, and if anyone wants an overnight boost in their traffic I would look at it, as it seems to do a pretty poor job of filtering crawlers from real traffic. Currently I mostly use GetClicky and Google Analytics; they are fairly consistent in the traffic they report and seem to do a good job of filtering crawlers.

  • Nat
    Posted September 13, 2010 at 10:43 am

    Good article… and Nick, some good points!

    Just wanted to mention, adding to this already existing ‘fruit salad’ of inconsistencies, the fact that the ‘average’ person will tend to use the terms unique visitor, visitor, hits, visits and page visits all interchangeably!!! (despite repeated education on the issue)

    Seems when it comes to web stats, very rarely is it apples and apples.

  • Charlie
    Posted September 13, 2010 at 10:45 am

    Smarter Stats has an option to exclude all crawlers (non-human traffic) – are you saying this does not work?

    If two measurements come up with the same number (and loads don’t), does this prove those two stats are right, or just that they are making the same mistakes?!

    I find this whole area decidedly dodgy, and after 11 years of asking these sorts of questions and analysing data, no one yet has given me any confidence in any measurement.

  • Robert Simeon
    Posted September 13, 2010 at 10:58 am

    Take the crawlers out and REA would have nothing to talk about – given more and more consumers are becoming aware that UVs are an exaggerated methodology that is not accurate. As I recently said, it’s like David Jones telling consumers how many pedestrians walked through their store each month: UVs are the online version of window shopping.

  • Craig
    Posted September 13, 2010 at 11:05 am

    Hi Charlie. I tried the option in Smarter Stats to exclude crawlers, but it didn’t seem to be reliable. I have more faith in GA and GetClicky continually updating their code to check for crawlers. I would be surprised if Google doesn’t have at least a couple of people dedicated to managing this process, as the number of crawlers is amazing.

  • Nick
    Posted September 13, 2010 at 2:59 pm

    Robert, that’s a great analogy. 🙂

    Craig, GA and the other JavaScript-based ones cannot see crawlers at all, because crawlers can’t process JavaScript. Only actual browsers register with them.

  • Glenn Batten
    Posted September 13, 2010 at 5:55 pm

    What about stats that are just made up by the PR/marketing departments?? Let’s not forget the term “Property Seekers”, which the PR person at Realestate.com.au said was created to dumb down the stats, yet the Nielsen rep came onto the site and said

    “I can

  • PaulD
    Posted September 13, 2010 at 7:29 pm

    Glenn, at 6.0 million, that represents approx 36% of the Australian public of “buyable” age. All the numbers I’ve ever seen would say that 36% of the public actively looking for real estate at any time would be 7 or 8 (perhaps more) times the real number. At 1.9 million, that still represents over 11%, which is still high but much more believable, because you get a crossover of people who are just starting to look with people who have been looking for some time and are just about to buy. The amazing thing was that REA used to call them “Property Seekers” – all 6 million of them. I think I saw somewhere that between 400,000 and 500,000 residential properties are sold each year in this country. If there were 1.9 million people looking, then at that rate only around 25% of them would end up buying a property.

    Based on the June quarter in NSW, there were over 52,000 houses and units sold. That represents around 200,000 annualised, and it is not hard to see that the national total will be around 450,000–500,000.

    I can’t recall crowds of people getting agitated because they were among the 75% who failed to buy a property each year. If that were the case, I am sure you would see some reporting of it. The numbers are fairly compelling and support your argument, whereas at the other end of the scale – not the slightest chance!!!!

  • Glenn Batten
    Posted September 13, 2010 at 8:47 pm

    PaulD

    Remember that 1.9 million is a monthly figure, so it should relate to just a monthly sales number… so all those people are looking at just 50,000 sales.

    That means that under 3% of all those people looking this month will actually buy. Next month there are 1.9 million looking again, and just 50,000 sales.

    Now, many of the 1.9 million looking in January will not be in the 1.9 million looking in September… but we have no idea how many are yearly uniques. The other thing to consider is that the 1.9 million will also include rentals. So whilst we don’t have enough info to state yearly figures accurately, I do have enough to know, IMHO, that realestate.com.au’s Property Seeker stats are full of rubbish.

  • PaulD
    Posted September 14, 2010 at 8:59 am

    Yes Glenn, I have been thinking that since we started using REA in 2003.
    It still amuses me when we have 3,000 people get “engaged” with a listing and NONE of them makes an enquiry, and as I have mentioned before, in 2003 it was almost a 2% enquiry rate (i.e. 20 per thousand); it is now barely 1 per thousand.

  • IT Consulting
    Posted September 14, 2010 at 9:41 am

    In my opinion Google Analytics is the best tool for measuring traffic, as it helped my company create a more effective site and increase the ROI on our marketing campaigns.

  • Greg Vincent
    Posted September 14, 2010 at 12:47 pm

    Glenn, it’s interesting to see that “LJ Hooker” and “Ray White” were two keywords that generated lots of traffic to their site, yet realestate.com.au didn’t appear on page 1 of the organic section of a search on either of these keywords. It looks like it’s all coming from AdWords.

  • Shane Dale
    Posted September 14, 2010 at 9:51 pm

    Frankly, the only statistic that counts is the end sale result – i.e. how many sales you can make from the effort.

    The numbers are useful mainly for measuring yourself, as a consistent metric, so you can see if you are going up or down – the actual numbers are always vague, and I have never relied on them as being real people. It’s simply proportional: if I double UBs this month, I assume I have doubled traffic, provided all the measuring systems are the same as last month.

    The question remains – would you rather have a million UBs with 1000 enquiries converted to 5 sales or 100 UBs with 50 enquiries and 10 sales?

    The issue is that REA’s charges were for a long time premised on traffic counts alone, when they should have been focused on enquiries per property per suburb. Even if they didn’t charge per enquiry, they should be focused on that figure. It’s the only meaningful one.

    That metric would easily show the agents what is effective and what isn’t.

    But having said that – getting a lot of enquiries but no sales could mean you simply haven’t put up enough information about the property, so buyers are forced to contact you, and after getting the enquiries the agent has wasted them by being overpriced or having a bad listing.

  • Charlie Gunningham
    Posted September 14, 2010 at 10:30 pm

    So is the conclusion that we don’t really know? And shouldn’t care? That we can’t measure accurately, and no one has come up with an accurate measurement?

    I certainly am sceptical of all web stats, and EXTREMELY sceptical of the big two sites using selectively published “statlets” (the bits they like), bolding them up in email blasts and ramming them down everyone’s throats like they’re Gospel. (There’s an underlying insecurity here that explains why they have to behave this way… but that’s another story.)

    So that’s why I prefer to ‘triangulate’ (like all good mariners, I get my info from three points) – but if these contradict each other, where are we? And shouldn’t we know?

    Shane – I 95% agree; the more important stat is sales/profit, etc. But I still think that after 20 years of the internet we should be able to measure visits to a web site with confidence, and have one metric the industry agrees on and we all use.

    I will continue to be frustrated!

  • Bill
    Posted September 19, 2010 at 8:06 pm

    Stats to one side for a moment if I may, I would like to know how clients perceive the REA and Domain brands. Do your clients request or demand to be listed on particular portals? I’m a non-agent and would be interested in knowing why agents list properties on a particular portal – is it because they work, or to cater to client perception?

  • Sal Espro
    Posted September 21, 2010 at 8:55 am

    Great question, Bill. Our experience is that we ‘construct’ the advertising and lead the client to it and act as the marketing expert. The client may mention either portal in discussions but there are very few who demand we go with one or the other or we won’t get their business.
    What do you say up in Mosman, Robert, where (from what I gather) you only use Domain?

    (PS I still haven’t heard of any client demanding REView.)

  • Vic
    Posted September 21, 2010 at 5:41 pm

    Bill and Sal Espro,

    Not the subject raised by Charlie… but this subject has the potential to thread to 200 and beat “You be the judge”.

    As a new portal owner I would love to get the perspective of other agents and also from consumers/sellers as to how a property gets chosen to go on a portal.
    Also what would be of interest is to know whether agents are proactive in promoting a “marketing package” which includes multiple listings. With the plethora of free to list and niche sites emerging, are agents using this as a prime selling method to gain new listings?

    For example, I have some agents who have told me that they use our portal as a means of gaining listings for sellers who have properties on or near the water. Clearly they still have their accounts with REA or Domain or both, but they are more and more looking to gain the greatest exposure for as little extra cost (or no extra cost) to the potential vendor, by going on as many free-to-list and niche portals as they can get their hands on. As the market tightens up it seems a smart way to go.

NetPoint Group