The Nielsen Families. A group of selected households (mostly in New England, I believe) used by Nielsen (the ratings folk) to give a picture of what the U.S. watches on television. Long thought to be a less than random selection of households, some would say a chosen few, the watching habits of these families influence what TV programmes get made, where the advertising dollars get spent and which series get a second run. Now Nielsen have announced (via Media Post) their intention to start monitoring the same families' internet usage. They want to provide a single-source measurement of viewing across both traditional TV and online media, which makes sense.

What they watch isn’t so interesting to me. I’d rather hear how their watching habits differ between TV and online, whether knowing that their internet use is now being monitored changes those habits (maybe moving them to watch more TV on the web and thus biasing Nielsen’s data), or whether it’s the younger or the older viewer who likes to watch TV online. That kind of demographic data on user behavior is much more interesting than which soap opera they watch… Still, at least Nielsen aren’t using these families to measure general internet usage; that’s still done at ISP level, where you get broader, more representative results (depending, of course, on the ISPs selected). At least I hope they aren’t.

Surely there must be a better way to measure television consumption in these digital days? Can’t they just measure TV viewing through cable, digital and satellite channels at the provider end, giving them a far less biased (depending on the provider) view of a large segment of the population’s television viewing? If decisions on the quality and popularity of programming, who to place adverts with and who deserves a second series are being made by TV execs, then surely the opinion of 600 homes in New England is not representative enough?

The best, most salient opinion piece on the Nielsen Families comes from one of my favourite authors, Robert Anton Wilson. Have a read of pages 59/60 below (just search for ‘Nielsen’ in the Scribd doc), just for jokes 🙂

View this document on Scribd

I’ve never really looked at the bit.ly stats you get for every link you shorten using the service (handy tip: add a + to the end of any shortened bit.ly URL to see its stats). So this evening I did a little experiment. I needed to create four shortened links, each directing users to a different page. Each page has Google Analytics tracking installed along with another well-known analytics package (not free). To each link I also added a click-tracking parameter for the ‘not free’ analytics package. All four links were distributed only via tweets on a Twitter account (not my personal one, and not a well-followed one, which is why the numbers are low, but it suited the purpose), using the Twitter web interface. So at the end of this little test I should have three sets of figures: one from bit.ly, one from Google Analytics (counting referrals from Twitter to each page) and a third from the other analytics package.
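For illustration only, here’s a minimal TypeScript sketch of the sort of link tagging I mean; the clicktag parameter name and the example URLs are invented for the example, and the real parameter would depend on whatever the analytics package expects:

```typescript
// Sketch only: tag a destination URL with a click-tracking parameter before
// shortening it, and build the public bit.ly stats URL by appending "+".
// The "clicktag" parameter name is hypothetical; the real one depends on the
// analytics package.

function tagForTracking(destination: string, campaign: string): string {
  const url = new URL(destination);
  url.searchParams.set("clicktag", campaign); // hypothetical parameter name
  return url.toString();
}

function bitlyStatsUrl(shortUrl: string): string {
  // Adding "+" to any bit.ly short URL shows its public stats page.
  return `${shortUrl}+`;
}

// Placeholder URLs, just to show the shape of the output:
console.log(tagForTracking("https://example.com/page-1", "twitter-test-1"));
// -> https://example.com/page-1?clicktag=twitter-test-1
console.log(bitlyStatsUrl("http://bit.ly/abc123"));
// -> http://bit.ly/abc123+
```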

The results:

         bit.ly   Google Analytics   Other analytics
Link 1        5                 10                11
Link 2        8                 16                18
Link 3        2                 12                12
Link 4        1                 20                24

There’s nothing particularly scientific about this; I just checked the bit.ly stats and noticed the discrepancy. Both Google Analytics and the other stats package use cookies to track users, while bit.ly (I assume) can count clicks on its shortened URLs directly on its own servers. There may be a good reason for the difference (perhaps bit.ly has been having some server issues today), but it’s enough to make me think twice about trusting bit.ly.

I’ve always used bit.ly for shortening links, for no reason other than that it’s conveniently embedded in tools such as Tweetdeck, but I may try another of the multitude of services that do the same job. Obviously bit.ly shouldn’t be relied on for your main traffic analytics (it’s not their main business), but shouldn’t we expect its stats to be more accurate, or at least closer to other analytics packages? These are low-traffic links; how wide would the gap be on high-traffic ones? I’d be interested to hear what others think, especially if you know of a reason for the difference.

Hat tip to Josh at Read Write Web for his write-up regarding this link, which I’d never come across before. It’s a demo of an analytics tool aimed at Web 2.0 and AJAX websites.

With the death of the page view as the analyst’s all-important metric, a need has emerged to measure user engagement with a website rather than just how many pages were viewed.

The rise of AJAX has been a major driver of this, with whole websites sometimes consisting of a single screen that makes many calls to databases and servers in order to refresh itself multiple times during a user’s visit, devaluing the page view completely.

The demo shows a novel way to gauge a user’s engagement by measuring how long segments of the page stay in the browser’s viewing pane. This is far from perfect, but it’s a sign of how analytics tools will have to work in the future as websites get harder to measure and marketeers and management get more demanding in their hunt for data to help them understand their users.
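As a rough idea of the technique (not the tool in the demo), here’s a sketch using the browser’s IntersectionObserver API to accumulate how long each page segment is visible; the “.segment” selector and the idea of reporting the totals on unload are assumptions for the example:

```typescript
// Rough sketch: track how long each page segment stays in the browser's
// viewing pane. ".segment" is a placeholder selector for the blocks of interest.

const visibleSince = new Map<Element, number>();
const totalVisibleMs = new Map<Element, number>();

const observer = new IntersectionObserver((entries) => {
  const now = performance.now();
  for (const entry of entries) {
    if (entry.isIntersecting) {
      // Segment scrolled into view: start the clock.
      visibleSince.set(entry.target, now);
    } else if (visibleSince.has(entry.target)) {
      // Segment left the viewport: accumulate the time it was visible.
      const start = visibleSince.get(entry.target)!;
      totalVisibleMs.set(entry.target, (totalVisibleMs.get(entry.target) ?? 0) + (now - start));
      visibleSince.delete(entry.target);
    }
  }
});

document.querySelectorAll(".segment").forEach((el) => observer.observe(el));

// The totals in totalVisibleMs could then be reported to an analytics endpoint,
// for example on page unload; that part is omitted here.
```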

Also really interesting is the demo of a tool to measure user engagement with a banner advert. I can’t wait until metrics like this exist, as they may help marketers see that throwing money into display advertising is no longer the way forward.

What I’d really like to see is mouse interaction data on pages as well. It is surely possible to collect the X and Y coordinates, and they’re a good hint as to which area of the screen a user is actually focused on (users tend to hover the mouse over what interests them). It’s great to know that the item you’re interested in is within view, but how do you know that users are actually looking at it? Short of eye tracking becoming a de facto feature of PCs, we may never answer questions like that!
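As a sketch of what collecting that data might look like (the 50px grid, the sampling rate and the /collect/mouse-heatmap endpoint are all arbitrary assumptions):

```typescript
// Rough sketch: sample mouse coordinates as a hint of where attention sits.
// Positions are throttled and bucketed into a coarse grid so the payload stays small.

const BUCKET_PX = 50;                       // arbitrary grid size
const samples: Record<string, number> = {}; // "gridX,gridY" -> hit count
let lastSample = 0;

document.addEventListener("mousemove", (e: MouseEvent) => {
  const now = Date.now();
  if (now - lastSample < 200) return; // throttle to roughly 5 samples per second
  lastSample = now;

  const key = `${Math.floor(e.pageX / BUCKET_PX)},${Math.floor(e.pageY / BUCKET_PX)}`;
  samples[key] = (samples[key] ?? 0) + 1;
});

window.addEventListener("beforeunload", () => {
  // Hypothetical collection endpoint; a real tool would batch and send periodically.
  navigator.sendBeacon("/collect/mouse-heatmap", JSON.stringify(samples));
});
```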

Compete.com have released some stats on their blog today listing the top 50 website domains by unique visitors. It makes for some interesting reading:
Yahoo is still the biggest domain in terms of unique users. That’s not surprising given their huge coverage, but surely they have to come up with a way to make a success of all those eyeballs? They may lose out to Google in search, but with such vast web real estate, finding a way to leverage it is key for them. Google coming second, however, is amazing considering their core is still search!

Facebook at number 21 is a bit of a surprise; I’d assumed they’d be higher given the buzz, but perhaps they’ll place much higher next year (if their bubble doesn’t burst).

The growth figures in the blog post are the most interesting, showing sites such as YouTube, Flickr and Digg as some of the biggest gainers (bigger even than Facebook). This certainly is the time of sharing content, something Facebook has yet to get right (they started off well, but it has lately dissolved into MySpace-esque profile vanity).

Adult dating is still a major growth area, it would seem; the person who launches a Facebook for this domain will win big!

Of the losers, the most interesting to me are the losses experienced by Expedia. This can only be down to the emergence of much better sites that give users more intuitive ways to search for flight and hotel availability. Online travel is much more competitive in that arena this year, and with tour operators moving to embrace dynamic packaging, I can only see Expedia losing more eyeballs if they don’t make some significant functionality changes soon.

Nielsen/NetRatings has announced it will lower the weighting given to rankings based on the longtime industry yardstick of page views and begin tracking how long visitors spend at websites. It has added both ‘Total Minutes’ and ‘Total Sessions’ metrics to NetView, its syndicated internet audience measurement service.

This is to counter the problems their measurements encounter with rich applications built with AJAX and the like, which allow users to interact with websites and services without reloading pages.
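To make the idea concrete, here’s one way ‘Total Minutes’ might be approximated client-side for an AJAX-heavy site that never reloads the page; the 15-second heartbeat, the 60-second idle threshold and the /analytics/heartbeat endpoint are assumptions, not how NetView actually works:

```typescript
// Sketch: approximate engaged time by sending a heartbeat while the tab is
// visible and the user has been recently active. Nothing here reflects how
// Nielsen's NetView actually measures "Total Minutes".

let lastActivity = Date.now();
["mousemove", "keydown", "scroll", "click"].forEach((evt) =>
  document.addEventListener(evt, () => { lastActivity = Date.now(); })
);

setInterval(() => {
  const recentlyActive = Date.now() - lastActivity < 60_000; // activity within the last minute
  if (!document.hidden && recentlyActive) {
    // Each beacon represents another 15 seconds of engaged time.
    navigator.sendBeacon("/analytics/heartbeat", JSON.stringify({ intervalSeconds: 15 }));
  }
}, 15_000);
```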

This has been coming for a while, and I look forward to seeing the shake-up in the rankings based on the new metrics. I can think of a few sites that should benefit (due to users interacting with single pages for some time) and a few that should suffer (due to users clicking around blindly and then leaving).

The new metrics

June 5, 2007

Nielsen Ratings is soon to release its new metrics for websites. This is in response to the concern over using just pageviews (which have been the metric of choice for advertisers up to now). With developments such as AJAX, pageviews are not as relevant as they once were.

‘Total’ web statistics will include measures such as the total number of minutes users spend on a site, pageviews and more.

I’m hoping this will also include interaction with web pages, as that is the only true way (that I can think of) to measure the usage of AJAX applications, where the screen refreshes without loading a new page.
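One way to get at that interaction data (purely a sketch: the /analytics/interaction endpoint is invented, and a real tool would be far more selective about which requests count) is to treat each background request an AJAX app makes as a ‘virtual pageview’:

```typescript
// Sketch: count background requests made by an AJAX application as "virtual
// pageviews". The reporting endpoint is hypothetical, and counting every
// successful fetch is a crude stand-in for measuring meaningful interactions.

const originalFetch = window.fetch.bind(window);
let interactionCount = 0;

window.fetch = async (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
  const response = await originalFetch(input, init);
  if (response.ok) {
    interactionCount += 1;
    navigator.sendBeacon("/analytics/interaction", JSON.stringify({ count: interactionCount }));
  }
  return response;
};
```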