
February 27, 2007

The mysteries of measuring marketing response, part 2: Landing pages

For part 2 of this n-part series on marketing measurement techniques, we're turning our attention to the methods employed by web analytics tools to capture the source of an inbound click and turn that into a report about whether marketing is working for you.

Bye bye, Referrer

Back in simpler, more innocent times (i.e. up until about five years ago), you could get a pretty good idea of where your traffic was coming from by looking at Referrer data in your web analytics tool. (For some reason lost in the mists of time, web browsers always - or at least always used to - report the URL of the page they were on whenever they requested a new URL; that previous URL is known as the "Referrer".) This information can still be interesting to look at, but its quality has degraded terribly over the years, for a variety of reasons, the main ones being:

  • Many browsers now block sending Referrer information, considering it an invasion of privacy
  • Certain types of server redirect don't pass on Referrer information
  • Many marketing systems redirect traffic through gobbledygook URLs from which you can't extract any useful information

Hello, landing pages

So Referrer data has really fallen by the wayside and has been largely replaced by a new technique, known as landing pages. The principle behind this method is that you create a unique page on your site for each marketing campaign you're running - or even for each element of each campaign. The key thing to ensure is that only your marketing directs traffic to those pages - they're not linked from anywhere else either inside or outside your site. So when you come to analyze your traffic data, you know that if you see page views for those pages, people must have come to your site via the marketing you're doing.

It's not as onerous as it sounds to create unique landing pages for each campaign you're running, because it's actually only the URL for the page that has to be unique, not the page itself. Almost all web servers are perfectly happy for you to append dummy parameters to the end of a URL (as long as you still have a valid URL from a syntax point of view) and will ignore the parameters they don't recognize.

For example, the URL for the home page for mirrormirror (my wife's e-commerce site) is

www.mirrormirrorontheweb.co.uk/main.asp

but it's perfectly valid to include a dummy 'src' parameter, as below:

www.mirrormirrorontheweb.co.uk/main.asp?src=iansblog

Clicking on either link above will take you to the home page. But when the data is analyzed, the "src=iansblog" part will identify the clicks that came from this blog.
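
To make this concrete, here's a minimal sketch (in Python, against a made-up list of logged URLs - the log format is hypothetical) of how an analysis step might pull the 'src' parameter out of landing page requests and tally clicks by source:

from urllib.parse import urlparse, parse_qs
from collections import Counter

# Hypothetical sample of requested URLs pulled from a web server log
logged_urls = [
    "http://www.mirrormirrorontheweb.co.uk/main.asp?src=iansblog",
    "http://www.mirrormirrorontheweb.co.uk/main.asp?src=iansblog",
    "http://www.mirrormirrorontheweb.co.uk/main.asp",  # direct visit, no tag
]

clicks_by_source = Counter()
for url in logged_urls:
    params = parse_qs(urlparse(url).query)
    # parse_qs returns a list of values per parameter name
    source = params.get("src", ["(untagged)"])[0]
    clicks_by_source[source] += 1

print(clicks_by_source)  # Counter({'iansblog': 2, '(untagged)': 1})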

It's perfectly permissible to add more than one dummy URL parameter to a landing page to identify more than one attribute of a campaign, as in the following example:

www.[site].com/main.asp?src=search&pub=google&kg=widgets&kw=blue+widgets

Here, the src, pub, kg and kw parameters identify the source (Search), publisher (Google), keyword group ("Widgets") and keyword ("blue widgets") of the particular click in question. Which parameters you choose is up to you, though your web analytics tool may specify that it will only extract parameters with certain names.
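As a sketch of the analysis side, here's how those attributes might be recovered from a click-through URL (the parameter names are just the ones from the example above, and the domain is a placeholder - your own tool may mandate different names):

from urllib.parse import urlparse, parse_qs

url = "http://www.example.com/main.asp?src=search&pub=google&kg=widgets&kw=blue+widgets"

params = parse_qs(urlparse(url).query)
click = {
    "source":        params.get("src", [None])[0],
    "publisher":     params.get("pub", [None])[0],
    "keyword_group": params.get("kg", [None])[0],
    "keyword":       params.get("kw", [None])[0],  # '+' decodes to a space
}
print(click)
# {'source': 'search', 'publisher': 'google',
#  'keyword_group': 'widgets', 'keyword': 'blue widgets'}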

If you have free rein over which parameters to add, though, how do you choose? That's where you need a taxonomy.

Taxonomy, schmaxonomy

Once you get the hang of 'tagging' your landing page URLs, you can apply the principle to all the marketing you're doing - at least, all the marketing where you have control over the landing page URLs (some notable exceptions are organic search and affiliate marketing). If you apply the dummy parameters in a consistent hierarchical structure, or taxonomy, you can then compare the performance of different elements of your overall marketing mix side by side much more easily.

Let's use an example to illustrate. Say you're doing paid search marketing, e-mail marketing, and are running some banner ads. You want to create a categorization hierarchy (the taxonomy) that you can use to organize all the elements of these marketing channels. So you might use the following hierarchy:

Channel
      Campaign
            Placement

"Channel" refers to the marketing channel - in this case, Search, E-mail, or Display Ads.

"Campaign" refers to a grouping of marketing activity, such as a collection of keywords on a particular search engine, or a particular e-mail run-out, or a banner campaign. 

"Placement" is an online ad industry term that refers, broadly, to the location of the ad. But it can be applied to describe the "location" of any clickable marketing link, such as a link within an e-mail, or a particular search keyword.

So the trick is to pick values for these categories which make sense across the different kinds of marketing you're doing. In our example, a Search taxonomy might be (the parts in square brackets are just to remind you which bit of the taxonomy is which):

Search [channel]
      Google general widgets [campaign]
            widgets [placement]

Here you can see that the placement is really the keyword. For E-mail, the structure would look like:

E-mail [channel]
      Widget promotion mail 2-27-07 [campaign]
            Blue widget picture link [placement]

Finally, the structure would look like this when applied to display ads:

Display Ads [channel]
      Spring widget promotion - hobbyist sites [campaign]
            youandyourwidget.com homepage 468x60 [placement]

Because the categorization is used consistently across the different types of marketing, you can now compare these channels, campaigns or individual placements against one another in a meaningful way.
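
To show how little work the 'instrumentation' need be, here's a minimal sketch that generates consistently tagged landing page URLs from taxonomy values (the base URL and the 'ch'/'cp'/'pl' parameter names are purely illustrative - nothing here is mandated by any particular tool):

from urllib.parse import urlencode

BASE_URL = "http://www.example.com/main.asp"  # hypothetical destination page

def tagged_url(channel, campaign, placement):
    """Append channel/campaign/placement as dummy query parameters."""
    return BASE_URL + "?" + urlencode({
        "ch": channel,
        "cp": campaign,
        "pl": placement,
    })

# One tagged URL per marketing element, all sharing the same taxonomy
print(tagged_url("email", "widget promo 2-27-07", "blue widget picture link"))
print(tagged_url("display", "spring widget promo", "youandyourwidget.com 468x60"))

Because urlencode escapes spaces and punctuation for you, the taxonomy values can stay human-readable.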

Of course, you could add at least one or even two more layers to this hierarchy (four levels in total seems to be a generally useful number), but the more you have, the more onerous your instrumentation task is going to be.

Generating an overarching taxonomy for your marketing can be a little challenging, due to the diverse nature of the different marketing channels, but it is worth it. Some web analytics tools make it a bit easier by allowing you to define the taxonomy within the tool (usually not to more than one or two levels) and then generating the dummy landing page parameters for you (it's still up to you to put them into your marketing click-through URLs, mind you). But many web analytics tools fight shy of enforcing an overarching taxonomy, introducing channel-specific categories (such as keyword, or ad creative size) at the lower levels. This makes those tools more usable (certainly not to be sniffed at), but it does make multi-channel comparative marketing analysis more difficult.

More to come...

That's enough for this week. In the next installment, we'll look at the methods web analytics tools use to allocate marketing response to conversion, comparing and contrasting in-session conversion allocation with multi-session conversion allocation.


February 26, 2007

Pinged

The inestimable Avinash Kaushik has done me the favor of sending more traffic the way of my blog, by excerpting some answers he gave to Jeff Lawrence of Sonicko Consulting in a recent interview on Jeff's blog. In his answers he lists four things he'd like to see in Gatineau, our forthcoming web analytics tool.

It's a little early in the development process for me to comment directly on Avinash's wishlist, but it's nice (and a little nerve-wracking) to see that anticipation is already building for our entry into this market. Hopefully I should be able to reveal a little more about our plans fairly soon. But in the meantime, if there's anything you'd like to see in the tool, feel free to leave a comment, or contact me through my About page. As Avinash says in his post, our plans are already pretty well advanced; but you will be able to influence the product as it evolves.


February 19, 2007

The mysteries of measuring marketing response, part 1: Delivery system-based counting

It goes without saying that measuring the effectiveness of online marketing is web analytics' #1 'killer app'. But how realistic a picture of the value of online marketing can a web analytics tool deliver? Come to that, is there any such thing as a true picture of marketing effectiveness?

The short answer to the above question is no. Depending on the measurement system you use, and the counting/reconciliation methodology you use, you can get pretty much any picture of marketing response that you want - and plenty you don't. Today's post is the first of a series which will combine to provide a short(ish) field guide to the more common counting methodologies you'll find. Ask your vendor which one they use, and why.

 

Delivery system-based counting

The simplest way to measure the impact of your online marketing is to let the system that's delivering the marketing do it for you. Examples of such systems include Google Adwords for paid search, Atlas for online ad-serving, or Constant Contact for e-mail.

Technically, this solution usually involves a 'click redirect'; when the user clicks on a banner ad, or a paid search link, or a link in an e-mail, their browser is actually directed to a long and complicated URL on a redirection server, which automatically redirects them to the actual destination URL, but not before making a note of the fact (i.e. recording the click).
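As an illustration of the principle only (a real ad server is vastly more elaborate), here's a minimal click-redirect endpoint sketched with Python's standard library; the 'ad' and 'dest' parameter names are invented for the example:

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class ClickRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        ad_id = params.get("ad", ["unknown"])[0]
        destination = params.get("dest", ["http://www.example.com/"])[0]

        # "Making a note of the fact": a real system writes this to a database
        print("click recorded for ad " + ad_id)

        # Then bounce the browser on to the actual destination URL
        self.send_response(302)
        self.send_header("Location", destination)
        self.end_headers()

HTTPServer(("", 8080), ClickRedirectHandler).serve_forever()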

Since they're also delivering the marketing (i.e. showing the ads, or sending the e-mails), these systems can also report on how many times the ad was shown or the e-mail sent, and also the reach of the marketing, i.e. how many people saw it in a given time period. They can also report on how much it cost; indeed, these measurement systems are used in the billing systems of pay-per-click networks like Google Adwords.

A key enhancement to this method of counting is to capture 'events' (usually a specific page being requested) on the 'destination' website (i.e. your website) and correlate these back to the original marketing. The method used here is to place tag code (sometimes known as a 'spotlight' tag, a term coined by DoubleClick) on key pages of the destination site; this code sends information back to the marketing system about the fact that (for example) a purchase was made. In an advanced version of this, the value of purchases can be sent back too.

The 'conversion' event is linked back to the original ad delivery/click by means of a third-party cookie, and correlated over some kind of time window, such as 30 days (i.e. if a conversion event occurs within 30 days of a click from the same user, that conversion is allocated to the bit of marketing that drove the click).
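
A sketch of that correlation step, assuming (hypothetically) that clicks and conversions have already been logged against the same third-party cookie ID:

from datetime import datetime, timedelta

ATTRIBUTION_WINDOW = timedelta(days=30)

# Hypothetical logs keyed by third-party cookie ID
clicks = {"cookie-abc": datetime(2007, 2, 1)}                   # ad clicked Feb 1
conversions = [("cookie-abc", datetime(2007, 2, 14), 1000.00)]  # purchase Feb 14

for cookie_id, converted_at, value in conversions:
    clicked_at = clicks.get(cookie_id)
    if clicked_at and converted_at - clicked_at <= ATTRIBUTION_WINDOW:
        # Conversion falls inside the window: credit the marketing click
        print("$%.2f conversion attributed to the click on %s"
              % (value, clicked_at.date()))
    else:
        print("conversion not attributable to a tracked click")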

So a full implementation of this kind of counting system could yield the following information in a report:

              Impressions   Clicks   Cost      Purchases (#)   Purchases ($)   ROI (%)¹
Paid Search   1,000,000     10,000   $10,000   200             $40,000         400%

¹ This ROI figure doesn't take into account the cost of the goods sold, so it isn't a true ROI (here it's simply purchase revenue divided by marketing cost: $40,000 / $10,000 = 400%), but it's the closest that most such systems get.

Limitations/shortcomings

The main limitation of this method of counting springs from the same source as its strength: it is delivery system-centric. So if you're using, say, three different kinds of marketing (as in the example above), you'll get three different sets of reports on how it's working, which you'll have to compare yourself to get a picture of which marketing is working best (easier said than done).

This task is made even harder by the fact that each system wants to attribute as many of your site's conversions as possible to its own marketing. This leads to multiple systems claiming credit for the same conversion.

To understand how this happens, consider the following example: a user clicks on a paid search ad, and goes to a site, where they sign up for a newsletter. Two weeks later, they receive the newsletter, click on one of the links, and spend $1,000 on the site. Because the conversion is within 30 days of the original paid search ad click, the paid search system claims credit for the conversion; but because the conversion also occurred shortly after a click on an e-mail link, the e-mail system claims credit too.

Who to believe? Clearly both elements had some impact on the propensity to convert, but neither individual system is going to admit that, because that would mean giving away some of the value of the conversion, and reporting a lower ROI.

You can't solve this problem with delivery system reporting - you have to use web analytics on your site itself to solve this. We'll be exploring this thread in more detail in the next couple of posts in this series.

Another limitation of this counting system is that the number of clicks reported by the delivery system is always higher (usually by about 10%) than the number of inbound arrivals at the destination site. The reason for this is that the Internet is an unreliable place, and so are users' computers; between clicking the link and arriving at the destination site there are a whole bunch of things that can go wrong, such as the user's Internet connection going down, or the user (from where they're sitting on the Internet) just not being able to see the destination site. So the delivery system measures the click, but the redirection never winds up sending the user to the destination site. So yes, if you're paying per click for ads, you're overpaying by about 10%. But so is everyone else, so get over it.

Finally, this kind of system is vulnerable to the vicissitudes of third-party cookies, which are hardly the most popular kid on the block these days. If the user flushes his or her cookies between the original ad click and the eventual conversion, the conversion cannot be correlated back to the click.

Whether to trust the data

The net of this is that you can trust the delivery (impressions) and click information in a report from your marketing delivery system vendor, but you should take the rest with a healthy pinch of salt. Conversion counts in particular will be over-estimated; you should probably discount these figures by around 20%, though this figure depends entirely on the mix of marketing that you're doing (if you are only doing one kind of marketing, the figures will be more accurate).

If you have a web analytics solution deployed against your website, make sure it's measuring marketing response too (more on this in the next post in this series), and compare the two to get something of a reality check.


February 15, 2007

One bad apple

A colleague brought to my attention the dubious practices of LogStats.de, a German provider of free web analytics. LogStats is a typical teeny-tiny provider of free web stats, using a JavaScript-based tag for data collection. Free web stats is a pretty thin business to be in these days, what with behemoths like Google and us charging about (or about to charge about, in our case) in the market - so how does LogStats pay the bills?

It turns out that the HTML code segment that LogStats distributes contains a little something extra. Can you spot what it is in the code below? (thanks to Google Blogoscoped for this code):

<!-- Logstats Counter Code -->
<script language="JavaScript" type="text/javascript" src="http://www.logstats.de/pphlogger.js.php?id=...">
</script>
<noscript>
<img src="http://www.logstats.de/pphlogger.php?id=...">
<a href="http://www.artelight.de">Leuchten</a>
</noscript>
<!-- Logstats Counter Code -->

Don't see anything unusual? Go to the back of the class. What, precisely, is that link on the word "Leuchten" (German for "Lamps") doing in the <noscript> section? Well, the website linked to - Artelight.de - is owned by the same guy, Marcin Nolte, who owns LogStats.de. So everyone who implements this tag code is giving Artelight a free link - on every page.

That's going to be pretty good for Artelight's Google rankings, and indeed they rank #1 in Germany for the terms "Leuchten" and "Lampen" (another word for "Lamps"). Logstats claims to have about 9,500 customers, so that's a lot of back-links. But it's pretty sneaky.

You could argue that Logstats/Artelight are doing nothing more evil than gaming Google's PageRank algorithm, and all power to them. After all, apart from consuming a tiny amount of extra bandwidth on their clients' sites, neither their clients nor those clients' visitors are coming to any harm whatsoever. And you could argue that these companies need to get something back for providing a free web analytics package.

But in an era when web analytics and online marketing are viewed with considerable suspicion, this kind of behavior is unhelpful, to say the least. There are rumors that other small web analytics firms are engaged in this practice, too, which is also rather worrying (the only one I've been able to confirm is blogcounter.de which seems to do something similar). The problem with this kind of thing is that it is grist to the mill for anyone who wants to throw mud at the online marketing and web analytics industries and paint them as enemies of privacy. One bad apple spoils the whole damned barrel.

[Thanks again to Google Blogoscoped for much of the detail of this post]


TouchClarity falls into the gaping Omniture maw

Ok, forgive the hyperbolic headline, but Omniture's proving insatiable of late. Last month it gobbled up Instadia; yesterday it announced it was acquiring Touch Clarity, a UK-based provider of behaviorally-targeted content delivery systems.

In contrast to the Instadia acquisition, Omniture's press release on the Touch Clarity deal focuses on the complementary nature of the two companies' offerings. Touch Clarity is already a partner in Omniture's Genesis program, using their universal tag for data collection, so this seems like a no-brainer - the automatic content optimization offered by Touch Clarity will take a lot of the leg-work out of turning web analytics insights into changes to content.

Touch Clarity customers can expect to receive sales calls from Omniture sales folk; but more than this, Omniture customers (particularly publishers) will be encouraged to add Touch Clarity's services to the portfolio of stuff they're getting from Omniture.

Anyway, congratulations to the folks at Touch Clarity. I met them several times before I left London, and they were always doing great work.


February 12, 2007

What's a third-party cookie?

You might imagine that after seven years in the web analytics industry I would have worked out what a third-party cookie was. But it turns out that my thinking on this is fuzzy (like so much in my life), or at least incomplete. Let me explain.

When asked what a third-party cookie is, most people will say something along the lines of the Wikipedia definition:

“Images or other objects contained in a Web page may reside in servers different from the one holding the page. In order to show such a page, the browser downloads all these objects, possibly receiving cookies. These cookies are called third-party cookies if the server sending them is located outside the domain of the Web page.”

So far, so good. But there's an edge case, of interest to a small number of relatively influential companies (that is, Microsoft, Google, Yahoo! and a few others) which raises a question mark over this definition. This is the case where the cookie in question was originally set as a first-party cookie (e.g. from google.com), but is subsequently read in a 'third-party' context.

The reason this would happen is that the owner of the cookie might be using that cookie as a key to behavior or profile data, and might strike a partnership with a third-party site - for example, to serve advertising into it. They might then want to read the cookie of a user visiting that third-party site in order to serve him or her targeted ads (or even to do more 'benign' things like frequency capping).

So at this point, is the cookie in question a third-party cookie? The language in the Wikipedia entry would seem to indicate not. But if not, what sort of cookie is it? A couple of other definitions seem to corroborate the Wikipedia definition:

"Third-party cookies are created by a Web site other than the one you are currently visiting; for example, by a third-party advertiser on that site" - Computing Dictionary

"Third-party cookies come from other websites' advertisements (such as pop-up or banner ads) on the website that you're viewing. Websites might use these cookies to track your web use for marketing purposes" - Internet Explorer 7 help

But then a widely-quoted definition from, ahem, us, takes a different tack:

"A third-party cookie either originates on or is sent to a Web site different from the one you are currently viewing" - Microsoft Windows XP Product Documentation

Now you might think this is just so much cookie-related navel-gazing. But the NAI (Network Advertising Initiative) is currently in the process of putting together some 'best practice' guidelines for the use of cookies, and the definition of first-party vs. third-party cookies makes a big difference to the obligations imposed upon signatories to the guidelines.

The edge case only really applies to companies who can build up a significant base of first-party cookie relationships with users and who are then in a position to leverage this base with third parties - hence the list of big sites mentioned earlier. But I think it raises an interesting question about portability of identity - is it better for users to have their Google/MSN/Yahoo IDs re-used on third-party sites for profiling, or for entirely unknown third-party networks (say, Atlas or DoubleClick) to be aggregating this data? At least in the former case the user has heard of the organization in question. What do you think?


February 08, 2007

Another thing I wanted to ask you...

Further to my previous poll question, here's another. Overlay reports: an innovative way of visualizing traffic data, or pointless eye-candy with no analytical usefulness? You be the judge. Results once I have enough responses not to be too embarrassed.


[Poll embedded here via dPolls.com]

If you have other thoughts about overlay reports, I'd love to hear them - comment away.


And the winner is...

My fragile ego is salved by the news that a full sixty-two people responded to my poll question about whether power or simplicity was more important in a web analytics product. Here are the results:


[Poll results embedded here via dPolls.com]

So the power-hungry outnumbered the simpletons by nearly two to one. I think this reflects the relatively sophisticated nature of the readership of this blog.

The balance between these two forces is something that is exercising our minds here; too much power (at the expense of simplicity) and we risk alienating (or at least confusing) the very audience that Gatineau is aimed at; but you need some analytical oomph to extract the really useful learnings.

Also, if you have a bit more depth to your product, you can encourage third parties like consultants and SEM agencies to invest their own time to learn the tool and use it to benefit their clients; if the software is so darn simple that a monkey could use it, there's no niche for them to occupy. Oh, it's all so confusing.

See the next post for a new poll question.


February 02, 2007

What do small businesses need from web analytics?

If you read this blog regularly, you may know that my wife runs an online retail business in the UK called mirrormirror (since Valentine's Day is coming up, and I know that the overwhelming majority of my readers are male, this may be a good time to go over there and earn some brownie points with your loved one; but that's not the point of this post). I help with various aspects of the business in my free time - specifically, anything involving numbers or HTML tags.

One of the reasons I'm happy to help out (as I sit hacking away at HTML in the wee small hours) is because mirrormirror is slap-bang in the middle of the target market for our 'Gatineau' web analytics service. So whenever I'm thinking about whether a product feature would be useful for our audience, I think not only, "would mirrormirror benefit from this?" but also "could mirrormirror benefit from this?"

The latter question is a very useful filter. On several occasions since arriving here, I've had conversations along the lines of:

Me: "So how are our customers going to instrument their sites?"
Interlocutor: "They'll get their IT department to do it."
Me: "These people don't have IT departments, dummy."

So I thought I'd share some of the rules-of-thumb that I use when thinking about small businesses (I used to work for one as my day-job, too, so I feel moderately qualified to generalize) that are guiding my thinking about the feature set for Gatineau.

 

1. No money
Small businesses have no money. Well, ok, maybe they do have some money, but it's already spoken for several times over. Spending money on measuring stuff comes at the bottom of a long list which is headed by items like paying the rent, employees' salaries, paying other suppliers of absolutely essential stuff (e.g. stock), taxes, and so on. Towards the bottom of the list are 'marketing' and 'making changes to the website'. Measuring the effectiveness of the marketing, or the changes to the website, comes even lower.

That's why, of course, a free web analytics tool is so appealing. But actually, having no money is the least of our worries. No, the real problem for small businesses is no time.

2. No time
Small businesses have no time. It's basically the same problem as money - what time there is is carved up between a number of important things like finding customers, selling them stuff, delivering the stuff to them, and doing admin.

But time is more difficult to carve up than money. If you say to a small business owner, "this will only cost you $50 a month", they may be able to slice off a sliver of their earnings for that month and give you $50. But it's not so simple to say, "this will only take 5 minutes of your time a month", because you can't carve time up that small, unless the thing you're asking the person to do for 5 minutes a month is so darn simple that they can remember how to do it in the intervening 30 days, 23 hours and 55 minutes.

Unfortunately, web analytics doesn't fall into this category of tasks - it doesn't even come close. So actually, having no time is not the real problem. No, the real problem is no expertise.

3. No expertise
My wife is (as I am fond of reminding her) extremely lucky - she lives with one of the world's foremost web analytics authorities, so she benefits from a level of expertise in this field that others can only dream of. But most small businesses are not that lucky. It's very hard to maintain perspective on just how hard (or at least, unfamiliar) web analytics is to someone who doesn't think about it every day.

The lack of time is, of course, a root cause of the lack of expertise, but another thing that we data geeks need to bear in mind is that there are some people in the world who (whisper it) don't find this stuff as fascinating as we do. So whilst my wife is keen to learn how people clicked through the latest e-mail campaign (and will prod me with a stick until I laboriously add all the relevant tracking tags to the e-mail HTML), she's less interested in the growth of Firefox vs Internet Explorer - at least, until it becomes apparent that there's a Firefox-only bug with the site, when the fact that 20% of the site's visitors are using it suddenly becomes terribly important.

 

So what does this mean for any aspiring multinational corporations who are hoping to give web analytics away for free to this market? To me, it means the following:

Keep it outcome-focused
Poverty-stricken, rushed-off-their-feet, stupid small business owners need technology to deliver them answers in a context that they can understand. So web analytics needs to mesh with the real-world concerns and needs that these people have. Top amongst these is understanding how marketing dollars could be better spent. Because so many small businesses now use paid search advertising, there's a growing awareness of some of the 'traffic' concepts around this activity, such as impressions and clicks (though we should not assume too much). Web analytics for small businesses should grow out of this context, rather than provide their own unique context and expect small businesses to learn that context and, more importantly, remember the context month-to-month.

Make it easy
I don't just mean the tool, here. I'm referring to the whole process of signing up, getting set up, looking at the data and drawing conclusions. Technology will help in some areas - for example, by automating instrumentation - but the rest is a people issue. Not only does the system have to be easy, it has to stay easy - you need to assume that the customer will climb the product learning curve almost every time they use the product. If that learning curve is steep, or even contains just one counter-intuitive step, the customer will get pissed off climbing it every time they sit down in front of the tool, and they'll stop using it.

Provide people
"Web analytics needs people" is such an old saw now that I even hesitate to mention it, but in the context of small businesses it means access to expertise in other, approachable, organizations who specialize in web analytics but don't say they specialize in web analytics - or at least, don't define themselves this way. Small businesses (and many big ones) aren't interested in web analytics - they're interested in selling more stuff online, paying less for advertising, or selling more ads with their content. Web analytics is just a means to an end. The sorts of people that small businesses need access to are people who understand how to use web analytics to help their customer achieve their goals. These people need to be almost as suspicious about web analytics as the customers they're serving, and understand that it only provides a piece of the picture.

Google has done a great job (whether by luck or by judgement, I don't know) in building an ecosystem of small consultancies who offer services around its product. Given the enthusiastic response I've had from this community to the news of Gatineau's impending availability, I'm hopeful we can do the same.

 

That's probably enough ranting for one blog post. In future posts, I hope to give you some more concrete examples of the measurement challenges that we've faced at mirrormirror. If you deal regularly with small businesses for web analytics, I'd love to hear from you - please either mail me through the 'about' page of this blog, or leave a comment in the comments for this post.

