
Sep 2, 2008

Do you always get massive discrepancies between the adserving and the web analytics stats when doing a post campaign overview ?


posted 10 hours ago in Internet Marketing


Answers (7)


Brenda Castelo Radtke

Manager, Interactive Strategy at Ignited


There is almost always a discrepancy between ad serving and web analytics because they are two different platforms that measure data in different ways. In my experience, I've often seen web analytics report lower numbers than the ad server. It helps to know how the ad server "defines" a set of data vs. your web analytics platform. For example, a "visit" may be defined differently by DART (clicks) vs. Omniture (visits to an actual web page). In this example, a user could click on an ad a few times and repeatedly visit your site, but the web platform could count all of that as one visit.
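Brenda's DART-vs-Omniture example can be sketched in a few lines. This is illustrative only (the 30-minute session timeout is an assumption, not a documented Omniture setting): the ad server counts every click, while an analytics platform sessionizes repeat arrivals into a single visit.

```python
from datetime import datetime, timedelta

def count_visits(click_times, timeout_minutes=30):
    """Sessionize a user's click timestamps the way an analytics
    platform typically would: clicks within the timeout window
    collapse into a single visit."""
    visits = 0
    last = None
    for t in sorted(click_times):
        if last is None or t - last > timedelta(minutes=timeout_minutes):
            visits += 1
        last = t
    return visits

# One user clicks the same ad three times in ten minutes.
clicks = [datetime(2008, 9, 2, 12, 0),
          datetime(2008, 9, 2, 12, 4),
          datetime(2008, 9, 2, 12, 9)]

print(len(clicks))           # ad server's view: 3 clicks
print(count_visits(clicks))  # analytics view: 1 visit
```

Three ad-server clicks, one analytics visit — a built-in discrepancy before anything has even gone wrong.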

Discrepancies are okay if they are consistently inconsistent (same margin of difference across the board). If not, you'll need to dig a little deeper to resolve the issue. Some errors could also be due to caching, latency, or cookie issues on the server side, and how your web analytics platform tracks them. Depending on what ad server or analytics software you are using, you should be able to look up the leading documented causes of data discrepancies. Here's a good one from Google to give you an idea; I looked this up when I was trying to track down some marginal differences between AdWords reports and Omniture data: http://adwords.google.com/support/bin/answer.py?hl=en&answer=74438
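The "consistently inconsistent" test lends itself to a quick sanity check. A minimal sketch with hypothetical campaign numbers: compute the daily gap between ad-server clicks and analytics visits, and look at how much that gap moves around.

```python
def discrepancy_report(ad_server_clicks, analytics_visits):
    """Compare daily ad-server clicks against analytics visits.
    Returns each day's gap (as a percentage of clicks) and the
    spread between the largest and smallest gap. A steady gap
    suggests a definitional difference; a gap that jumps around
    suggests a tracking problem worth digging into."""
    gaps = [(clicks - visits) / clicks * 100
            for clicks, visits in zip(ad_server_clicks, analytics_visits)]
    spread = max(gaps) - min(gaps)
    return gaps, spread

# Hypothetical five-day campaign numbers.
clicks = [1000, 1200, 900, 1100, 1050]
visits = [850, 1020, 765, 935, 890]

gaps, spread = discrepancy_report(clicks, visits)
print([round(g, 1) for g in gaps])  # roughly a 15% gap every day
print(round(spread, 1))             # small spread: consistently inconsistent
```

A spread of a fraction of a point, as here, is the benign case; a spread of several points on the same data would be the cue to start checking tags, caching, and cookies.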

Lastly, discrepancies could be due to coding errors on the website. Quality assurance for web analytics tags tends to be laborious and inefficient, and it's easy to miss something, especially if you have a robust site. Here's a link to a really cool QA tool: http://wasp.immeria.net/

Good luck!

posted 9 hours ago


Mark Garner

Ecommerce Marketing Specialist




Like Brenda says, you will always get discrepancies; it's just the nature of the technologies at the moment.

The key measure is: did you sell more when the campaign was running?

How much more? 

Was it profitable? How profitable? 

If you run campaigns over a period of time you will begin to see patterns and correlations, and it is these you should monitor and compare, e.g., how did this month's campaign/sales results compare to last month's?

The only analytic that matters is how many dollars came in. 

All interpretation and analysis of stats should be done with that in mind.

posted 9 hours ago


Steve Reeves

Owner, Front Office Box


Yes. With consistently different reports from AdSense, Analytics, Clicky, and actual experience, I just don't believe any of it.

posted 7 hours ago


Michael Seidle

mike@seidle.net - CEO Indy Associates - Internet Marketing - Affiliate Program Management - SEO - twitter.com/indymike



Jassim - 

Yes, there are always massive discrepancies in analytics between platforms. It's because each system measures things in different ways: 

Ad Servers: Report clicks that result in a redirect to a web page. There is no guarantee the visitor makes it to the page or isn't redirected further. Counts can be affected by hijacking (toolbars that redirect traffic), bots, the end user, and in some cases the performance of the ad server itself ("I clicked and it timed out!"). Ad servers accurately measure ad displays and clicks. They are not so accurate at telling you how many people visited a website.

Log Analyzers: Report on pages served by a web server. They do not see pages served from caching proxies used by ISPs, or pages served from a browser's cache. Log analyzers accurately report server activity and nothing else.

JavaScript-based metrics (like Google Analytics): Report accurately if the end user has JavaScript enabled and no software that blocks your tracker (7-15% of computers have such blocking, depending on whose metrics you are using). JavaScript-based metrics tell you within 7-15% what pages have been viewed.

So, the best we can do is within 7-15% accuracy anyway. The poster who said to measure ROI is correct. Measure sales by campaign and you are sure to get your metrics right.
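Michael's 7-15% blocking figure amounts to a simple gross-up. A minimal sketch, assuming the only undercount is tag-blocking at the rates he cites (the traffic numbers are hypothetical):

```python
def adjust_for_blockers(tagged_pageviews, block_rate):
    """Gross up a JavaScript-tag count by an assumed share of
    visitors whose browsers never fire the tag. block_rate is the
    fraction of computers blocking the tracker (e.g. 0.07-0.15)."""
    return tagged_pageviews / (1 - block_rate)

measured = 10_000  # pageviews the JS tag actually recorded
low = adjust_for_blockers(measured, 0.07)
high = adjust_for_blockers(measured, 0.15)
print(round(low), round(high))  # roughly 10753 to 11765 true pageviews
```

The point is the width of the band: with the true figure somewhere in a range about a thousand pageviews wide, arguing over a few percent between platforms is noise, which is why sales by campaign is the sturdier metric.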

posted 5 hours ago


Brian Murphy

Senior Sales Executive at TruEffect


Yes, there will be discrepancies, nature of the beast.


posted 1 hour ago


Bill McRea

McRea Marketing Concepts


If you take your site's AWStats, Google Analytics, and any ad server and try to tie the stats together, it will drive you nuts. So don't try. LOL

But if you are running an AdWords campaign, Google Analytics will work perfectly to show you a true ROI, if configured correctly.

Good Luck


posted 10 minutes ago


Dash Chang

Founder at tEarn Brand Advertising Network


As a developer and network operator, we have observed these discrepancies.

1. As discussed, interpretation of data matters. Hundreds of small decisions in coding affect the results. 

2. Getting 100% of the data is impossible. Internet IP transport is not guaranteed. It is well known that GA loses 20% or more of the data. Every network loses some data. We believe http://tEarn.com has one of the best systems at capturing the most data.

3. Technologies inflate results. As more users turn on third-party cookie blocking, the count of unique users can run 1x to 2x too high. That's why so many networks exaggerate UU counts. It's the technology, and comScore has written extensively about this issue.
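The cookie-blocking inflation in point 3 is easy to see in miniature. A minimal sketch (the visitor names and the per-visit-fresh-ID behavior are illustrative assumptions about how a cookie-based UU counter fails): a visitor who blocks third-party cookies gets a fresh ID on every visit, so each visit looks like a new "unique user."

```python
import itertools

def count_unique(visits, blocks_cookies):
    """Count 'unique users' by cookie ID. A visitor listed in
    blocks_cookies gets a fresh ID on every visit, so every one
    of their visits is counted as a new user."""
    ids = itertools.count()
    seen = set()
    persistent = {}  # cookie IDs for visitors who accept cookies
    for name in visits:
        if blocks_cookies.get(name, False):
            seen.add(next(ids))  # new cookie ID every time
        else:
            if name not in persistent:
                persistent[name] = next(ids)
            seen.add(persistent[name])
    return len(seen)

visit_log = ["alice", "alice", "alice", "bob", "bob"]
print(count_unique(visit_log, {}))               # 2 real users
print(count_unique(visit_log, {"alice": True}))  # 4 apparent 'unique users'
```

Two actual people become four reported uniques once one of them blocks cookies — exactly the 1x-2x inflation described above.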

4. Most users arrive at a site accidentally. It's a symptom of Google Search distributing searchers across millions of websites. We're collecting more data, but perhaps 50% of visitors to ANY site stay less than 5 seconds. Again, this inflates the UU count.

Over 1 billion users visit the web daily. Exaggerating UU counts is silly. No true advertiser wants to buy the 1-billion audience. Few buyers could afford to ;-) It's a disservice to advertisers and publishers.

We'll be refining our system to report relevant numbers that offer true value to buyers and sellers.

