Web Analytics Association Seattle Symposium Notes

Perhaps you missed this sold-out event in Seattle? If so, help yourself to my notes.

>> 1:15 Social Media Metrics – Jim Sterne
>> 1:45 Web Analytics Countdown – Eric Peterson & John Lovett
>> 2:45 Lessons Learned – Joe Megibow, Expedia
>> 4:15 Experimentation – Ronny Kohavi, Microsoft

1:15 Social Media Metrics – Jim Sterne


Social media can now do every sales function (advertising, sales, support). Jim breaks measurement into six components: awareness, attitude, influence, competition, outcomes, and value.

Awareness – reach and frequency create awareness. A blog post, a tweet, or comments on a blog are like a billboard: we hope people notice. Did they get the message? Measuring awareness means measuring familiarity; attributes include themes and qualities.

Now, how do they feel about it? That's attitude – sentiment analysis. Straightforward vs. coy. Look for useful trends. You need 150+ on-topic postings per day to have enough volume to measure.

Influence. Technorati did it first; it's about rank and authority. On Twitter, measure potential reach and effective reach. Influence varies by topic as well.

Competition. Measure, then compare and contrast with the rest of the world – specifically your competitors.

Outcomes. Like, follow, retweet, engage. Does it drive traffic? Satisfaction? Should we invest the next time around?

Value. How much is a social media participant worth? Which channel is delivering? On Facebook, ads can show which of your friends liked them, both on the ad itself and in the news feed. This is the impact of belonging to the tribe: purchase intent is 4x higher. If friends like it, I should too. Tribes.

How do you measure social media value? If you know how much a customer is worth, you can figure out how much to spend on acquisition and which channel is worth it. The shampoo example: a new customer's lifetime value is $29 in profit. Spend $2 to get the customer and you have $27 left. Assume 5% are encouraged to try, and 5% of those remain loyal; across 10,000 shampoo users, the value of social media works out to $7.25. You must know lifetime value for this to work.
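As a rough sketch of the arithmetic: my notes didn't capture exactly how the figures combine into the quoted $7.25, so the calculation below only illustrates the general structure with the numbers from the talk.

```python
# Illustrative sketch of Sterne's social-media-value arithmetic.
# The rates and audience size come from the talk; the final per-unit
# basis is an assumption for illustration.

lifetime_value = 29.00   # profit from a new shampoo customer
acquisition_cost = 2.00  # spent to acquire that customer
net_value = lifetime_value - acquisition_cost  # $27 left per customer

audience = 10_000        # shampoo users reached via social media
trial_rate = 0.05        # 5% encouraged to try
loyalty_rate = 0.05      # 5% of triers remain loyal

loyal_customers = audience * trial_rate * loyalty_rate  # 25 customers
total_value = loyal_customers * net_value               # $675

print(f"Loyal customers gained:   {loyal_customers:.0f}")
print(f"Total value created:      ${total_value:,.2f}")
print(f"Value per person reached: ${total_value / audience:.4f}")
```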


1:45 Web Analytics Countdown – Eric Peterson & John Lovett


A countdown of the impending changes for web analytics.

#10 – Page tagging is forever changed. There are simply too many tags. Tag management systems/platforms separate the tagging process from deployment – a huge change. Essentially a renaming of the universal tag.

#9 – Big fish eats little fish. Acquisitions are all around us; the only stand-alone vendor left is Webtrends. Businesses are coming to understand the value of web analytics, which fuels platforms and applies to marketing automation and business tools too. Digital data is moving to the boardroom. Expect acquisitions on a larger scale – big fish to big fish.

#8 – Mobile analytics changes your business. How do apps compare to the mobile site? You need a whole new framework to understand this – social and video too. Peterson has a new paper on the subject. Fundamental measures: interactions and visits, engagement, satisfaction.

#7 – Social media metrics begin to standardize. This represents an opportunity to speak in plain English – to talk in ways people understand and to compare across companies and industries with a standardized schema.

#6 – A new analytics workflow emerges. Plugging in and mentoring to fill the open reqs.

#5 – Data integration. Global organizations are collecting lots of data. The customer experience must be unified, yet brands treat you as disjointed data. Orchestrated, meaningful messaging leads to a better brand relationship.

#4 – Web analytics bifurcates. Traditional tools are not widely adopted, so organizations are building dual solutions: WebTrends plus Google Analytics. A core system versus something simple – absolute numbers come from the WebTrends core, directional data from Google Analytics. Enterprises cannot continue to invest in large-tool training; pick the right tool for the job. And we cannot keep saying it's easy – that's "crap".

#3 – A virtual NOC materializes. Web data plus social media, with alerting functionality. Data-driven decisions, live on the web. A new company called Metrically pulls multiple sources together in a virtual cloud.

#2 – Consumers control their own privacy. We have been demonized in the WSJ; we need to take control of this as an industry.

#1 – Advocacy trumps ignorance. Advocacy helps keep tracking in place: a code of conduct, ethics, not doing anything malicious with tracking. The industry is threatened. Agree not to associate anonymous info with a specific person unless it is made clear to the consumer. Customers are gripped by fear, uncertainty, and doubt. The key is the value exchange – not just what you collect, but what you give back. Germany is proposing opt-in for tracking cookies.


2:45 Lessons Learned – Joe Megibow, Expedia


How do we get to web analytics applied? Pull all the analysis functions together. There was an enterprise data warehouse, but it had no business direction. Imagine taking it all and putting it in one organization: Global Analytics and Optimization. Site conversion and voice of customer (VOC). Innovation and testing. Statistical modeling. Business data marts. All analytics.

Six lessons learned:

#1 – If you are not working with your peers, you are competing with them. People take the results they like, and you end up doing the same thing twice. Are you working with finance and the other data groups?

#2 – Learn from Finance. The CFO has the truth, with seasoned analysts who can construct predictive forecasts and meaningful narratives.

#3 – Don't just count – DO. In some cases this means doing less: go from data collection to data action. Do less, but accomplish more.

#4 – Sign up for results. Be willing to bet your job and your team on what you found. That is what managing a P&L is ultimately about.

#5 – Manage expectations. These are not simple problems with overnight solutions. Leaders just want plans and forecasts they can count on. Don't say two weeks and then take a month.

#6 – Start small, communicate a lot. Communicate to everyone; sell your own successes to everyone. Earn the right to do more. Find little business wins and proactively deliver them.

They used the data warehouse to pull all the details together for VOC feedback and found a bug. They now load raw Omniture data into the enterprise warehouse; it took a couple of years to pull in, clean, and vet against financial data, and they now run multiple custom models. Expedia is also getting serious about innovation testing: it has a home-grown innovation system and changed the product development cycle to innovate, iterate, and get products to market faster.

Find something, sell it. Drive improvement.


4:15 Experimentation – Ronny Kohavi, Microsoft


Controlled experiments: users come in and are split into two groups, A and B. Collect the metrics of interest and ask whether the differences in the averages are interesting enough. This establishes a causal relationship – it proves that some change causes a change in the metrics.
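In code, the mechanics might look something like this minimal sketch; the hashing scheme, metric values, and choice of a t-test are my own illustration, not from the talk.

```python
# Minimal sketch of a controlled experiment: bucket users into A/B,
# collect a metric per user, then test whether the difference in
# means is more than chance.
import hashlib
from scipy import stats

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into control (A) or treatment (B)."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("user-123", "homepage-test"))  # stable across visits

# Toy per-user metric values collected during the experiment.
metrics_a = [0.12, 0.08, 0.15, 0.11, 0.09, 0.13, 0.10, 0.14]
metrics_b = [0.16, 0.12, 0.18, 0.15, 0.11, 0.17, 0.14, 0.19]

# Two-sample t-test: is the difference in averages interesting?
t_stat, p_value = stats.ttest_ind(metrics_a, metrics_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```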

"Any statistic that is interesting is most likely a mistake." Twyman’s law.

A mandatory birthday field will give you lots of repeat entries. An optional drop-down gets lots of "A" entries – like "Astronaut" for the jobs field.

Breakthrough Business Results With MVT (recommended book).

In general, half of business experiments fail to show improvement. At Microsoft, two thirds of experiments do not improve the metrics they were expected to improve. We all fall in love with our ideas.

Run experiments more quickly. Try radical ideas – you may be surprised. Doubly true if the idea is cheap to implement.

“If you’re not prepared to be wrong, you’ll never come up with anything original.” – Ken Robinson, TED.

OEC – overall evaluation criterion. Make sure everyone agrees on what you are optimizing; this is hard. Think customer lifetime value, not just the revenue that returns. The criterion could be a weighted sum of factors, such as time on site and visit frequency. Measure a lot of things (on the order of 100) and identify the few that relate to the OEC.
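A hypothetical illustration of a weighted-sum OEC; the factors, normalization caps, and weights below are invented for the example.

```python
# Sketch of an OEC as a weighted sum of normalized factors. The point
# is that everyone agrees up front on one number to optimize.

def oec(session: dict) -> float:
    # Normalize each factor to a comparable 0..1 scale before weighting.
    time_on_site = min(session["seconds_on_site"] / 600.0, 1.0)
    visit_frequency = min(session["visits_per_month"] / 20.0, 1.0)
    return 0.6 * time_on_site + 0.4 * visit_frequency

print(oec({"seconds_on_site": 480, "visits_per_month": 8}))  # 0.64
```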

Cultural change: people don't want to run experiments because it threatens their jobs and may make them look bad – editors and designers especially.

Stages people go through:

1) Hubris – we know what to do, and we are sure of it.

2) Insight through measurement and control. You start to measure and are often surprised.

a. Control for all the differences.

3) The Semmelweis reflex – reflexive rejection of new knowledge because it contradicts entrenched norms, beliefs, or paradigms. (No one believed Semmelweis either.)

4) Fundamental understanding.

Only two things can explain the difference: random chance (statistics) and the subject of the A/B test.

Issues: scope – controlled experiments are not applicable everywhere.

How to Measure Anything (recommended book).

Quantitative metrics are not explanations of why. A treatment may increase page load time by 250ms and lose because of a secondary effect like performance.

Primacy/novelty effects: these may take longer to settle, so focus on new users.

Best practices:

Run A/A tests. An A/A test should show no difference, yet about 5% of them will come out statistically different by chance. Also match the data against your system of record. A powerful technique for finding bugs.
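A quick simulation makes the point (my illustration, not from the talk): with identical populations in both arms, roughly 5% of A/A runs still cross the p < 0.05 threshold by chance; materially more than that suggests a pipeline bug.

```python
# Simulating A/A tests: both arms draw from the same distribution,
# yet at alpha = 0.05 about 5% of runs come out "significant".
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
runs, false_positives = 1000, 0
for _ in range(runs):
    a = rng.normal(loc=0.10, scale=0.05, size=2000)
    b = rng.normal(loc=0.10, scale=0.05, size=2000)  # identical population
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives / runs:.1%} of A/A tests flagged significant")
```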

Failing to filter robots causes many tests to fail. Some robots look like human users and even execute JScript. Spikes are often traced to single "users" – robots.

How many users should be exposed to the treatment? 50% – you need that much for statistical power. Ramp up slowly over hours or days; the math works out well. You can discover something really bad with just a 0.1% sample.
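For a feel for the numbers, here is a back-of-the-envelope power calculation using the common rule of thumb n ≈ 16σ²/δ² users per arm (80% power at alpha = 0.05); this formula is my addition, not from the talk.

```python
# Rough sample size per arm for detecting a lift in a conversion rate.
# Small effects need large samples; a huge (e.g. disastrous) swing
# shows up with far fewer users, which is why a 0.1% ramp-up exposure
# can already catch something really bad.

def users_per_arm(baseline_rate: float, relative_lift: float) -> int:
    sigma_sq = baseline_rate * (1 - baseline_rate)  # Bernoulli variance
    delta = baseline_rate * relative_lift           # absolute effect size
    return int(16 * sigma_sq / delta**2)

print(users_per_arm(0.05, 0.05))  # ~121,600 to detect a 5% lift on 5% conversion
print(users_per_arm(0.05, 0.50))  # ~1,216 to detect a 50% swing
```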

Summary

1. Empower the HiPPO (highest paid person's opinion) with data-driven decisions.

2. It is hard to assess the value of ideas. Prepare to be humbled.

3. Compute your statistics carefully; reliable results are hard. Run A/A tests.

4. Experiment often.

http://exp-platform.com/

See Ronny's slides.