Reading Adobe’s Digital Marketing Blog (Part 1)

Another reading session, this time with the Adobe Digital Marketing Blog. Some of the earliest posts come from the Omniture blog, which has since become this one.

Getting your daily dose of dashboards?

  • “feel good reporting” = long reports nobody reads
  • Instead of reading 50 pages, provide fast 30-second overviews
  • Daily monitoring helps you to fix problems fast

Industry benchmarks: everything you need to know

  • Problem with benchmarks: things are measured differently
  • Metrics vary between days, geography, etc.

More on conversion benchmarks…

  • How should I act based on trends in benchmarks?
  • E.g. conversion rate down – what does this mean?
  • CTR for emails up – which emails, and measured on what basis?

Got Alerts? Don’t let this happen to you!

  • Wanted to buy a new graphics card
  • During checkout, a message said “your shopping cart is empty”
  • Checked the site, but couldn’t add anything to the shopping cart
    • Activate alerts for your most critical metrics!
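The alerting advice above can be sketched as a simple threshold check; the metric names and baseline values are hypothetical:

```python
def check_alert(metric_name, value, baseline, tolerance=0.3):
    """Return an alert string if `value` deviates from `baseline` by more
    than `tolerance` (as a fraction), otherwise None."""
    deviation = abs(value - baseline) / baseline
    if deviation > tolerance:
        return f"ALERT: {metric_name}={value} deviates {deviation:.0%} from baseline {baseline}"
    return None

# A broken cart (zero adds against a baseline of 1200/day) trips immediately:
print(check_alert("cart_adds", 0, 1200))
# A normal day stays quiet:
print(check_alert("cart_adds", 1150, 1200))
```

Run daily against your critical few metrics, this would have caught the empty-cart bug long before a customer did.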

The Dark Side of A/B Testing: Don’t Make These Two Mistakes!

  • Example: New home page with special offers and cleaner design
  • 90/10 split test, i.e. 10% to new design
  • KPIs: home page conversion rate and revenue per home page visit
  • both increased
    • Problem: Which change increased the KPIs – the cleaner design or the special offers?
  • Solution: Don’t change more than one element on a page – it helps you understand your customer’s behavior
  • Alternatively: use MVT
  • Example: tested a new element, but conversion dropped dramatically
  • Look at your customers – are they new or returning?
  • Long-time, frequent customers in particular can be unreceptive to changes
  • Solution: Segment your customers, ideally by RFM metrics (recency, frequency, monetary value)
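A minimal sketch of that RFM segmentation, using made-up customer records (all names and numbers are hypothetical):

```python
from datetime import date

# Hypothetical records: (customer_id, last_purchase, n_orders, total_spent)
customers = [
    ("a", date(2011, 1, 10), 12, 540.0),
    ("b", date(2010, 6, 2), 2, 80.0),
    ("c", date(2011, 2, 1), 5, 200.0),
]

def rfm(customer, today=date(2011, 2, 15)):
    """Turn a record into (id, recency_days, frequency, monetary)."""
    cid, last, freq, monetary = customer
    return cid, (today - last).days, freq, monetary

# Frequent customers who visited recently are the ones most likely to be
# unreceptive to a redesign – analyze their test results separately.
loyal = [r for r in map(rfm, customers) if r[1] < 60 and r[2] >= 5]
```

The thresholds (60 days, 5 orders) are arbitrary; in practice you would bin each dimension by quantiles.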

How to Spend Fewer Dollars, Smarter and Faster, during Tight Times

  • Search: real-time, easily measurable
  • Email marketing: fast, easily measurable
  • Online video: expensive but measurable

The Challenges and Value of Digital Marketing Integration

  • Customers are
    • Better Connected – more information, can easily switch channels
    • Bombarded – tons of information
    • Empowered – can publish reviews, write on their blog, etc.
    • Savvy – higher expectation for relevant and personalized experience
  • Harder to track customers on different platforms but possible

User-generated Content and Word of Mouth Marketing

  • Conventional marketing
    • Intercept – target and expose your message
    • Inhibit – make it difficult to compare your product to others
    • Isolate – remove all third parties
  • Digital marketing
    • Attract – create incentives for people to seek you out
    • Assist – be helpful and engage with people
    • Affiliate – mobilize third parties to become more helpful
    • Analyze – find out what’s working and where you can improve
  • Not only conversions matter, the rest of the visitors do too!

Reaching the Individual: Site Surfers Becoming People

  • Each visitor is unique
  • Provide relevant content if possible
  • Understand the behavior of visitors – how can you improve their experience?
  • Personalize as much as possible

Don’t Do This! 7 Pitfalls When Deploying Analytics (Part I)

  1. Neglecting key stakeholders – websites touch all facets of an organization!
  2. Focusing on tactical requirements – what are your strategic business requirements?
  3. Believing data equals requirements – don’t ask for KPIs, ask for strategic goals
  4. Having too many KPIs – select those KPIs that are strongly tied to your goals

Don’t Do This! 7 Pitfalls When Deploying Analytics (Part II)

  5. Getting too deep into detail – don’t neglect the big picture
  6. Multiple versions of the truth – measurements differ between tools. Get over it and start looking at trends!
  7. Isolating yourself – teach others how to use analytics and bring power users into your circle for more innovation
  • Analytics success is all about building a baseline for performance (your KPI trend) and trying new things to improve on this baseline. That’s it!

How to Make Testing Successful

  • Do it right the first time, so you have accurate data
  • Start testing the important stuff and act on your findings
  • Start with politically easy things first
  • Be excited about testing and evangelize

Answers to Practical Questions about Targeting

  • Test site elements, content bits, CTA, etc.
  • Targeting helps you reach your customers in the right place
  • You can practically target everything
  • First-time visitor can be targeted by referrer, keyword, time of day, day of week, geography, browser, OS, etc.
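A toy illustration of first-visit targeting on attributes like those listed above; the rules, attribute names, and values are all invented:

```python
def pick_content(visit):
    """Choose a landing experience from request attributes (hypothetical rules)."""
    if "discount" in visit.get("keyword", ""):
        return "offer_banner"
    if visit.get("geo") == "DE":
        return "localized_home"
    if visit.get("referrer", "").endswith("news.example.com"):
        return "press_landing"
    return "default_home"

print(pick_content({"keyword": "gpu discount", "geo": "US"}))  # offer_banner
```

Real targeting tools express the same idea as configurable rules rather than code, but the mechanics are the same: match on visit attributes, fall back to a default.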

Do You Have an Automated Response and Lead Nurturing Program in Place?

  • Strategy for lead nurturing / drip marketing
    1. Send email from your real sales staff with phone number
    2. Send relevant content for your prospect – e.g. shopping cart abandonment -> send email with reminder; different emails for different industries in a B2B setting
    3. The timing should be right – first contact with a lead should happen within five minutes(!); keep trying emails over longer periods
    4. Follow up quickly and then back off slowly – don’t spam your prospect
    5. Automate everything

The Art and Zen of Testing for Success

  • Don’t just take lift as a goal
  • Start with a question: e.g. should the button be blue or red? Is navigation on the left or on the right more effective?
  • Try to answer ‘why’
  • Advantage: It isn’t about the goal anymore, it’s about insights

Reading Kaushik (Part 6): Competitive Analysis

Competitive Intelligence Analysis: Metrics, Tips & Best Practices

  • What not to do?
    1. Comparing conversion rates is hard: different business strategies
    2. Pages / Content viewed is too individual and doesn’t really matter
  • What to do?
    1. Share of Visits by your industry
    2. Compare “up and downstream” against competition
    3. Share of Search traffic
    4. Share of brand and category key phrases
    5. Discover new search key phrases
    6. Traffic by media mix
    7. Psychographic analysis

The Definitive Guide To (8) Competitive Intelligence Data Sources!

  1. Toolbar data: e.g. Alexa
  2. Panel data: comScore, Nielsen
  3. ISP (Network) data: Hitwise, Compete
  4. Search Engine data: Google AdWords, Keyword Tool, Search-based Keyword Tool, Insights for Search, Microsoft adCenter Labs
  5. Benchmarks from WA vendors: Fireclick, Coremetrics and GA
  6. Self-reported data: Quantcast, Google AdPlanner
  7. Hybrid Data: Google Trends, Compete, DoubleClick AdPlanner
  8. External VOC data: iPerceptions, ACSI

Reading Kaushik (Part 5): Qualitative Analysis

Got Surveys? Recommendations from the Trenches

  • Work with an expert
  • Benchmark against industry: http://www.foreseeresults.com/, http://www.theacsi.org/, http://www.iperceptions.com/
  • Great insights are in the raw answers to your survey
  • Don’t show the survey too early
  • Think about your target segment that you want to survey
  • Treat them as an ongoing measurement system

Build A Great Web Experimentation & Testing Program

  1. Get over your own opinions
  2. State hypotheses
  3. Create goal and success metrics beforehand
  4. Don’t neglect side-effects of testing
  5. You can start small but get bigger
  6. Get people involved in testing: e.g. let them bid on outcomes
  7. Know the techniques and theories
  8. Evangelize people about testing

The Three Greatest Survey Questions Ever

  • What is the purpose of your visit to our website today?
  • Were you able to complete your task today?
  • If you were not able to complete your task today, why not?

Experiment or Die. Five Reasons And Awesome Testing Ideas

  • Reasons:
    1. It’s not expensive
    2. It’s quite fast
    3. It allows you to measure change
    4. You have the ability to take controlled risks
    5. It’s quite easy
  • Ideas:
    1. Fix the worst landing pages and be bold
    2. Test single page vs. multi page checkout
    3. Test number and placement of ads
    4. Test different pricing / selling tactics
    5. Test box layouts and other online stuff

Qualitative Web Analytics: Heuristic Evaluations Rock!

  • Try to complete tasks: e.g. place an order, look for support, etc.
  • You can try this in a group
  • Process:
    1. Write down the tasks that you want to see completed
    2. Try to establish success benchmarks
    3. Walk through each task and note important findings
    4. Check with a best practices checklist
    5. Create a report: screenshots / screen recording
    6. Create a prioritized list with all the problems

Reading Kaushik (Part 2): Digital Analytics

Data Quality Sucks, Let’s Just Get Over It

  • Data quality in the web is not great
  • Six step plan:
    1. Don’t dive deep into the data to chase discrepancies between sources
    2. Assume a level of comfort with the data
    3. Start making decisions that you are comfortable with
    4. Over time start understanding small areas of data
    5. Get more comfortable over time with your data
    6. Absolute numbers rarely matter, segmented trends do

Tips for Web Analytics Success for Small Businesses

  1. Get top key phrases from search
  2. Get top referring URLs
  3. Which content is popular on your site?
  4. Percentage of Visitors on the homepage
  5. Check segmented click densities
  6. Learn about your site’s bounce rates

Measuring Success for a Support Website: A Point of View

  • Moment of Truth: hold or lose customers — web: often support problems
    1. Don’t measure unique or total visitors
    2. Identify the top methods customers use to find information
    3. Click Density Analysis for the Top FAQ pages
    4. What percent of site visitors call the support phone number?
    5. Measure: Problem resolution, timeliness, likelihood to recommend
    6. Check whether the top solutions correspond to the top real problems (call center, user forums, blogs, etc.)

Seven Steps to Creating a Data Driven Decision Making Culture

  1. Go for the bottom-line (outcomes): What motivates the people around you?
  2. Reporting is not Analysis
  3. Depersonalize decision making
  4. Proactive insights rather than reactive: Deliver insights before someone asks
  5. Empower your analysts: They are not reporting monkeys. Give them strategic objectives
  6. Solve for the Trinity: What, Why, How much?
  7. Create an understandable, repeatable framework for making decisions
  8. Business/Strategy should own web analytics

Five “Ecosystem” Challenges for Web Analytics Practitioners

  1. Lack of relevant talent / skills: No real formal training; too much experience requested (5+ years) although the field is moving fast
  2. Entrenched mindsets: Decision makers still thinking in the old way
  3. The web is no longer a monologue
  4. It’s not about you, it’s about your customers: Less clickstream, more experimentation, usability, integration of multiple sources
  5. Web analytics is the first step

Web Analysis: In-house or Out-sourced or Something Else?

  • In the long run: in-house team
    • strategic implementation of WA can’t exist in a silo
    • Qualitative analysis is also needed
    • Tribal knowledge helps the decision making
  • But, it depends on the stage:
    • Stage 1 – No WA -> Implement and show promise from data
    • Stage 2 – Too much data -> Hire WA, customize dashboards, tag everything
    • Stage 3 – WA rocks; Asking the why -> Start testing, collect qualitative data, expand your team
    • Stage 4 – Trinity implemented -> new data, more people, a self-sustaining process

Five Rules for High Impact Web Analytics Dashboards

  1. Benchmark & Segment: Provide context
  2. Isolate your critical few metrics: less than 10 metrics
  3. Include insights
  4. Don’t create more than a single page
  5. Constantly adapt your dashboard to changes in the environment

I Got No Ecommerce. How Do I Measure Success?

  • Visitor Loyalty: How often does one person visit my website?
  • Recency: How long has it been since a visitor last visited your website?
  • Lengths of Visit: How long does each session last?
  • Depth of Visit: How many pages did they visit?

Convert Data Skeptics: Document, Educate & Pick Your Poison

  • Understand how your tools actually measure
  • Document your findings
  • Present your findings to everybody touching the data
  • Report high level trends between the tools
  • Pick the best tool for each metric

Is Conversion Rate Enough? It’s A Good Start, Now Do More!

  • There is more stuff than just conversions
  • Esp. for non-ecommerce businesses, ask:
    • Have you found what you were looking for?
    • Will you go to our store?
    • Will you recommend our website?

History Is Overrated. (Atleast For Us, Atleast For Now.)

  • The value of web analytics data decays over time
  • Your visitors change too much: browsers, cookie deletion, etc.
  • Your computations change too much: new computations, maybe other tagging, etc.
  • Your systems change too much: Other hosts, new technology, etc.
  • Your website changes too much
  • Your people change too much
  • => no real tie to legacy tools and data
  • => keep some major benchmarks for comparison

“Engagement” Is Not A Metric, It’s An Excuse

  • Engagement is unique – therefore say what you actually measure instead of saying that you measure engagement.
    1. What’s the purpose of the website?
    2. How do you measure success?
    3. Define your metrics
    4. Call them what they are
  • Ideas:
    • Question: Are you engaged with us?
    • Question: Likelihood to recommend website
    • Use primary market research
    • Customer retention
    • RF (recency, frequency) of customers

In Web Analytics Context Is King Baby! Go Get Your Own

  • Never report data in aggregate, or by itself. Always, always, always check that you are including context!
  • Compare trends over different time periods
  • Compare key metrics and segments against site average
  • Report multiple metrics
  • Benchmark
  • Tap into the tribal knowledge

The “Action Dashboard” (An Alternative To Crappy Dashboards)

  • Why do dashboards suck?
    • Don’t provide an interpretation
    • Readers don’t trust the providers of dashboards
    • Not enough company context in the interpretation
    • Providers don’t have enough experience
  • How to make a good one?
    • Report only 3-5 most critical metrics
    • Show a trend in a graphic for a metric (segmented)
    • Give key trends and insights
    • Recommend actions
    • What is the impact on the company?

Multichannel Analytics: Tracking Offline Conversions. 7 Best Practices, Bonus Tips

  • Track your online store locator, directions, etc.
  • Use unique 800 numbers
  • Use unique coupons / promotions
  • Connect online and offline – e.g. club cards, delivery, etc.
  • Ask your customers (survey)
  • Conduct controlled experiments
  • Do primary research

Multichannel Analytics – Tracking Online Impact Of Offline Campaigns

  • Use vanity URLs: Permanent redirects help you differentiate between offline and online referrals
  • Use unique coupons / offer codes
  • Survey, survey, survey
  • Correlate traffic patterns with offline ad patterns
  • Experiment

Slay The Analytics Data Quality Dragon & Win Your HiPPO’s Love!

  1. Change your boss
  2. Compare web data with their favorite source
  3. Put the data quality problem aside and give them actionable insights
  4. You get trends rather fast even without a complete WA implementation
  5. Start with the outcomes
  6. One WA is probably enough
  7. Realize if the data quality is good enough
  8. If you don’t have enough traffic, care about more traffic first
  9. There are inaccurate benchmarks and illegal customer behavior – don’t care too much about it
  10. Fail fast, i.e. test

Barriers To An Effective Web Measurement Strategy [+ Solutions!]

  • Note: Tools aren’t the real problem, still lots of people talk about them
  1. Lack of resources (45%): Start for free and earn the right to ask for a budget
  2. Lack of strategy (31%): Change your job. If you’re a VP, maybe you can help create one
  3. Siloed organization (29%): Start small and offer value
  4. Lack of understanding (25%): Do and show
  5. Too much data (18%): Limit yourself on the critical few metrics.
  6. Lack of senior mgm buy-in (18%): see previous summaries
  7. Difficulty reconciling data (17%): Whatever.
  8. IT blockages (17%): Show lost revenue by delay
  9. Lack of trust in analytics (16%): previous summaries
  10. Finding staff (12%): Don’t be too narrow-minded
  11. Poor technology (9%): Concerns mostly only brand new technologies / platforms

Who Owns Web Analytics? A Framework For Critical Thinking

  • The biggest problem is most often the organization structure.
  • How long has the company been doing WA? -> New: find an accepting division to embarrass the seniors – Older: see next point
  • Analytical maturity? New: find accepting people – Older: identify power centers
  • Who owns the power to make changes to the site?
  • Which model: Centralized? Decentralized? Something else? Agile team with satellites
  • Which division / department is best for WA? Marketing if they have the power -> Ideal situation: own department

10 Fundamental Web Analytics Truths: Embrace ‘Em & Win Big

  1. If you have more than one clickstream tool, you are going to fail: Implementing, understanding and communicating one tool is hard enough
  2. Expensive tools won’t give you insights: Real problems are lack of skills, terrible organization, no structure or no courage.
  3. It is faster to fail and learn than to wait for an “industry case study” or find relevancy in an “industry leader white paper”
  4. You are never smart enough not to have a Practitioner Consultant on your side (constantly help you kick it up a notch)
  5. Your job is to create happy customers and a healthier bottom-line
    • Go to your own website
    • Read your own email campaigns
    • Buy something on your own website
    • Return a product on your own website
    • Do the same stuff on competitor’s websites
    • Do usability studies
    • Be a customer and ask yourself: What will create happier customers tomorrow?
  6. If you don’t kill 25% of your metrics each year, you are doing something wrong
  7. A majority of web analytics data warehousing efforts fail. Miserably: Too much irrelevant data, mostly anonymous, full of holes; BI tools are bad at answering WA questions and DWs are too slow
  8. There is no magic bullet for multi-channel analytics: see previous summary
  9. Experiment, or die.
  10. The single most effective strategy to win over “stubborn single-minded” HiPPO’s is to embarrass them.

Measuring Incrementally: Controlled Experiments to the Rescue!

  • Test everything
  • You may need additional personnel for conducting and analyzing the experiment
  • It gives you excellent insights into different methods
  • Act on your results as fast as you can
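One standard way to analyze such a controlled experiment (not named in the post, so treat it as one option among several) is a two-proportion z-test on the conversion counts:

```python
from math import erf, sqrt

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal approx.
    return z, p_value

# 90/10 split like the earlier home-page example (made-up numbers):
# control converts at 5%, the new page at 7%.
z, p = ab_z_test(450, 9000, 70, 1000)
```

With these numbers the difference comes out significant at the 5% level; with smaller samples the same 2-point lift would not, which is exactly why acting on untested gut feel is risky.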