#19/25: Web Analytics: An Hour A Day

The lack of real-world practitioners influencing strategy and direction has had a detrimental effect. Standard techniques such as customer-driven innovation have never taken deep roots in the world of web analytics. Most progress has been driven by possibility-driven innovation — as in, “What else is possible for us to do with the data we capture? Let’s innovate based on that.”

Visiting a website is a radically different proposition if you look from the lens of data collection. […] The website knows every “aisle” you walked down, everything you touched, how long you stayed reading each “label”, everything you put in your cart and then discarded, and lots and lots more.

We have clicks, we have pages, we have time on site, we have paths, we have promotions, we have abandonment rates, and more. It is important to realize that the critical facet missing from all these pieces of data […] is the why.

Combining the what with the why can be exponentially powerful

Methods to collect customer qualitative data

  • Lab usability testing
  • Follow-me-homes
  • Testing
  • Unstructured remote conversations
  • Surveying

Trinity strategy – get actionable insights

  • Behavior analysis – infer customer intent
  • Outcome analysis – What happened, what was the outcome?
  • Experience analysis – Why does this all happen?

Why does your website exist?

After you have an answer to Why does your website exist?, it is imperative to investigate how your decision-making platform will capture the data that will help you understand the outcomes and whether your website is successful beyond simply attracting traffic and serving up pages.

My recommendation is to have a web research team and to have the team members, researchers, sit with and work alongside the web analysis team.

I recommend that you have at least one continuous listening program in place. Usually the most effective mechanism for continuous listening, benchmarking, and trending is surveying.

User research

From the most high-level perspective, user research is the science of observing and monitoring how we (and our customers) interact with everyday things such as websites, software, or hardware, and then drawing conclusions about how to improve those customer experiences.

Lab usability testing

  • Best for optimizing UI design and work flows, understanding the voice of customer, and understanding what customers really do

Preparing the test

  1. Identify critical tasks
  2. Create scenarios for each task
  3. For each scenario, establish a goal
  4. Identify the desired user group
  5. Create a compensation structure for the participants
  6. Hire the right people
  7. Do pre-tests internally

Conducting the Test

  1. Brief the participants
  2. Start with a “thinking aloud” exercise
  3. Have participants read the tasks aloud to ensure that they read the whole thing
  4. Carefully observe their verbal and nonverbal cues
  5. The moderator can ask the participant follow-up questions
  6. Thank the participants and pay them directly

Analyzing the data

  1. Hold a debriefing session
  2. Note the trends and patterns
  3. Do a deep dive analysis to identify the root causes of failures based on actual observations
  4. Make recommendations to fix the problems and prioritize them

Don’t forget to measure success post-implementation. So we spent all the money on testing; what was the outcome? Did we make more money? Are customers satisfied? Do we have lower abandonment rates? The only way to keep funding going is to show a consistent track record of success that affects either the bottom line or customer satisfaction.

Heuristic evaluation

Heuristic evaluations follow a set of well-established rules in web design and in how website visitors experience websites and interact with them.

Heuristic evaluations can also be done in groups; people with key skills all attempt to mimic the customer experience under the stewardship of the user researcher. The goal is to attempt to complete tasks on the website as a customer would.

Heuristic evaluations are at their best when used to identify what parts of the customer experience are most broken on your website.


  1. Understand the core tasks
  2. Establish success benchmarks
  3. Walk through each task and make notes of key findings
  4. Make notes of best-practice rule violations
  5. Create reports
  6. Categorize recommendations & prioritize


Follow-Me-Home studies

perhaps the best way to get as close to the customer’s “native” environment as possible


Preparing the study:

  1. Set customer expectations clearly
  2. Assign proper roles up front (moderator, note taker, video person, etc.)

Conducting the study:

  1. Spend 80% of the time observing
  2. Let them do the tasks like they would normally do
  3. Don’t teach them
  4. The moderator can ask clarifying questions


Surveys

  • Optimal method for collecting feedback from a very large number of customers relatively cheaply and quickly
  • Website surveys
    • Site-level surveys
      • Usually about the overall experience of the site; provide more context about the customer’s visit
    • Page-level surveys
      • Measure the performance of individual pages
      • Shorter than site-level surveys
      • Collect satisfaction or task-completion rates
  • Post-visit surveys
    • Rating of the ordering process, delivery, etc.

Preparing a survey:

  • Understand the core tasks
  • Analyze clickstream data to identify the main gaps you want the survey to answer
  • Learn about questionnaire design


Running the survey:

  • Implement the survey correctly
  • Incorporate cookie data, e.g., only show the survey to people who haven’t answered it yet
  • Walk through the customer experience yourself
  • Check response rates daily or weekly to pick up problems


Analyzing the results:

  • Calculate correlations between answers -> helps you understand which actions can be implemented successfully
  • Ask open-ended questions
  • Critical components of success:
    • Focus on customer centricity
    • Include the customer and look beyond clickstream data
    • How is the website doing in terms of delivering for the customer?
    • Solving for the customer means solving for long-term success


  • Primary purpose: Why are you here?
  • Task completion rate: Were you able to complete your task?
  • Content and structural gaps: How can we improve your experience?
  • Customer satisfaction: Did we wow you today?
  • Answer business questions
    • These are open-ended & on a higher level
  • Look outside your WA tool
  • Follow the 90/10 rule
  • Right organizational structure


#17/25: Actionable Web Analytics

It isn’t having the most data that wins; it’s being able to do the best analysis.

Web analysis means taking the data from web analytics and using it to make changes to your site and business decisions based on the data.

Performance marketing seeks to maximize the ROI of all web initiatives over the long term. As a result, it rejects creative branding projects that may win accolades in the industry but fail to take business objectives into account. It also rejects short-sighted tactics, such as page packing, which increases click-through rates while harming the long-term reputation of a company’s website.

Data-Driven Organization

  1. Business goals drive decision-making
  2. Avoid gut feelings
  3. Invest with business goals in mind (measurable)
  4. Employ performance metrics throughout the whole company
  5. Segment users according to their needs and value


  • Be willing to take risks – set performance goals for yourself
  • Present benefits
  • Rate your Data Drive:
    • Do you have agreed-upon success metrics for your web channel?
    • Can most people identify the overall success metrics the same way?
    • Have you monetized your key site behaviors?
    • Do you prioritize projects and initiatives based on potential financial impact?
    • Do you evaluate all projects post-launch to determine their impact on your business?

Test your data drive

  • Do you commonly use web analytics to identify opportunities to improve your site?
  • Do you conduct attitudinal surveys to identify opportunities to improve your site?
  • Do you use competitive data to benchmark success and identify best practices?
  • Do you analyze behavioral, attitudinal, and competitive data in conjunction with one another to drive greater insight?
  • Do you include success metrics in statements of work and RFPs for both internal teams and outside agencies?
  • Do your creative briefs include success metrics and behavioral, attitudinal, and competitive benchmarks?
  • Do you employ an ongoing testing and optimization methodology based on insight generated from data?
  • Do you segment your site to customize experiences and determine the best page solutions for different audiences?
  • Do you measure the offline impact of the web channel?
  • Do you reward employees based on performance against specific KPIs?
  • Do you regularly take action on data to improve site performance?

Analytics Intervention

  1. Admit that there is a problem
  2. Admit your part in the problem
  3. Agree that it is a corporate problem -> everyone has to work together


Business Goals:

  • What makes your business successful?
  • What will contribute to long-term success?
  • What will hurt the success?
  • What are the business goals?

Site Goals:

Awareness ladder

  • Awareness -> Do people know about me? Analytics helps improve performance
  • Interest -> Website, Blogs, Forums, Review sites, etc.
  • Consideration -> More information, good information, etc.
  • Purchase -> Buying, Checkout


KPIs are

  • Tied to your organization’s unique goals
  • Measured over time
  • Agreed on by the people involved

Goals of your website

  • Web-Monetization Models
    • Online lead-generation form
    • Phone inquiries generated online
    • Customer service savings
    • Upgrade offers
    • Prospect education
Leads closed x average revenue per sale / total leads = avg. lead value
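
A quick sketch of that calculation, reading “total sales” in the note as the total number of leads generated (an assumption); all numbers are hypothetical:

```python
def avg_lead_value(leads_closed, avg_revenue_per_sale, total_leads):
    """Average value of one lead: close rate times revenue per sale."""
    return leads_closed * avg_revenue_per_sale / total_leads

# 50 closed leads worth $2,000 each, out of 1,000 leads generated:
value = avg_lead_value(leads_closed=50, avg_revenue_per_sale=2000, total_leads=1000)
print(value)  # -> 100.0, i.e. each lead is worth about $100
```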

Whenever you make changes to a support site, you should also survey people who visit it to determine whether they’re getting their questions answered – or if you’re driving them away.

Scenario modeling

Example: Ad Supported Site


Don’t stop at the sale – after sales support / etc. = they can still cancel the order

How to group data?

  • Search keywords
    • Aggregate – Top 10, 20, …
    • Cluster into groups
    • Search terms by site section / page
  • Segmentation based on visitor type
  • ZAAZ Exit Ratio: site exits from page / page visits to that page
  • Branding Metrics
    • Direct visitor traffic
    • Perception studies
    • Repeat buyers
    • Branded searches
    • Survey offline purchasers
    • Referral actions by visitors
    • Blog / social media buzz
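
The ZAAZ Exit Ratio in the list above is straightforward to compute per page; a minimal sketch with hypothetical page counts:

```python
def exit_ratio(exits_from_page, visits_to_page):
    """ZAAZ Exit Ratio: site exits from a page divided by visits to that page."""
    return exits_from_page / visits_to_page if visits_to_page else 0.0

# Hypothetical (exits, visits) counts per page:
pages = {"/pricing": (300, 1200), "/checkout": (90, 450)}
ratios = {page: exit_ratio(e, v) for page, (e, v) in pages.items()}
print(ratios)  # {'/pricing': 0.25, '/checkout': 0.2}
```

A high ratio flags pages that leak visitors out of the site.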


  • Hot box on each page that displays segmented content
  • Noncustomers were invited to sign up for a new account
  • Online bankers were cross-sold additional financial services
  • Customers without online banking were encouraged to sign up


  • Dynamic Prioritization
    • Project prioritization based on business impact
    • Agile teams for new important projects
  • Accountability of initiatives
    • Scorecard


Process status:

  • Do you refresh your site often?
  • Do you try out options using tests?
  • Do you use insights from analysis?
  • Do you analyze all initiatives for their impact?
  • Do you document and share best practices?

Select projects:

  • Comparing Opportunities
    • Cost
    • Likelihood of reaching the target range
    • Profit potential
    • Timeline
    • Payback period
    • Total impact on site experience
    • Don’t set expectations too high
  • Identify test cases
    • Use WA to find underperforming pages
    • Ask the customer
    • Do competitive research and look at other websites (!)
  • Prioritize Tests
    • What is the potential upside?
    • Ease of technical implementation
    • Ease of measurement
    • How much time will it take?
  • Optimizing Segment Performance

“The big win that can come from an analytical approach is the rapid impact and iteration that data can have on design.”

#16/25: Web Site Measurement Hacks

Funny how a major economic downturn and the enforcement of fiscal responsibility will motivate people to make decisions based on available data, not just their gut instinct.

Customer intent

  • Explicitly
    • Landing-page survey: What are you trying to achieve today?
    • Exit survey: Did you successfully achieve what you wanted to do?
  • Implicitly
    • Obvious intent: ordering something, etc.
    • Otherwise: infer intent from search keywords

Data Integration

  • Marketing cost => campaign ROI
  • Customer Satisfaction data
  • Email campaigns

Try to apply analytics to your intranet

  • How much do you save using this?
  • Adoption
  • Most frequent pages

Look at

  • Broken links
  • 404s
  • Failures to respond

Start with your business objective (macro-conversion) and try to generate micro-conversions

Email Marketing

  • Hard Bounces – message not delivered
  • Opening rate
  • CTR
  • Unsubscribes
  • Landing page stickiness – do visitors bounce right away or follow the CTA?
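
The metrics above reduce to a few ratios; a minimal sketch, assuming open rate, CTR, and unsubscribe rate are taken over delivered (non-bounced) messages, with hypothetical numbers:

```python
def email_metrics(sent, hard_bounces, opens, clicks, unsubscribes):
    delivered = sent - hard_bounces  # hard bounces were never delivered
    return {
        "bounce_rate": hard_bounces / sent,
        "open_rate": opens / delivered,
        "ctr": clicks / delivered,  # click-through rate
        "unsubscribe_rate": unsubscribes / delivered,
    }

m = email_metrics(sent=10_000, hard_bounces=500, opens=1_900, clicks=380, unsubscribes=95)
print(m["bounce_rate"], m["open_rate"], m["ctr"])  # 0.05 0.2 0.04
```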

Email Testing

  • Layout
  • Format – HTML, text, rich text
  • Length
  • Tone
  • Date
  • Return email address
  • Subject line
  • CTA

When you segment by traffic source, the data becomes actionable.

Scent trails

  • Create personas
  • Use the right language
  • Build rapport with relevant copy and by addressing issues
  • Present relevant solutions

Measuring the internal search engine

  • Percentage of exits from the search page
  • Usage for conversions
  • Percentage with no results
  • Top search terms
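
These KPIs can all be derived from a simple log of internal-search events; a sketch over hypothetical records:

```python
from collections import Counter

# Hypothetical internal-search log: query, result count, and whether the
# visitor exited the site right after searching.
searches = [
    {"term": "pricing", "results": 12, "exited": False},
    {"term": "refund", "results": 0, "exited": True},
    {"term": "pricing", "results": 12, "exited": False},
    {"term": "api", "results": 3, "exited": True},
]

exit_pct = sum(s["exited"] for s in searches) / len(searches)
no_results_pct = sum(s["results"] == 0 for s in searches) / len(searches)
top_terms = Counter(s["term"] for s in searches).most_common(3)
print(exit_pct, no_results_pct, top_terms[0])  # 0.5 0.25 ('pricing', 2)
```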

Measure Recency and Latency

  • Recency – time since the visitor/customer last visited your site
  • Latency – time between successive visits
  • Segment into above and below average
  • Helps you to estimate future earnings
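
A minimal sketch of both metrics from per-visitor visit dates (hypothetical data), including the above/below-average segmentation:

```python
from datetime import date

visits = {  # visitor id -> visit dates, oldest first (hypothetical)
    "a": [date(2024, 1, 1), date(2024, 1, 11), date(2024, 1, 21)],
    "b": [date(2024, 1, 5), date(2024, 2, 4)],
}
today = date(2024, 3, 1)

# Recency: days since the last visit.
recency = {v: (today - dates[-1]).days for v, dates in visits.items()}

# Latency: average days between consecutive visits.
latency = {
    v: sum((b - a).days for a, b in zip(dates, dates[1:])) / (len(dates) - 1)
    for v, dates in visits.items()
    if len(dates) > 1
}

avg_latency = sum(latency.values()) / len(latency)
segments = {v: "above" if lat > avg_latency else "below" for v, lat in latency.items()}
print(recency, latency, segments)
```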

Test offline campaign – geographic segmentation

  • Control group
  • Treatment group

KPIs for Online Retail

  • AOV
  • Order Conversion Rate
  • Funnel analysis
  • Visits under 90 seconds
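
Three of the four retail KPIs above are simple ratios over the same inputs (funnel analysis needs step-level data); a sketch with hypothetical monthly numbers:

```python
def retail_kpis(revenue, orders, visits, visits_under_90s):
    return {
        "aov": revenue / orders,  # average order value
        "order_conversion_rate": orders / visits,
        "pct_visits_under_90s": visits_under_90s / visits,  # rough bounce proxy
    }

k = retail_kpis(revenue=50_000, orders=500, visits=20_000, visits_under_90s=8_000)
print(k["aov"], k["order_conversion_rate"], k["pct_visits_under_90s"])  # 100.0 0.025 0.4
```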

KPIs for Content sites

  • Average pages per visit
  • Average visits per visitor
  • Average time on site
  • Visits with over 5-10 page views

KPIs for Lead Generation Sites

  • Average hours to response

Okay book. It’s rather outdated, but it may be interesting if you want to know more about the development of web tracking systems.

Reading Atlanta Analytics

All of this business about paid tools vs free tools, and dare I say the whole concept of #measure, all boils down to the fact that today, we are a tool-centric industry, often to the detriment of being an expert-centric industry. — Stop giving web analytics tools the credit YOU deserve

Atlanta Analytics is a quite interesting blog; however, there aren’t many posts. The author, Evan LaPointe, has some nice visions and an interesting perspective because he comes from a finance background.
I think he makes some important points; these are:

  • It isn’t about page views or uniques – it’s about money
  • Drive actions not data
  • Be a business person not a technologist
  • Demand your share – if you increase your company’s profit by $500,000 per year, you should demand a share of it

What is web analytics?

  • Quantify today’s success and uncover usability, design, architecture, copy, product, advertising, pricing, and marketing optimizations that will breed even more success tomorrow
  • Web analytics isn’t:
    • WA is not the measurement of something
    • WA is not defining success but translating it
    • WA is not Omniture, Google Analytics or Clicktracks
  • Web analytics answers the following questions:
    1. Who is coming to my web site?
    2. What are they trying to do?
    3. What is the gap between what they are doing and the ideal?
    4. What are some concrete ways we can close the gaps?
    5. How can we get more of these people?
  • These questions should be answered in the context of growth and profitability
  • Analysts shouldn’t become married to one discipline; otherwise they lose the big picture
  • Analysts are central, and recommendations are driven by company impact, not by personal impact
  • Even if you cannot solve a problem by yourself, you have still uncovered an important problem

Three enormous wastes of your web analytics time

  1. Analytics isn’t implemented during the dev process but afterwards
  2. You obsess over an exact unique-visitor count
  3. You are trying to match numbers from two different tools: trends, not accounting

3.5 things that keep you from finding good web analytics people

  • 1: Good web analysts may already be in your company
  • 2: A lot of experienced WAs are actually report writers
  • 3: Your interview process prevents you from hiring good people: if you fear change, or fear that your flaws will be revealed and the applicant is capable, then you probably won’t hire them
  • 3.5: Your salary is too low: increasing your conversion rate by 0.3% can mean hundreds of thousands of dollars of additional revenue per month
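
The arithmetic behind that last point, with hypothetical traffic and order values, reading “0.3%” as 0.3 percentage points of conversion rate:

```python
monthly_visits = 1_000_000   # hypothetical
avg_order_value = 80.0       # hypothetical
conversion_lift = 0.003      # +0.3 percentage points of conversion

extra_orders = monthly_visits * conversion_lift    # ~3,000 extra orders
extra_revenue = extra_orders * avg_order_value
print(round(extra_revenue))  # -> 240000, i.e. six figures per month
```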

Web analytics sucks, and it’s nobody’s fault

This is a handmade description for yet another propellerhead analyst who will sit around and run reports for people, get in arguments with other people (or those same people), “agree to disagree” with other departments, and will eventually call everyone else an idiot and will recede into their cave before ultimately quitting for a director-level position at a different, big, resume-enhancing company where the process will repeat itself.

It’s not their fault, because a good position for a web analytics person does not exist in the companies that can use these people most. The bigger the company, the more important a small difference becomes. For a site with 10,000 visits a month, an analytics person would have to improve conversion by double-digit percentages to scarcely pay for themselves. For Wal-Mart, moving the conversion needle a tenth of a percent probably pays their lifetime salary in a week.

The effective web analytics person knows usability, they know some design, they know information architecture, they know HTML, they are good communicators and can thusly write good web copy, and ultimately they are businesspeople who realize the purpose behind all of these crafts is cash flow […] Rather than being careful, politically aware employees, effective analytics people are data-driven, quickdraw decision makers because they have two key assets:

1. Cold, hard facts in the form of data (and I don’t mean just Omniture data)
2. The ability to not have to decide: they can TEST

Big companies are ruled by coalitions of opinions, meetings, conference calls, and semi-educated executives. Data is actually a threat. Data is what gets people fired in big companies, not what gets them bonuses. Data is scary.

What are the REAL web analytics tools?

  • Question: How can you improve the long-term cash flow?
  • Where you need a decent degree of competency:
    • Usability
    • Information Architecture
    • SEO
    • Web marketing (PPC, display, email)
    • Social Media
    • Design
    • Copywriting
    • Website technology (HTML, CSS, SQL, JS, PHP/Ruby/Python/whatever)
    • Communication skills
  • Learn business goals -> department goals -> campaign goals -> personal goals

Have you lost faith in web analytics?

  • Make decisions as often as possible – aka fail faster
  • It isn’t about the newest technology – it’s about money
  • Don’t live in a vacuum – interact with different people and viewpoints

The purpose of web (or any) analytics

  • “We talk about being data-driven businesses. But these aren’t businesses built around a culture of measurement. They’re built around a culture of accountability.”
  • “The purpose of web analytics, or any analytics, is to give your organization the confidence needed to accelerate the pace of decisions.”
  • “We’re talking about being accountable to outcomes, not to some Tyrannosaurus on a power trip. That’s a big deal.”
  • “It’s about making big decisions often.” – Iterate, iterate, iterate