11 Common Analytics Pitfalls to Watch Out For


The following is a short extract from our book, Researching UX: Analytics, written by Luke Hay. It's the ultimate guide to using analytics for improved user experience. SitePoint Premium members get access with their membership, or you can buy a copy in stores worldwide.

When you first start analyzing data, it’s easy to make mistakes, particularly if you’re new to analytics. Don’t let that put you off, though! This section lists some of the main pitfalls, and how they’re best avoided—to ensure your analysis paints a true picture of user behavior.

Confusing Visits and Views

Different analytics tools will use different terminology to describe the same thing. For rookie analysts, this can cause confusion, and can mean that the wrong data is reported. Even within the same tool, terminology can be confusing. One of the most common mistakes people make is to confuse visits and views.

A visit (now known as a session in Google Analytics) generally describes a group of interactions a single user takes on your website within a given time frame. A view (or "pageview" in some tools) is recorded each time a page on your site is loaded and tracked by the analytics tracking code.
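To make the distinction concrete, here's a minimal Python sketch that derives both metrics from the same raw hit log. It assumes Google Analytics' default 30-minute inactivity timeout for starting a new session; the log data itself is made up for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical raw hit log: each entry is (user_id, timestamp of a tracked pageview)
hits = [
    ("user_a", datetime(2023, 5, 1, 9, 0)),
    ("user_a", datetime(2023, 5, 1, 9, 10)),
    ("user_a", datetime(2023, 5, 1, 14, 0)),  # > 30 min gap: starts a new session
    ("user_b", datetime(2023, 5, 1, 9, 5)),
]

SESSION_TIMEOUT = timedelta(minutes=30)  # Google Analytics' default inactivity timeout

# Every tracked hit counts as a view
pageviews = len(hits)

# A new session starts when a user's gap since their last hit exceeds the timeout
sessions = 0
last_seen = {}
for user, ts in sorted(hits, key=lambda h: (h[0], h[1])):
    if user not in last_seen or ts - last_seen[user] > SESSION_TIMEOUT:
        sessions += 1
    last_seen[user] = ts

print(f"Pageviews: {pageviews}, Sessions: {sessions}")  # Pageviews: 4, Sessions: 3
```

The same four hits yield three sessions here, which is exactly why reporting one number under the other's name skews your reports.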

These are two entirely different things, but visits and views are sometimes used interchangeably when people talk about their analytics. As you can imagine, this can cause problems for analysts, as reports will become inaccurate. Make sure you understand the terminology, so that you know what you’re reporting on. (See the Google Analytics glossary at the end of this book if you’re unsure.)

Obsessing over Visits and Views

When it comes to analyzing your data, you need to make sure you’re analyzing the most important areas. A very common mistake people make is to focus purely on visits and views. Because you’re a UXer, I know I don’t need to convince you that there’s more to a website than just a lot of people visiting it! You may still find yourself under pressure, though, to increase page views or even visits. Leave this side of things to marketers, and focus your efforts on the numbers that relate to user experience.

Getting Drawn into the Numbers

Quantitative data is all about numbers. If your account is set up correctly, the numbers don’t lie! Despite this, you need to make sure you don’t forget what the numbers actually represent: real users.

As stated previously, the numbers will tell you what happened, not why it happened, so it's important to keep asking that question. You'll need to look beyond the numbers and consider their context. Make sure you don't fall into the trap of just reporting what has happened: consider the bigger picture and think about what the numbers mean for the user experience of your website.

This is where you’ll need to bring in the qualitative methods we touched on previously. You can often use analytics to find a problem, and user research methods to solve it.

Thinking Low Numbers Are Always Bad

One side effect of getting drawn into the numbers is that you automatically consider low numbers, or a drop in numbers, to be bad. While a drop in purchases is likely to be a bad thing, a reduction in the time users spend on particular pages, for example, could be good or bad.

If you’ve redesigned the home page on a website and the time people are spending on it drops, this could be due to the improved efficiency of your design. It may be that people are able to navigate more quickly to areas of interest to them. Once again, context is key here. Work out what any drops actually mean for the website as a whole, rather than assuming they’re always going to be negative.

Confusing Correlation with Causation

Just because something happens in your analytics at the same time as you make a change to the website doesn't mean the two are connected. If you notice a shift in your data after making a change, you need to rule out coincidence before concluding that your change caused it.

You’re likely to have to delve a little deeper into your reports to prove that the rise in conversion rate was due to your great new design. This is covered in more detail in Chapter 6, but it’s something you should be aware of before you take credit (or blame!) for any sizable shifts in your reporting data.

The graph below, taken from tylervigen.com, shows a correlation of close to 95% between cheese consumption and the number of people who died by becoming tangled in their bedsheets:

[Graph: correlation between cheese consumption and deaths by becoming entangled in bedsheets]

There’s also a strong correlation between ice cream sales and drownings at sea, as both go up in the summer. Only an analyst severely lacking common sense would say that ice cream causes drowning, though!
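To see how a hidden confounder produces this effect, here's a small Python sketch with made-up numbers, where temperature drives both series. The correlation comes out near-perfect even though neither metric causes the other:

```python
import math

# Made-up monthly figures: both series are driven by temperature (the confounder)
temperature = [5, 8, 12, 17, 22, 27, 30, 29, 24, 18, 11, 6]   # degrees C
ice_cream_sales = [t * 40 + 100 for t in temperature]          # rises with heat
drownings = [t * 2 + 3 for t in temperature]                   # also rises with heat

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Near-perfect correlation, yet ice cream does not cause drowning:
# both series are linear functions of temperature by construction
print(round(pearson(ice_cream_sales, drownings), 3))  # 1.0
```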

The correlation versus causation issue is probably the most common mistake I see people make when analyzing data. When it comes to website analytics, one example of this might be where data shows that people who use site search convert 50% more than those who don't. This could convince UXers to encourage more people to use the site search. However, the more likely explanation is that people who use the site search are a more engaged audience than the average user, and also have a better idea of what they're looking for, meaning that they naturally convert at higher rates.

Combining quant and qual (and sometimes your own common sense) will help ensure you don’t fall into the trap of confusing correlation and causation. Split testing is also a great way to determine true causation, and will help to protect against drawing incorrect conclusions from your data. We’ll cover split testing more in Chapter 6.

Grouping All Visits Together

As UXers, we know that different people use websites in different ways. We also know that the same person is likely to use a website differently when using different devices, or even using the same website at different times of the day. We need to include these considerations of user behavior in our quantitative analysis.
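As a minimal illustration of why segmenting matters, the Python sketch below (with hypothetical session records) splits sessions by device category before computing conversion rates. The blended figure hides a large gap between segments:

```python
from collections import defaultdict

# Hypothetical session records: (device_category, converted?)
sessions = [
    ("desktop", True), ("desktop", True), ("desktop", False), ("desktop", True),
    ("mobile", False), ("mobile", False), ("mobile", True), ("mobile", False),
]

# The blended rate looks unremarkable on its own
overall = sum(converted for _, converted in sessions) / len(sessions)
print(f"Overall conversion: {overall:.0%}")  # Overall conversion: 50%

# Per-segment rates reveal desktop converting far better than mobile
by_device = defaultdict(list)
for device, converted in sessions:
    by_device[device].append(converted)

for device, outcomes in by_device.items():
    print(f"{device}: {sum(outcomes) / len(outcomes):.0%}")
# desktop: 75%
# mobile: 25%
```

A headline figure of 50% would mask the fact that mobile users are struggling; segmenting first is what surfaces the UX problem worth investigating.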
