Being able to see what your customers are doing and buying on your website is a wonderful thing: it lets you improve the user experience and make accurate, data-driven decisions about where to spend your hard-earned cash. The problem is that those decisions are only as accurate as the data you record.
Maybe you've spotted strange trends or fluctuations that you are finding hard to explain. Rest assured, you're not alone. It's a problem that comes up a lot, and our analytics team at Semetrical has reviewed hundreds of websites to assess the validity of their data.
There's no substitute for a trained eye, but this article seeks to condense some of the experience we've compiled over the years into a useful guide. In it we will outline some of the common red flags that we see, what they might mean, and what you can do about them.
Let's start with a big one: "My sessions have dropped off a cliff". The first question to answer is "by how much?". If the answer is 95-100%, then your tracking is almost certainly broken. You can check this very easily using Google's Tag Assistant plugin. The other possibility, of course, is that your website was down; you should be able to validate that by checking the server log files.
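As a rough illustration, that triage can be expressed as a simple rule of thumb. This is only a sketch with assumed thresholds; the function name and cut-offs are ours, not a GA feature:

```python
def classify_sessions_drop(sessions_before, sessions_after):
    """Rough triage of a sudden drop in sessions (thresholds are assumptions)."""
    if sessions_before == 0:
        return "no baseline to compare against"
    drop = (sessions_before - sessions_after) / sessions_before
    if drop >= 0.95:
        # A 95-100% drop almost always means broken tracking or site downtime
        return "tracking broken or site down"
    if drop > 0:
        return "partial drop - review the channel report"
    return "no drop"
```

For example, `classify_sessions_drop(10000, 150)` would point you straight at broken tracking, whereas a fall from 10,000 to 6,000 sends you to the channel report instead.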
Alternatively: "Traffic has dropped significantly, but not completely". This is a trickier one. Your first port of call is the channel report. Is the drop confined to one channel, or seen equally across all channels?
If it's the latter, then again you've probably partially broken your tracking. Take a look at a comparison view of your landing page report to check whether tracking has broken on a specific area of your website.
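One way to make the "one channel vs all channels" check concrete is to compute each channel's period-on-period change and see whether the moves cluster together. A minimal sketch, assuming you've exported channel totals for the two periods (the 10-point tolerance is an arbitrary choice you'd tune):

```python
def channel_changes(previous, current):
    """Percentage change per channel between two periods (dicts: channel -> sessions)."""
    return {ch: (current.get(ch, 0) - s) / s * 100
            for ch, s in previous.items() if s > 0}

def drop_is_uniform(changes, tolerance=10.0):
    """True if every channel moved within `tolerance` points of the average move."""
    mean = sum(changes.values()) / len(changes)
    return all(abs(c - mean) <= tolerance for c in changes.values())
```

A uniform drop points at tracking; a single-channel drop points at that channel.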
Another wise step is to see whether anything changed on the website at that point. Send a message to whoever manages your website to ask whether there was a release or update around this time; they will likely be able to advise if anything in the release could have impacted tracking.
If the changes are channel-related, however, then the number of potential problems increases. We'll delve into some of the more common reasons for channel fluctuations later.
Sudden increase in all sessions:
Something to celebrate, right? Hopefully so, but it's always best to check before you start shouting about it. Again, first make sure that it's not a channel-specific change. If you see an even increase across all channels, then it's time to look at the UX metrics. Start with the pages per visit report.
If this has reduced significantly, then you've probably got a problem with duplicated sessions. There are a number of ways this can occur; some common ones are internal links tagged with UTM campaign parameters (which restart the session), broken cross-domain tracking, and pages missing the tracking code (causing self-referrals) - all of which split one real visit into multiple sessions.
To identify where the issues are occurring, it can be worth looking at the exit page report to see where there is an influx of people supposedly leaving the website. Also look at the landing page report to see where the increase is coming from.
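The pages-per-visit signal can be checked with a quick calculation: when one real visit is split into several sessions, session counts rise while pages per session falls. A sketch with an assumed cut-off:

```python
def pages_per_session(pageviews, sessions):
    """The ratio shown in GA's pages-per-visit style reports."""
    return pageviews / sessions

def session_duplication_suspected(before_ratio, after_ratio, threshold=0.7):
    """Flag a sharp fall in pages per session (the 30% cut-off is an assumption)."""
    return after_ratio < before_ratio * threshold
```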
Changes in bounce rate:
Firstly, there is no such thing as a good or bad bounce rate.
Whether your bounce rate is high or low, it is a metric that is specific to your website. What you should be interested in is how it changes. Don't worry if your bounce rate is high, but do question why your bounce rate is higher than it was last week.
Bounce rate is not just influenced by which pages people visit on the website; it is also affected by event hits. So before worrying that you've broken your site, check whether any changes to the tracking were released around the same time.
A common problem is people forgetting to set events as non-interactive when they are released. You then get situations where events like scroll behaviour come through as interactive events and cause the bounce rate to plummet.
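To see why one interactive event makes such a difference, here is a toy bounce-rate calculation. Each session is a list of hits tagged as interactive or not, and a bounce is a session with no interaction beyond the initial pageview; this is a simplified model of GA's logic, not its actual implementation:

```python
def bounce_rate(sessions):
    """sessions: list of sessions, each a list of (hit_type, is_interactive) hits."""
    bounces = sum(
        1 for hits in sessions
        if sum(1 for _, interactive in hits if interactive) <= 1
    )
    return bounces / len(sessions)
```

Mark a scroll event interactive and every scrolled single-page visit stops counting as a bounce, so the rate plummets even though user behaviour hasn't changed.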
Spikes in direct traffic:
Direct traffic is the result of Google Analytics being unable to identify the source of the traffic coming to your website. For that reason it can actually represent a range of different scenarios.
The more common sources are bots and dark social (encrypted social channels like WhatsApp). Perhaps the most common scenario we see is a large spike in direct traffic on a particular day. Nine times out of ten this will be the result of someone crawling your website.
Whilst GA has the ability to filter out certain known crawlers, it does not catch them all, and you will often see a large spike in sessions as a result.
There are many telltale signs of bot traffic and a number of ways to identify it. One method we've found quite reliable is to segment down to direct traffic, then compare the traffic to the previous day at city level.
Servers that run crawlers tend to be based in out-of-the-way locations for cost efficiency, so if you see that you've got 10,000 visitors from a tiny town in America, you can be pretty sure you've been crawled. Once you've identified the town, you can create a segment for it to remove it from your top-line reporting.
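That city-level comparison is easy to automate once you've exported the direct-traffic city report for two days. A sketch - the thresholds are assumptions you would tune to your own traffic levels:

```python
def suspicious_cities(yesterday, today, min_sessions=1000, spike_ratio=10):
    """Cities whose direct sessions jumped far beyond their previous-day baseline."""
    flagged = []
    for city, sessions in today.items():
        baseline = yesterday.get(city, 0)
        # A city with real volume today but almost none yesterday looks like a crawler
        if sessions >= min_sessions and sessions >= max(baseline, 1) * spike_ratio:
            flagged.append(city)
    return flagged
```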
A spike in referral traffic:
The first thing to do here is to look at the source of the traffic. It could be that you've gained a link from a popular website, in which case congratulations to the SEO team.
The problem to look out for, however, is the referral source being your own website. In this scenario one of two issues is likely occurring: either cross-domain tracking has been set up incorrectly, or you are missing tracking on a specific page.
The best way to tell whether this is the case is to add a secondary dimension of "Full Referrer" to the referral source report. This will let you see the pages that are driving the traffic; you can then manually check each page to make sure it has tracking on it. Again, the Tag Assistant Chrome plugin is an easy way to check whether a page has tracking on it.
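If you export the Full Referrer values, the self-referral check itself is a one-liner: flag any referrer whose hostname belongs to your own site. A sketch using Python's standard library (`example.com` is a placeholder domain):

```python
from urllib.parse import urlparse

def self_referrals(full_referrers, own_hostnames):
    """Return the referrer URLs that point back at your own domains."""
    return [url for url in full_referrers
            if urlparse(url).hostname in own_hostnames]
```

Any URL this returns is a page worth checking manually for missing or broken tracking.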
If the issue appears to be related to a subdomain, then it's worth reviewing the Google Analytics installation documentation around this area.
Fluctuations in organic traffic:
Organic traffic can fluctuate significantly depending on your rankings within Google. These rankings are controlled by algorithms that process a multitude of factors to dictate how relevant your website is to what someone is searching for.
Occasionally Google will make updates to this algorithm that can mean certain factors are judged as more or less influential (MozCast can be a good way to identify algorithm updates).
This has the knock-on effect of causing some websites to rank higher and some to rank lower. It's also worth noting that the impact of these changes can be at page level rather than site level.
Therefore the first step in analysis is to look at the organic landing page report. Compare a date range before the dip to one after the dip and see if there are certain pages that have dropped off.
The next step is to compare this same report to a Search Console report. Search Console and GA will always differ slightly in their numbers, so what we are looking for here is that the trends match: do you see the same kind of changes in GA as you do in Search Console?
If yes, then you are likely experiencing fluctuations in rankings and it's worth speaking to your SEO team. If not, then there is likely something wrong with your tracking, and you are best off checking the landing pages to verify that the tracking code is implemented correctly.
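Comparing the two trends can be done loosely by checking that the period-on-period relative changes in GA sessions and Search Console clicks move together. A sketch - the 15-point tolerance is an assumption, and the absolute numbers are expected to differ:

```python
def trends_match(ga_series, gsc_series, tolerance=0.15):
    """True if the relative changes of two traffic series track each other."""
    for (g0, g1), (s0, s1) in zip(zip(ga_series, ga_series[1:]),
                                  zip(gsc_series, gsc_series[1:])):
        if g0 == 0 or s0 == 0:
            continue  # no baseline for this step
        if abs((g1 - g0) / g0 - (s1 - s0) / s0) > tolerance:
            return False
    return True
```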
This article touches on a number of different potential issues that could impact your data, and we'll go into more detail on some of these in follow-up posts.
However, if you feel any of these scenarios resonate with your own data then please do get in touch. It’s often very hard to tell whether you can rely on the numbers you are reporting, so if you’d like a second opinion we’re here to help.