You’ve installed Google Analytics on your website and have begun gathering data on how people interact with your site. That’s a great first step, but some additional setup is needed to keep that data clean, because a few common issues can skew it.
Internal Traffic

A lot of people may not realize it at first, but YOU are often skewing the data for your own website. You and other employees of your company likely view pages on your site daily, and the same goes for any external vendors or agencies you work with.
Solution: Create a filtered profile in Google Analytics. To find out which address to exclude, go to Google and search “What is my IP?” The answer box at the top will show your public IP address. Make a note of it, then go back into Google Analytics, create a new profile, and add a filter that excludes that IP.
Be sure to repeat these steps if your business has additional office locations, or if you also want to filter out external vendors or agencies; just have them send you their IPs and repeat the process. Note: always apply filters to a new profile and leave the existing profile unfiltered so that raw data remains available if it is ever needed. Also, filters are not retroactive, so any changes you make only affect data collected from that point forward.
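Google Analytics filters also support a regular-expression match type on the IP Address field, so one filter can cover several offices at once. Here is a minimal sketch in Python of building such a pattern; the IP addresses are placeholders from the documentation range, not real offices:

```python
import re

def ip_filter_pattern(ips):
    """Build a regex that matches any of the given IP addresses.

    Dots are escaped so that "203.0.113.10" matches literally
    rather than "." matching any character.
    """
    return "|".join(re.escape(ip) for ip in ips)

# Placeholder office IPs -- substitute your own
offices = ["203.0.113.10", "198.51.100.25"]
pattern = ip_filter_pattern(offices)
print(pattern)  # 203\.0\.113\.10|198\.51\.100\.25
```

You can then paste the resulting pattern into a single custom exclude filter with the field set to IP Address and the match type set to regular expression.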
Spam Bot Traffic
Small to medium-sized businesses often have another issue: robots skewing their website data behind the scenes. “Spam bots” run scripts that cause the Google Analytics tracking code to fire and log a visit without anyone ever actually visiting the site. This may only happen a few dozen times a day, but multiplied over weeks and months it can add up to a large discrepancy in the number of visits to your site.
How do you know if you’re being hit by bots? Go to the Referrals report in Analytics. The spam is usually easy to spot: the referring domains are sites you’ve never heard of and that have no plausible reason to link to you.
Solution: Create a custom segment in Google Analytics to remove these visits from your reports. For this I usually have the segment builder and the Referrals report open in two separate windows on two separate monitors. In the “Advanced” section of the segment builder, click “Conditions,” then set the options to exclude users where the source contains one of the domains you identified as spam in the Referrals report. Keep adding spam domains by clicking “OR” on the right side of the screen until all of the spam referring domains are included.
Unlike filters, custom segments can be applied to historical data. As for the root of this issue, Google has said they are aware of the issue, but to date, not much has been done to exclude these visits on their end.
Customers vs. Prospects

While existing clients aren’t necessarily hurting your site metrics, you may want to track how new customers or prospects interact with your site versus existing clients. After all, the two groups are likely to browse your site in completely different ways.
Solution: A custom segment is the way to go here as well. One of our clients, for example, wants to focus on the quality of visits from prospects and remove current clients, employees and suppliers from their reports.
In this case, it helps if your site’s URL structure includes a directory that is only accessible to logged-in users. For example, logged-in users on example.com could be directed to example.com/logged-in/home. To remove these visitors from your reports, create a custom segment the same way as above, with a condition that excludes sessions where the page contains /logged-in.
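As a sketch of what that segment does, here is the same exclusion applied to sample session data; the paths and session records are hypothetical:

```python
LOGGED_IN_MARKER = "/logged-in"

def is_client_session(pages):
    """A session belongs to a logged-in client if any page path
    in the session contains the logged-in directory."""
    return any(LOGGED_IN_MARKER in page for page in pages)

sessions = [
    {"user": "prospect", "pages": ["/", "/pricing", "/contact"]},
    {"user": "client", "pages": ["/", "/logged-in/home", "/logged-in/reports"]},
]

# Mirror of the segment condition: exclude sessions where Page contains /logged-in
prospects = [s for s in sessions if not is_client_session(s["pages"])]
print([s["user"] for s in prospects])  # ['prospect']
```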
We have successfully implemented this for a number of our clients, and the new view of the data has been extremely valuable. That is the great part of custom segments: they are much more flexible than filters and let you slice and dice historical data in virtually limitless ways.
What else have you seen skewing your data and what have you done to fix the issue?