- What is natural and unnatural traffic?
Natural traffic is generated by real users of your website. Unnatural traffic, in contrast, is generated, for instance, by bots and crawlers, which systematically examine your website's content to collect information, e.g., for search engines.
Unnatural traffic should not be included in your analysis data, as it would otherwise distort your results.
Webtrekk automatically excludes all known bots and crawlers. Nevertheless, it is still possible that unknown bots and crawlers access your website. In the next steps, we will try to identify them.
- Identify unnatural traffic
A crawler typically requests many pages within a short period of time. In Webtrekk, you can find out how many page impressions and how many visits a single user generated.
Create the following analysis:
All Ever IDs of users who had at least one visit in the analysis period are shown (dimension: End device Visitor IDs; this shows the unique visitor ID that was written to the Ever cookie generated by Webtrekk).
URM – Customer Profile Visits: the overall number of visits of a user (over their lifetime)
URM – Customer Profile Page Impressions: the overall number of page impressions of a user (over their lifetime)
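Once you have exported such an analysis, a high ratio of page impressions to visits is a strong indicator of a crawler. The following sketch illustrates this check; the column names and the threshold of 100 page impressions per visit are assumptions for illustration, not Webtrekk defaults.

```python
# Flag Ever IDs whose page impressions per visit are implausibly high.
# The row layout ("ever_id", "visits", "page_impressions") is a
# hypothetical export format, and the threshold is an assumption.
def flag_suspicious(rows, threshold=100.0):
    suspects = []
    for row in rows:
        visits = int(row["visits"])
        impressions = int(row["page_impressions"])
        if visits > 0 and impressions / visits >= threshold:
            suspects.append(row["ever_id"])
    return suspects

rows = [
    {"ever_id": "123", "visits": 2, "page_impressions": 30},
    {"ever_id": "456", "visits": 3, "page_impressions": 4500},
]
print(flag_suspicious(rows))  # only the second ID exceeds the threshold
```

A sensible threshold depends on your site; review the flagged IDs manually before excluding them.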
- Filter on "Natural Traffic"
The users identified in section 2 cannot be deleted from Webtrekk retroactively. Nevertheless, it is possible to exclude their traffic when calculating an analysis using the filter engine.
To do this, you can choose one of the following two options:
A) You filter each analysis and each report using a segment.
B) You can filter the whole login.
For option A, create a segment in "Webtrekk DMP" in the "Visitors" scope that excludes the identified End device Visitor IDs. Afterwards, apply this segment to restrict an analysis to natural traffic.
For option B, go to "User Management > Filters" and exclude the identified End device Visitor IDs in the "Visitors" scope.
- Exclude unnatural traffic from tracking for the future
Ideally, unnatural traffic is excluded already at the time of data collection. You can do this in Webtrekk by excluding IP addresses or the domain the access came from (referrer). As crawlers usually do not send a referrer domain, in this example we focus on excluding IP addresses.
To ensure data privacy, IP addresses are not stored in Webtrekk. To identify IP addresses from which the website was accessed conspicuously often, you have to analyze the server log files. There, each access to the website is logged, usually including the IP address.
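Counting requests per IP address in a log file can be done with a few lines of scripting. The sketch below assumes the common Apache/Nginx access-log layout, where the client IP is the first field of each line; adapt the pattern to your server's actual log format.

```python
import re
from collections import Counter

# The client IP is the first whitespace-delimited field in the common
# Apache/Nginx log formats; this pattern is an assumption about your logs.
LOG_LINE = re.compile(r'^(\S+) ')

def top_ips(lines, limit=10):
    """Return the most frequent client IPs with their request counts."""
    counts = Counter()
    for line in lines:
        match = LOG_LINE.match(line)
        if match:
            counts[match.group(1)] += 1
    return counts.most_common(limit)

sample = [
    '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512',
    '203.0.113.5 - - [10/Oct/2023:13:55:37 +0000] "GET /a HTTP/1.1" 200 512',
    '198.51.100.7 - - [10/Oct/2023:13:55:38 +0000] "GET / HTTP/1.1" 200 512',
]
print(top_ips(sample))
```

In practice you would read the lines from the log file and also look at the user-agent field, since many crawlers identify themselves there.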
Add the identified IP addresses to the exclusion list in Webtrekk Q3 at "Configuration > System Configuration > Data Collection". Webtrekk also supports entering IP ranges.
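Before entering a range, it can help to verify which addresses it actually covers. This is a local helper sketch using Python's standard `ipaddress` module, not a Webtrekk feature; the CIDR range shown is a documentation example address.

```python
import ipaddress

# Hypothetical exclusion list in CIDR notation; one /24 entry covers
# all 256 addresses of that range.
excluded = [ipaddress.ip_network("203.0.113.0/24")]

def is_excluded(ip):
    """Check whether an IP address falls into any excluded range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in excluded)

print(is_excluded("203.0.113.42"))  # True
print(is_excluded("198.51.100.1"))  # False
```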