
Exclude Robots and Spiders from Your Analysis

One of the major complaints about web server logfiles is that they are often littered with activity from nonhuman user agents (“robots” and “spiders”). While they are not necessarily bad, you need to exclude robots and spiders from your “human” analysis or risk getting dramatically skewed results.
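The exact mechanics depend on your analytics package, but the core step is a user-agent filter applied to the raw logfile. The sketch below, in Python, is a minimal illustration of that idea, assuming an Apache-style "combined" logfile in which the user-agent is the last quoted field on each line; the BOT_SIGNATURES list is purely illustrative, not an exhaustive robot database.

```python
import re

# Illustrative user-agent substrings only; production filtering would rely on
# a maintained robots/spiders database rather than this short list.
BOT_SIGNATURES = ["googlebot", "slurp", "bingbot", "crawler", "spider", "bot"]

# In the Apache "combined" log format the user-agent is the final quoted field.
USER_AGENT_FIELD = re.compile(r'"(?P<agent>[^"]*)"\s*$')

def is_robot(log_line: str) -> bool:
    """Return True if the line's user-agent matches a known bot signature."""
    match = USER_AGENT_FIELD.search(log_line)
    if not match:
        return False
    agent = match.group("agent").lower()
    return any(signature in agent for signature in BOT_SIGNATURES)

def human_requests(logfile_path: str):
    """Yield only the log lines that do not appear to be robot or spider traffic."""
    with open(logfile_path) as logfile:
        for line in logfile:
            if not is_robot(line):
                yield line
```

A filter like this is applied before any page-view or visitor counts are calculated, so the remaining "human" analysis is not inflated by crawler activity.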

Robots and spiders (also known as “crawlers” or “agents”) are computer programs that scour the Web to collect information or take measurements. There are thousands of robots and spiders in use on the Web at any time, and their numbers increase every day. Common examples include:

  • Search engine robots that crawl the pages of sites across the Web and feed the information they collect into the indexes of search engines like Google, Yahoo!, or industry-specific engines that search for information such as airfares, flight schedules, or product prices.


PREVIEW

                                                                          

Not a subscriber?

Start A Free Trial


  
  • Creative Edge
  • Create BookmarkCreate Bookmark
  • Create Note or TagCreate Note or Tag
  • PrintPrint