Offering: Flight and hotel booking services
Context: The company started aggressively blocking bots by using bot detection services
The travel website wanted to keep out malicious automated crawlers, so it implemented a system that identified bot traffic and prevented it from accessing the site's content. The motive was to improve security and to stop scraping by competitors.
Bot identification factors:
Bots are usually identified by characteristics specific to them, such as:
1. The user agent string
2. High-frequency hits
3. An identifiable pattern in the requests
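The three factors above can be combined into a very simple detector. The sketch below is illustrative only, with hypothetical thresholds (60 requests per minute, a 0.05-second timing tolerance) that are not from the case study:

```python
import time
from collections import defaultdict, deque

# Hypothetical values -- chosen for illustration, not taken from the case study.
BOT_UA_KEYWORDS = ("bot", "crawler", "spider")
MAX_REQUESTS_PER_MINUTE = 60

class NaiveBotDetector:
    """Flags traffic using the three factors above: the user agent string,
    request frequency, and regularity of request timing."""

    def __init__(self):
        # client IP -> timestamps of its recent requests
        self.hits = defaultdict(deque)

    def is_bot(self, ip, user_agent, now=None):
        now = time.time() if now is None else now

        # 1. User agent string: self-identifying crawlers
        if any(k in user_agent.lower() for k in BOT_UA_KEYWORDS):
            return True

        # 2. High-frequency hits: too many requests in the last minute
        window = self.hits[ip]
        window.append(now)
        while window and now - window[0] > 60:
            window.popleft()
        if len(window) > MAX_REQUESTS_PER_MINUTE:
            return True

        # 3. Identifiable pattern: near-constant gaps between requests
        if len(window) >= 5:
            stamps = list(window)
            gaps = [b - a for a, b in zip(stamps, stamps[1:])]
            if max(gaps) - min(gaps) < 0.05:  # suspiciously regular timing
                return True

        return False
```

Note that the first check only catches bots honest enough to identify themselves, which is one reason a system like this tends to block the good bots first.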
After the bot-blocking system went live, the company blocked about 80% of the bots visiting its site. Most of the blocked bots, however, were good ones that do not crawl aggressively: popular search engines, blog directories, business listing sites, website statistics services, and other crawlers belonging to companies offering search engine optimization services.
The problems faced:
- The website’s growth became stagnant because of reduced exposure
- Traffic dropped significantly
- Bad bots still managed to overcome the blocking
- Search engine visibility declined
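One reason bad bots still managed to overcome the blocking is that the most common detection signal, the user agent string, is trivial to spoof. A minimal sketch using Python's standard urllib (the header values and URL are hypothetical):

```python
import urllib.request

# A scraper can claim to be an ordinary browser simply by setting the
# User-Agent header; the value below mimics a desktop Chrome browser.
SPOOFED_HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0.0.0 Safari/537.36"
    ),
    "Accept-Language": "en-US,en;q=0.9",
}

def build_disguised_request(url):
    """Return a urllib Request that no longer self-identifies as a bot."""
    return urllib.request.Request(url, headers=SPOOFED_HEADERS)

# Hypothetical target URL for illustration; the request is not sent here.
req = build_disguised_request("https://example.com/flights")
```

Defeating the other two signals is only slightly harder: rotating source IPs dilutes per-IP frequency counts, and adding random delays between requests removes the regular timing pattern.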
Disadvantages of blocking bots:
1. Drop in the website exposure
2. Lost backlink opportunities
3. Unnecessary cost to the company
4. Negative impact on SEO
5. Not effective in preventing bad bots
6. Revenue loss
Bots are an integral part of the World Wide Web, and blocking them can do more harm than good. Bot-blocking mechanisms tend to catch the good bots that contribute to a website's exposure, while the bad bots use advanced techniques to evade blocking.