Websites that link back to the original source for a piece of information face an inherent problem: keeping those links fresh.
Take the example of a digital classified ad listing company that aggregates ads from multiple sources on the web and links each ad back to its source for the second half of the purchase cycle. Three different things can happen when a particular ad no longer exists on the linked page:
- the source URL returns a 404
- the source URL redirects to a different page (mostly unrelated) with a 302
- the target page simply displays a message saying “OOPS! That page doesn’t exist”
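The three cases above can be told apart once a page has been fetched. The following is a minimal sketch in Python; the function name, the soft-404 marker phrases, and the idea of passing in the post-redirect URL are illustrative assumptions, not a description of any particular crawler's implementation:

```python
# Hypothetical sketch: status_code and body come from whatever HTTP client
# fetched the URL; final_url is the URL after any redirects were followed.

# Example phrases that suggest a "soft 404": a 200 OK page that really
# means the listing is gone. A real checker would tune these per site.
SOFT_404_MARKERS = ("page doesn't exist", "page not found", "no longer available")

def classify(status_code: int, original_url: str, final_url: str, body: str) -> str:
    if status_code == 404:
        return "dead"            # case 1: the source URL returns a hard 404
    if final_url != original_url:
        return "redirected"      # case 2: redirected to a different page
    lowered = body.lower()
    if any(marker in lowered for marker in SOFT_404_MARKERS):
        return "soft-404"        # case 3: 200 OK, but an "OOPS" message
    return "live"                # none of the above: the ad is still there
```

For instance, a page that returns 200 with the body "OOPS! That page doesn't exist" would be classified as `soft-404` rather than `live`, which is exactly the case a plain status-code check misses.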
None of these outcomes is acceptable. It is not a good idea for a consumer-facing company to display stale links: that is simply bad customer experience, and it creates bounce-off points for its audience. To appreciate the problem, imagine you are that website's user, trying to discover rental properties in your area. Just when you have done your research, checked out the images, fallen in love with the price, and clicked the link to contact the broker, an annoying error message awaits you.
It is highly desirable for such consumer-facing companies to proactively check URL validity and remove links from display once they no longer exist. Technically this is a simple problem to solve (for those who work with the related technologies), and it complements the high returns from a user-experience standpoint. PromptCloud supports such customized 404 checkers as part of its crawl offering.
How this works
PromptCloud sets up rules for the various kinds of checks to be performed on those links. The statuses to be returned for each check are decided in consultation with the client. On a daily basis (or as frequently as the client desires), the client uploads its master list of links to be checked for freshness, either to PromptCloud's API or to its own FTP server. PromptCloud's crawlers then fetch the URL pages and interpret the fetched data using the rules in place. The appropriate status message is returned for each link and uploaded back to the API in a format specified by the client. Simple as that!
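The daily loop described above can be sketched as follows. This is an illustrative Python outline, not PromptCloud's actual pipeline: `fetch` stands in for the crawler, `classify` stands in for the rule set agreed with the client, and the CSV report stands in for whatever upload format the client specifies:

```python
import csv
import io

def run_freshness_check(master_list, fetch, classify):
    """Check each URL in the client's master list and build a status report.

    master_list: iterable of URLs uploaded by the client
    fetch:       callable returning (status_code, final_url, body);
                 a stand-in for the real crawler fetch
    classify:    callable mapping a fetched response to a status string;
                 a stand-in for the rules agreed with the client
    """
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["url", "status"])
    for url in master_list:
        status_code, final_url, body = fetch(url)
        writer.writerow([url, classify(status_code, url, final_url, body)])
    # In practice this report would be uploaded back to the API or FTP server.
    return out.getvalue()
```

Separating the fetcher and the classification rules like this is what makes the checks "customized": the same loop can run any client-specific rule set without changes.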
The number of URLs can be in the range of millions, billions, or trillions (not sure if we have got to more than that yet 🙂 ). Checking the freshness of URLs and eliminating stale links from your website is a natural technology edge when it comes to building trust and affinity within a consumer network. The same technique can be extended to receiving alerts on changing prices when maintaining a comparison shopping engine, or to being notified when products go out of stock.
Reach out to learn more about this offering.