Data Requirements: Price monitoring relies on web scrapers that fetch data from product pages at a predefined frequency. Programming the web crawler is a technically demanding task: it requires coding skills and continuous maintenance, since websites can change their structure at any time.
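The "predefined frequency" above can be implemented in many ways (cron jobs, task queues, or a simple in-process scheduler). As a minimal sketch, the following uses Python's standard-library `sched` module; the `poll_prices` and `fetch` names are illustrative assumptions, not part of any specific product.

```python
import sched
import time

def poll_prices(fetch, interval_s, runs):
    """Run `fetch` (any callable that scrapes one round of product
    pages) at a fixed interval. `runs` bounds the loop for the demo;
    a production poller would run indefinitely."""
    scheduler = sched.scheduler(time.monotonic, time.sleep)
    for i in range(runs):
        # Schedule each round at a fixed offset from now.
        scheduler.enter(i * interval_s, 1, fetch)
    scheduler.run()

# Demo: record a timestamp instead of actually scraping.
calls = []
poll_prices(lambda: calls.append(time.monotonic()), interval_s=0.01, runs=3)
print(len(calls))  # 3
```

In practice a cron entry or a scheduler like Airflow is more robust than an in-process loop, but the control flow is the same: one scrape round per tick.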
Web Crawler Setup: The source pages are first inspected to identify the HTML tags that enclose the required data points. Once identified, these tags (or selectors) are coded into the crawler. When the crawling setup is deployed, it starts extracting the data and saving it to a dump file.
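To illustrate the idea of coding the identified tags into the crawler, here is a minimal sketch using Python's standard-library `html.parser`. The `price` class name and the sample HTML are assumptions for the example; a real crawler would fetch live pages and typically use a library such as BeautifulSoup or Scrapy.

```python
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Collect the text inside any tag whose class attribute
    matches the configured class (e.g. the tag found to enclose
    the price during page inspection)."""

    def __init__(self, target_class):
        super().__init__()
        self.target_class = target_class
        self._capturing = False
        self.values = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "").split()
        if self.target_class in classes:
            self._capturing = True

    def handle_endtag(self, tag):
        self._capturing = False

    def handle_data(self, data):
        if self._capturing and data.strip():
            self.values.append(data.strip())

# Sample page fragment standing in for a fetched product page.
page = '<div class="product"><span class="price">$49.99</span></div>'
extractor = PriceExtractor("price")
extractor.feed(page)
print(extractor.values)  # ['$49.99']
```

If the site changes the enclosing tag or class, only the configured selector needs updating, which is exactly the maintenance burden described above.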
Data Normalisation and Data Delivery: The dump file is then run through a cleansing and structuring step to produce clean data. This step removes the unnecessary HTML tags and text that were scraped along with the required data points.
It also gives the data a proper structure so that it can be analysed further or stored in a database. Once fully configured, the crawling setup can deliver pricing data feeds at the specified intervals.
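The cleansing and structuring step can be sketched as a small normalisation function. The field names (`name`, `price`, `currency`) and the dollar-only currency check are illustrative assumptions; a real pipeline would handle more currencies, locales, and edge cases.

```python
import re
from decimal import Decimal

def normalise_record(raw):
    """Turn one raw scraped record into a clean, structured row
    ready for analysis or database storage."""
    # Strip any leftover HTML tags that were scraped with the price.
    price_text = re.sub(r"<[^>]+>", "", raw["price"])
    # Pull out the numeric part and drop thousands separators.
    match = re.search(r"[\d.,]+", price_text)
    amount = Decimal(match.group().replace(",", "")) if match else None
    return {
        "product": raw["name"].strip(),
        "price": amount,
        "currency": "USD" if "$" in price_text else None,
    }

# A raw record as it might land in the dump file.
raw = {"name": "  Acme Widget \n", "price": "<span>$1,299.00</span>"}
row = normalise_record(raw)
print(row)  # {'product': 'Acme Widget', 'price': Decimal('1299.00'), 'currency': 'USD'}
```

Using `Decimal` rather than `float` avoids rounding surprises when prices are later aggregated or compared across feeds.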