I have to write a scraper for class, and I wanted to put tests on it to make sure it actually works.
In the event that the website I am scraping changes its layout, I want our website to detect the failure and disable the feature until we update the scraper.
Assuming I use unit tests to test the scraper, are there tools that will run those tests every X minutes and disable feature Y if the scraper is broken?
I guess what I am asking is: what tools can I use to make my application more robust, so my users don't see weird error messages if my scraper breaks?
You should also save the parsed content locally. If the remote server is down, you can fall back on the saved content instead of throwing your ScraperParseThingyException.
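A minimal sketch of that fallback, assuming the scraping itself is done by some fetch_and_parse callable that returns JSON-serializable data (the function and the cache path are hypothetical names, not anything your code already has):

```python
import json
from pathlib import Path

CACHE_FILE = Path("scraper_cache.json")  # hypothetical location for the last good result

def get_content(fetch_and_parse):
    """Try a live scrape; on any failure, fall back to the last saved result."""
    try:
        result = fetch_and_parse()                   # may raise on network or parse errors
        CACHE_FILE.write_text(json.dumps(result))    # remember the last good result
        return result
    except Exception:
        if CACHE_FILE.exists():
            # Remote site down or markup changed: serve the saved copy instead
            return json.loads(CACHE_FILE.read_text())
        raise  # nothing cached yet, so let the caller handle the error
```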
There are numerous ways you can monitor this. The easiest thing would be to keep an eye on the error log once in a while.
A small scraper facade, sketched below, lets you configure its internal error handling; here it automatically disables itself and sends a notification in case an error happens.
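A minimal sketch of such a facade, assuming the real scraper exposes a scrape() method and that the notification mechanism (log entry, email, whatever you use) is injected; all names here are hypothetical:

```python
import logging

logger = logging.getLogger("scraper")

class ScraperFacade:
    """Wraps the real scraper and turns errors into 'disable + notify' behaviour."""

    def __init__(self, scraper, notify=None):
        self.scraper = scraper                       # assumed to expose a scrape() method
        self.notify = notify or (lambda msg: logger.error(msg))
        self.enabled = True

    def scrape(self):
        if not self.enabled:
            return []                                # feature is off: empty result, no error
        try:
            return self.scraper.scrape()
        except Exception as exc:
            self.enabled = False                     # disable until someone fixes the scraper
            self.notify("Scraper failed and was disabled: %s" % exc)
            return []
```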
Because it is then automatically disabled, your site simply gets back an empty result and does not need to care any further until you have fixed the problem.
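Used from the site, that might look like the following (RealScraper, send_admin_email, render_feature and hide_feature are placeholders for whatever your application actually has):

```python
# ScraperFacade comes from the sketch above; the other names are placeholders.
facade = ScraperFacade(RealScraper(), notify=send_admin_email)

items = facade.scrape()      # [] once the scraper has failed and disabled itself
if items:
    render_feature(items)
else:
    hide_feature()           # or show a "temporarily unavailable" notice instead
```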
Additionally, you can use it in your test cases with a test configuration (e.g. throwing an exception and asserting on it instead of sending an email and disabling on error), so that you already see breakage in your tests, especially while developing and maintaining the component.
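For example, a test configuration could swap in a notifier that raises, assuming the ScraperFacade from the sketch above; run against the live site, this is the kind of check you could also run on a schedule:

```python
import unittest
# Assumes ScraperFacade from the sketch above and a RealScraper class in your project.

class RaisingNotifier:
    """Test-only notifier: raise instead of emailing, so failures show up in the test run."""
    def __call__(self, msg):
        raise AssertionError(msg)

class ScraperFacadeTest(unittest.TestCase):
    def test_live_site_still_parses(self):
        facade = ScraperFacade(RealScraper(), notify=RaisingNotifier())
        items = facade.scrape()
        # An empty result here usually means the site's markup has changed.
        self.assertTrue(items, "scraper returned nothing - did the site change?")

if __name__ == "__main__":
    unittest.main()
```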