* Back off RSS requests if a URL repeatedly fails (see the backoff sketch after this list).
* Increase max backoff time to a day
* Add backoff for failing feeds.
* Remove unused `finally`
* Add `this.feedLastBackoff`
* Rewrite in Rust.
* Linting
* Pop
* Optimise backoff function further
* Drop only!
* Fix test
* Lint
* Lint further
* Better comments
* Fix urls calculation
* Remove testing URL
* Add some variance to speed up the while loop
* Correct comment
* Follow the advice and use a `VecDeque`, as it's slightly faster (see the queue sketch after this list).
* Vastly better shuffle method
* Speed up checking for previous GUIDs (see the GUID hashing sketch after this list).
* Fix hasher function
* Lint
* Content doesn't need to be calculated twice.
* Slightly more efficient iteration
* Improve performance of backoff insertion
* Configure feed reader
* Lint
* Ensure appending to and removing from the queue work as expected.
* Ensure we do keep URLs that have been removed.
* Lint
* Increment/decrement metrics as queue items are added/deleted.
* Add comment
* Tidy up
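
Several of the entries above (per-URL backoff, the one-day cap, `this.feedLastBackoff`, and the added variance) describe exponential backoff with jitter. The sketch below only illustrates that general shape in Rust and is not the project's actual code; the `FeedBackoff` type, its field names, and the constants are assumptions.

```rust
use std::collections::HashMap;
use std::time::{SystemTime, UNIX_EPOCH};

use rand::Rng;

/// Illustrative constants: start at five seconds, cap at one day.
const BACKOFF_BASE_MS: u64 = 5_000;
const BACKOFF_MAX_MS: u64 = 24 * 60 * 60 * 1_000;

/// Per-URL backoff state (names are assumptions, not the real fields).
#[derive(Default)]
struct FeedBackoff {
    /// Last backoff duration applied to each URL, in milliseconds.
    last_backoff: HashMap<String, u64>,
    /// Timestamp (ms since the Unix epoch) before which a URL is skipped.
    backoff_until: HashMap<String, u64>,
}

impl FeedBackoff {
    fn now_ms() -> u64 {
        SystemTime::now()
            .duration_since(UNIX_EPOCH)
            .expect("clock before Unix epoch")
            .as_millis() as u64
    }

    /// Record a failure: double the previous delay, add a little random
    /// variance so failing feeds don't all wake at once, cap at one day.
    fn on_failure(&mut self, url: &str) {
        let previous = self.last_backoff.get(url).copied().unwrap_or(0);
        let mut next = (previous * 2).max(BACKOFF_BASE_MS);
        next += rand::thread_rng().gen_range(0..=next / 10);
        next = next.min(BACKOFF_MAX_MS);
        self.last_backoff.insert(url.to_owned(), next);
        self.backoff_until.insert(url.to_owned(), Self::now_ms() + next);
    }

    /// Clear the backoff once a URL succeeds again.
    fn on_success(&mut self, url: &str) {
        self.last_backoff.remove(url);
        self.backoff_until.remove(url);
    }

    /// Should this URL be skipped on the current poll pass?
    fn is_backed_off(&self, url: &str) -> bool {
        self.backoff_until
            .get(url)
            .map_or(false, |until| *until > Self::now_ms())
    }
}
```

The jitter matters because failing feeds tend to fail together (for example during a network outage); a little randomness stops them all retrying at the same instant.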
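
Several other entries concern the polling queue itself: switching to a `VecDeque`, a better shuffle, and checking that appending and removing behave as expected. Below is a minimal sketch of such a queue; `FeedQueue` and its method names are assumptions, and the shuffle simply delegates to rand's Fisher-Yates implementation (`SliceRandom::shuffle`) rather than reproducing whatever the commit adopted.

```rust
use std::collections::VecDeque;

use rand::seq::SliceRandom;
use rand::thread_rng;

/// Illustrative poll queue: URLs are taken from the front and re-appended
/// at the back once they have been processed.
struct FeedQueue {
    queue: VecDeque<String>,
}

impl FeedQueue {
    fn new() -> Self {
        Self { queue: VecDeque::new() }
    }

    /// Append a newly configured feed URL to the back of the queue.
    fn push(&mut self, url: String) {
        self.queue.push_back(url);
    }

    /// Take the next URL to poll from the front of the queue.
    fn next(&mut self) -> Option<String> {
        self.queue.pop_front()
    }

    /// Drop a URL whose feed has been removed from the configuration.
    fn remove(&mut self, url: &str) {
        self.queue.retain(|u| u.as_str() != url);
    }

    /// Shuffle the whole queue in place so feeds are not always polled in
    /// insertion order (Fisher-Yates via rand's `SliceRandom`).
    fn shuffle(&mut self) {
        self.queue.make_contiguous().shuffle(&mut thread_rng());
    }
}
```

A `VecDeque` is a natural fit for a rotating poll queue because popping from the front and pushing to the back are both O(1), whereas a `Vec` pays O(n) to remove its first element.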
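
The entries about checking previous GUIDs and fixing the hasher suggest that seen item GUIDs are reduced to fixed-size hashes for cheap membership checks. The sketch below shows that shape using the standard library's `DefaultHasher`; the `SeenGuids` type and the choice of hasher are assumptions, and a real implementation would also weigh collision risk and persistence.

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashSet;
use std::hash::{Hash, Hasher};

/// Illustrative store of GUIDs already seen for a feed. Keeping a fixed-size
/// hash instead of the full GUID string makes lookups and storage cheap.
#[derive(Default)]
struct SeenGuids {
    seen: HashSet<u64>,
}

impl SeenGuids {
    /// Reduce a GUID string to a u64 with the standard library hasher.
    fn hash_guid(guid: &str) -> u64 {
        let mut hasher = DefaultHasher::new();
        guid.hash(&mut hasher);
        hasher.finish()
    }

    /// Returns true if the GUID has not been seen before, recording it.
    fn check_and_insert(&mut self, guid: &str) -> bool {
        self.seen.insert(Self::hash_guid(guid))
    }
}

fn main() {
    let mut seen = SeenGuids::default();
    assert!(seen.check_and_insert("urn:uuid:1234")); // first sighting
    assert!(!seen.check_and_insert("urn:uuid:1234")); // already seen
}
```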