Set proper headers for crawler to prevent cached HTML


Hello everyone, I am building a small web crawler that fetches news from some websites. I am using Typhoeus.

My code is like this:

require 'typhoeus'
require 'nokogiri'

hydra = Typhoeus::Hydra.new

request = Typhoeus::Request.new(url, timeout: 60)
request.on_complete do |response|
  doc = Nokogiri::HTML(response.body)
  root_url = source.website.url            # source is defined elsewhere in my app
  links = doc.css(css_selectors).take(20)  # css_selectors comes from my config
end
hydra.queue(request)
hydra.run

The problem is that some websites return a cached, old version of the page. I tried setting the headers and included "Cache-Control" => 'no-cache', but that didn't help! Any help will be appreciated.
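For reference, here is roughly how I set the header, via Typhoeus' headers: option on the request (url is the same variable as above):

request = Typhoeus::Request.new(
  url,
  timeout: 60,
  headers: { "Cache-Control" => "no-cache" }  # header the sites seem to ignore
)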

The same thing happens when using open-uri.
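With open-uri it is the same idea, since string-keyed options are forwarded as request headers; a minimal sketch of what I tried there:

require 'open-uri'
require 'nokogiri'

# open-uri sends string-keyed options as request headers
html = open(url, "Cache-Control" => "no-cache").read
doc  = Nokogiri::HTML(html)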

One of the websites' response headers:

{"Server"=>"nginx/1.10.2", "Date"=>"Sat, 07 Jan 2017 12:43:54 GMT", "Content-Type"=>"text/html; charset=utf-8", "Transfer-Encoding"=>"chunked", "Connection"=>"keep-alive", "X-Drupal-Cache"=>"MISS", "X-Content-Type-Options"=>"nosniff", "Etag"=>"\"1483786108-1\"", "Content-Language"=>"ar", "Link"=>"</taxonomy/term/1>; rel=\"shortlink\",</Actualit%C3%A9s>; rel=\"canonical\"", "X-Generator"=>"Drupal 7 (http://drupal.org)", "Cache-Control"=>"public, max-age=0", "Expires"=>"Sun, 19 Nov 1978 05:00:00 GMT", "Vary"=>"Cookie,Accept-Encoding", "Last-Modified"=>"Sat, 07 Jan 2017 10:48:28 GMT", "X-Cacheable"=>"YES", "X-Served-From-Cache"=>"Yes"}

1 Answer

IAmCoder:

This should work:

"Cache-Control" => 'no-cache, no-store, must-revalidate'
"Pragma" => 'no-cache'
"Expires" => '0'