Why don't web browsers cache the sites we visit for offline use?


I would love it if the pages I visited were stored locally on my computer, with a set amount of space allocated for this, and once that space was full the oldest pages were deleted. Ideally, if I go to a site like https://golang.org/ and click around the documentation, I should be able to access all of that content when I'm offline again. Why not?
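
To make what I mean concrete, here's a rough sketch (in TypeScript, with made-up names) of the kind of bounded, least-recently-used cache keyed by URL that I'm imagining:

    // Minimal sketch of a bounded, least-recently-used page cache.
    // Class and field names (PageCache, maxBytes) are made up for illustration.
    class PageCache {
      private entries = new Map<string, { body: string; bytes: number }>();
      private usedBytes = 0;

      constructor(private maxBytes: number) {}

      // Store a page; evict the oldest entries once the space budget is exceeded.
      put(url: string, body: string): void {
        const bytes = body.length; // rough size estimate
        if (this.entries.has(url)) this.remove(url);
        this.entries.set(url, { body, bytes });
        this.usedBytes += bytes;
        while (this.usedBytes > this.maxBytes && this.entries.size > 1) {
          // Map preserves insertion order, so the first key is the least
          // recently used page.
          const oldest = this.entries.keys().next().value as string;
          this.remove(oldest);
        }
      }

      // Look up a page and mark it as recently used.
      get(url: string): string | undefined {
        const entry = this.entries.get(url);
        if (!entry) return undefined;
        // Re-insert to move it to the "most recently used" end.
        this.entries.delete(url);
        this.entries.set(url, entry);
        return entry.body;
      }

      private remove(url: string): void {
        const entry = this.entries.get(url);
        if (entry) {
          this.usedBytes -= entry.bytes;
          this.entries.delete(url);
        }
      }
    }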

Ideally I could just go straight to that URL and act as if I had internet. I've gotten close to this with SiteSucker and by editing my hosts file to create a kind of "intranet", but I can never remember all the pages I visit, or capture all of them.
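
For reference, the hosts-file half of that setup is just an entry like the one below, assuming the SiteSucker mirror is served by a web server running locally:

    # /etc/hosts (C:\Windows\System32\drivers\etc\hosts on Windows)
    # Point the domain at the machine serving the mirrored copy;
    # here I'm assuming the mirror is served locally on port 80.
    127.0.0.1    golang.org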

I can understand that, from a security perspective, it might not be best to have cached content served out of sync with the live domain. It seems like an easy way to steal people's Facebook passwords if you had access to the router. If that's the problem, we could prefix the URL the way view-source:google.com works in Chrome and make it something like offline-cache:golang.org, so golang.org might redirect there when we're offline.

Shouldn't browsers be able to handle this? Does this exist already? If not, is there a reason it doesn't?


1 Answer

Answered by ThomasReggi (accepted):

I just found this amazing post by Addy Osmani titled Offline Browsing Secrets.

Chrome has a flag called "Enable Offline Load Stale Button":

When a page fails to load, if a stale copy of the page exists in the browser, a button will be presented to allow the user to load that stale copy. #enable-offline-load-stale-cache

All you need to do is go to about:flags in the address bar and enable it.
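
That flag only surfaces whatever happens to still be in the HTTP cache. If you control the site, you can get the same "fall back to a stale copy when the network fails" behaviour explicitly with a Service Worker and the Cache API. Rough sketch below; the cache name and fallback text are just placeholders, and this is not how the flag itself is implemented:

    // sw.ts — a rough sketch, not the flag's actual implementation.
    // Caches successful GET responses as you browse and falls back to the
    // cached (stale) copy when the network is unreachable.
    declare const self: ServiceWorkerGlobalScope;

    self.addEventListener('fetch', (event: FetchEvent) => {
      // The Cache API only stores GET requests, so ignore everything else.
      if (event.request.method !== 'GET') return;

      event.respondWith(
        fetch(event.request)
          .then(async (response) => {
            // Online: keep a copy of the page for later offline use.
            const cache = await caches.open('offline-pages');
            await cache.put(event.request, response.clone());
            return response;
          })
          .catch(async () => {
            // Offline (or the request failed): serve the stale copy if we have one.
            const cached = await caches.match(event.request);
            return cached ?? new Response('Offline and no cached copy', { status: 503 });
          })
      );
    });

Register it from the page with navigator.serviceWorker.register('/sw.js'), and after the first visit the pages you've seen keep working offline.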