Best way to download websites

I wanted to download https://privacyguides.net/ using HTTrack, but it took forever (about 1.5 hours) and I canceled it. What's the best way to download only the static content of the site? Also, is there any way to download a .onion site?

1 Like

Well, for a start, the domain is privacyguides.org, not .net. You could just download the Git repo.
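
If you go that route, a minimal sketch (the repo path is an assumption; verify it on the Privacy Guides GitHub organization first):

    # Repo path is an assumption; check https://github.com/privacyguides first.
    git clone https://github.com/privacyguides/privacyguides.org.git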

3 Likes

To add on, since March of last year, every GitHub release for the Privacy Guides site includes archive files for offline use.
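
As a rough sketch of grabbing one of those archives (the asset file name below is an assumption; check the actual release page for the real name):

    # "offline.zip" is a placeholder asset name; list the assets on the
    # latest release page and substitute the real file name.
    curl -LO https://github.com/privacyguides/privacyguides.org/releases/latest/download/offline.zip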

4 Likes

Thanks guys, but you only answered one of my three questions. I made another topic, "Is there any way to download a .onion site? (to browse it locally)".

This is a great question, given how many websites try to prevent users from downloading or saving their data.

They would rather users revisit the site over and over to access its content.

I made a thread about this that you may find helpful.

2 Likes

You mean downloading an entire website? Because for a single web page, just press Ctrl+S.

I think Debian means downloading the entire website so that he can access it offline?

I sent him a PM, so we'll see if he needs more help.

1 Like

Look up ArchiveBox, or you could probably do it with wget on the command line.
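
For example, an untested sketch using standard wget mirroring flags, plus a Tor-wrapped variant for the .onion question from earlier in the thread (the .onion address below is a placeholder, and torsocks assumes a Tor client is already running locally):

    # Mirror the site for offline browsing: rewrite links, fix extensions,
    # and pull in the images/CSS/JS each page needs.
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://www.privacyguides.org/

    # Untested sketch for a .onion site: torsocks routes wget through Tor.
    # "examplexyz.onion" is a placeholder; substitute the real address.
    torsocks wget --mirror --convert-links --adjust-extension --page-requisites http://examplexyz.onion/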

1 Like