Crawling/downloading a website to test permissions.
Linux for blind general discussion
blinux-list at redhat.com
Sun Oct 3 08:41:03 UTC 2021
There is at least wget with its recursive features. On the other
hand, if you have access to the web server logs, you can check there
for any denied requests.
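
For example, a minimal sketch of the wget approach (assuming the site
lives at https://example.com/ and the local originals are under
/var/www/html; substitute your own domain and paths):

    # Mirror the whole site, following links and fetching embedded
    # images, while preserving the server's directory layout on disk.
    wget --recursive --level=inf --page-requisites --no-parent \
         --directory-prefix=site-copy https://example.com/

    # Compare the mirror against the local copy; anything reported as
    # "Only in /var/www/html" was not reachable by the crawl.
    diff -rq site-copy/example.com /var/www/html

For the log approach, something like this would list every path that
returned 403 Forbidden (assuming Apache's combined log format and the
default Debian log location):

    awk '$9 == 403 {print $7}' /var/log/apache2/access.log | sort -u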
mr. M01510 & guide Loadstone-GPS Lat: 62.38718, lon: 25.64672
hkp://wwwkeys.pgp.net B784D020 fp:0C1F6A76 DC9DDD58 33838B5D 0E769600 B7840D02
Linux for blind general discussion wrote:
> Subject: Crawling/downloading a website to test permissions.
> Date: Sun, 3 Oct 2021 08:47:26
> From: Linux for blind general discussion <blinux-list at redhat.com>
> To: Linux for blind general discussion <blinux-list at redhat.com>
> Okay, so a few minutes ago, I realized at least one folder on my
> website that's supposed to be readable by visitors isn't... and that
> got me wondering.
> Is there a command I can run from the Linux terminal with my domain as
> an argument that will start at the homepage, follow all the links
> and embedded images, and either generate a report of the content
> that's accessible or download everything preserving full paths, which
> I can then compare to an offline copy of the site or an ls -R thereof
> to ensure everything that's supposed to be reachable through normal
> browsing is, without having to manually follow every link?