wget

Dan Horning lists at mx2pro.com
Fri Apr 16 04:19:16 UTC 2004


Why not just use wget -m http://url/path, and add -np (--no-parent) if needed?
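For what it's worth, a sketch of the full invocation, since the original poster's site is password protected (URL, user, and password below are placeholders, not from the thread):

```shell
# -m              mirror: recursive, infinite depth, with timestamping
# -np             do not ascend to the parent directory
# --user/--password  HTTP auth credentials for the protected site
wget -m -np --user=USER --password='PASS' http://example.com/path/

# wget honors robots.txt by default, which can stop recursion;
# it can be overridden if appropriate for the site:
wget -m -np -e robots=off http://example.com/path/
```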

> -----Original Message-----
> From: fedora-list-bounces at redhat.com 
> [mailto:fedora-list-bounces at redhat.com] On Behalf Of Tom 
> 'Needs A Hat' Mitchell
> Sent: Friday, April 16, 2004 12:14 AM
> To: For users of Fedora Core releases
> Subject: Re: wget
> 
> On Thu, Apr 15, 2004 at 10:41:39AM -0700, Gunnar vS Kramm wrote:
> > Original e-mail from: Matthew Benjamin (msbenjamin at fedex.com):
> > 
> > > Does anyone know how to use wget to drill down to all of the
> > > folders and subdirectories in a website. I can mirror my website,
> > > however it does not grab all of the folders which contain data
> > > that the links go to. The site is password protected.
> > >  
> > > mattB.
> ....
> 
> > You should be able to use the -r switch to wget, as such:
> > wget -r http://YourWebSite 
> 
> Also, does his web site have a robots file?
> 
> 
> 
> 
> -- 
> 	T o m  M i t c h e l l 
> 	/dev/null the ultimate in secure storage.
> 
> 
> -- 
> fedora-list mailing list
> fedora-list at redhat.com
> To unsubscribe: http://www.redhat.com/mailman/listinfo/fedora-list
> 