wget replacement?

Michael Schwendt ms-nospam-0306 at arcor.de
Thu Apr 1 19:10:28 UTC 2004


On Thu, 01 Apr 2004 09:16:53 -0600, Steve Buehler wrote:

> > > wget --passive-ftp --mirror --no-host-directories --cut-dirs=1
> > > --directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'
> >
> >
> >How about "curl"?
> >        curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE, HTTP
> >or HTTPS syntax.
> >
> >or ftp, or rsync.
> >
> >I must say, however, that the 2 GB limit sounds like a compiled-in OS or
> >user resource limit.  Can your user create a file >2GB on the same file
> >system?  You may want to check that before going further.
> 
>  From what I understood about curl, it wouldn't recurse through all of
> the directories on the other server.  I understood that I would have to
> list each and every file to do it.  Not sure where I missed that in the man
> pages.
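
On the >2 GB question quoted above: a quick way to test it (just a sketch,
assuming GNU coreutils and that /home/SHARE1/ has room; bigfile.test is a
throwaway name) is to seek past the 2 GB mark with dd and write one block:

    dd if=/dev/zero of=/home/SHARE1/bigfile.test bs=1M seek=2100 count=1
    ls -l /home/SHARE1/bigfile.test
    rm /home/SHARE1/bigfile.test

If dd fails with "File too large", the limit is on your side (filesystem,
kernel, or "ulimit -f") rather than in wget.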

How about lftp or ftpcopy? Both support mirroring.
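
A minimal lftp sketch for the same transfer (reusing the host, share name
and credentials from the wget command quoted above):

    lftp -u login,password 192.168.1.1 -e 'mirror SHARE1 /home/SHARE1; quit'

lftp's "mirror" command recurses by default, and provided your lftp build
has large file support the 2 GB mark should not be a problem on its side.
ftpcopy offers similar one-way FTP mirroring.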


More information about the redhat-list mailing list