wget replacement?

Steve Buehler steve at ibapp.com
Thu Apr 1 15:16:53 UTC 2004


At 09:02 AM 4/1/2004, you wrote:
>On Thu, Apr 01, 2004 at 08:42:28AM -0600, Steve Buehler wrote:
> > Does anybody know of a replacement for wget that will work as well as
> > wget, but will not have the file size limit problem?  wget can't get a
> > file that is bigger than 2 GB in size.  On the wget mailing list, it is
> > reported as a bug by some and as just a feature request by others.  I am
> > trying to mirror an ftp directory for a client so they can have a backup,
> > but one file stops the wget download process.  I can't find a way to
> > exclude that one file from the wget download, so now I have to see if
> > there is another program out there that can work as well.  Here is the
> > command that I use.  Yes, I have replaced the server IP with a fictitious
> > one.  The actual IP is a real internet IP.
> >
> > wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 \
> >   --directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'
>
>
>How about "curl"?
>        curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE, HTTP
>or HTTPS syntax.
>
>or ftp, or rsync.
>
>I must say, however, that the 2 GB limit sounds like a compiled-in OS or
>user resource limit.  Can your user create a file >2GB on the same file
>system?  You may want to check that before going further.
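
That is a good point.  Before going further I can try creating a file of
more than 2 GB on the same filesystem (the path and size here are just an
example):

dd if=/dev/zero of=/home/SHARE1/bigtest bs=1M count=2100
ls -l /home/SHARE1/bigtest
rm /home/SHARE1/bigtest

If dd stops near the 2 GB mark, the limit is local (filesystem or
large-file support) rather than in wget.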

From what I understood about curl, it won't recurse through all of the
directories on the other server; it sounds like I would have to list each
and every file myself.  I'm not sure where I missed that in the man pages,
so I will look into it further.
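
For single files, though, something like this looks workable (the file
name is just a placeholder for the one that breaks the mirror, and whether
curl avoids the 2 GB limit on this system would still need testing); the
"-C -" option tells curl to resume a partial transfer:

curl -C - -o bigfile.tar 'ftp://login:password@192.168.1.1/SHARE1/bigfile.tar'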

A lot of people are saying rsync would be the best fit, so I will look
into that too.
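
As a first sketch, assuming the client's machine allows ssh logins (rsync
does not speak ftp, so this would sidestep the FTP server entirely; the
login and paths are the same placeholders as above):

rsync -av --partial login@192.168.1.1:/SHARE1/ /home/SHARE1/

The --partial flag keeps partially transferred files around, so an
interrupted multi-gigabyte download can be resumed instead of restarted.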

Thanks
Steve
