wget replacement?
Jeff Kinz
jkinz at kinz.org
Thu Apr 1 15:02:36 UTC 2004
On Thu, Apr 01, 2004 at 08:42:28AM -0600, Steve Buehler wrote:
> Does anybody know of a replacement for wget that will work as well as wget,
> but will not have the file size limit problem? wget can't get a file
> bigger than 2 GB. On the wget mailing list this is reported as a bug by
> some and as just a feature request by others. I am trying to mirror
> an ftp directory for a client so they can have a backup, but one file stops
> the wget download process. I can't find a way to exclude that one file
> from the wget download so now I have to see if there is another program out
> there that can work as well. Here is the command that I use. Yes, I have
> replaced the server IP with a fictitious one; the actual address is a
> public internet IP.
>
> wget --passive-ftp --mirror --no-host-directories --cut-dirs=1
> --directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'
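One aside before suggesting alternatives: wget's -R/--reject option takes a
comma-separated list of file name suffixes or glob patterns, and it may be
enough to skip that one oversized file during the mirror. A hedged sketch
(the pattern "hugefile.tar" is a made-up placeholder, not a name from your
post):

    # Untested sketch: same mirror command, but ask wget to skip the
    # offending file by name pattern via -R / --reject.
    wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 \
         --reject='hugefile.tar' \
         --directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'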
How about "curl"?

    curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE, HTTP
    or HTTPS syntax.

Or plain ftp, or rsync.
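For a single large file, curl alone may do it (curl uses passive FTP by
default, but it has no recursive mirror mode, so it suits individual files
rather than whole trees). A sketch, with "somefile" as a made-up
placeholder:

    # Fetch one file, keeping the remote file name (-O). Whether this
    # clears 2 GB depends on curl being built with large-file support.
    curl -O 'ftp://login:password@192.168.1.1/SHARE1/somefile'

If you can get rsync or shell access to that server, rsync mirrors the
whole tree in one pass and can resume interrupted transfers. A sketch
reusing the fictitious host and paths from your command:

    # -a preserves permissions/times, -v is verbose, --partial keeps
    # partially transferred files so big downloads can resume.
    rsync -av --partial login@192.168.1.1:/SHARE1/ /home/SHARE1/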
I must say, however, that the 2 GB limit sounds like a compiled-in OS or
user resource limit. Can your user create a file >2 GB on the same file
system? You may want to check that before going further.
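A quick way to check, as the same user and on the same file system the
mirror writes to (the path below is just an example):

    # Write ~2.1 GB of zeros; if this stops at exactly 2 GB, the limit
    # is in the OS/filesystem or a ulimit, not in wget itself.
    dd if=/dev/zero of=/home/SHARE1/bigfile.test bs=1M count=2100
    ls -l /home/SHARE1/bigfile.test
    rm /home/SHARE1/bigfile.test

    ulimit -f    # "unlimited" here rules out a per-user file size cap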
--
Jeff Kinz, Open-PC, Emergent Research, Hudson, MA.
"jkinz at kinz.org" is copyright 2003.
Use is restricted. Any use is an acceptance of the offer at
http://www.kinz.org/policy.html.