wget replacement?
Ed Greshko
Ed.Greshko at greshko.com
Thu Apr 1 15:10:59 UTC 2004
On Thu, 2004-04-01 at 22:42, Steve Buehler wrote:
> Does anybody know of a replacement for wget that will work as well as wget,
> but will not have the file size limit problem? wget can't get a file that
> is bigger than 2gigs in size. On the wget mail list, it is reported as a
> bug by some and as just a feature request by others. I am trying to mirror
> an ftp directory for a client so they can have a backup, but one file stops
> the wget download process. I can't find a way to exclude that one file
> from the wget download so now I have to see if there is another program out
> there that can work as well. Here is the command that I use. Yes, I have
> replaced the server IP with a fictitious one. The actual IP is an
> Internet-facing one.
>
> wget --passive-ftp --mirror --no-host-directories --cut-dirs=1
> --directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'
Just a thought...
rsync may be better for this type of work.
--
"An opinion is like an asshole - everybody has one."
- Clint Eastwood as Harry Callahan, The Dead Pool - 1988.
More information about the redhat-list mailing list