wget replacement?

Rodolfo J. Paiz rpaiz at simpaticus.com
Thu Apr 1 15:08:45 UTC 2004


At 08:42 4/1/2004, you wrote:
>Does anybody know of a replacement for wget that works as well as
>wget but doesn't have the file-size limit problem?  wget can't get a
>file bigger than 2 GB.  On the wget mailing list, it is
>reported as a bug by some and as just a feature request by others.  I am
>trying to mirror an FTP directory for a client so they can have a backup,
>but one file stops the wget download process.  I can't find a way to
>exclude that one file from the wget download, so now I have to see if there
>is another program out there that can work as well.  Here is the command
>that I use.  Yes, I have replaced the server IP with a fictitious one; the
>actual address is a public internet IP.
>
>wget --passive-ftp --mirror --no-host-directories --cut-dirs=1 
>--directory-prefix=/home/SHARE1/ 'ftp://login:password@192.168.1.1/SHARE1/'

rsync (communicating over ssh) should be a perfect solution for you, and it 
provides better security and functionality than wget in this case: it doesn't 
have wget's 2 GB ceiling, and it lets you exclude individual files. Assuming 
you have ssh access to that server (not just ftp), something like:

# rsync -ave ssh user@remotehost:/path/to/files/* /local/path/

is the basic command. You can use exclude and include directives to fine-tune 
what is or is not mirrored, and rsync will transfer only files that have 
changed. Simply an amazing program. Read the man page for more details, since 
it has *lots* of power and flexibility.
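
For instance, to mirror the client's share while skipping the problem file, 
something along these lines should work (untested; huge-file.iso is just a 
placeholder for whatever the oversized file is actually called, and adjust 
user/host to whatever ssh login you have there):

# rsync -ave ssh --exclude='huge-file.iso' user@remotehost:/SHARE1/ /home/SHARE1/

Use --exclude-from=FILE if you have a whole list of patterns, and add 
--delete if you want the local copy to drop files that disappear from the 
server.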

You could also look at curl as an alternative, but I am not very familiar 
with it.
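
If all you need is to grab that one big file separately, something like this 
might do it (untested, and bigfile is a placeholder name):

# curl -O 'ftp://login:password@192.168.1.1/SHARE1/bigfile'

Keep in mind that curl fetches individual files; it won't recurse through a 
directory tree the way wget --mirror does.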


-- 
Rodolfo J. Paiz
rpaiz at simpaticus.com
http://www.simpaticus.com
