web page problem
Linux for blind general discussion
blinux-list at redhat.com
Sun Jan 23 09:37:30 UTC 2022
If the file is linked from an HTML page, you can use wget with the right
parameters, provided there is a consistent pattern by which to recognize the file.
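One way to make that pattern matching concrete is to pull the link targets out of the index page and pick the newest one. This is only a sketch: the embedded page snippet and the myapp-*.tar.gz name are made-up stand-ins for whatever the real index page lists, and the live-fetch line is left commented out.

```shell
#!/bin/sh
# Sketch: pick the newest versioned link out of an HTML index page.
# For a live run you would start from:  wget -q -O - "$url"
# Here a small sample listing stands in for the fetched page.
page='<a href="myapp-1.2.tar.gz">old</a>
<a href="myapp-1.10.tar.gz">new</a>
<a href="README.txt">doc</a>'

latest=$(printf '%s\n' "$page" \
  | grep -o 'href="[^"]*"' \
  | sed 's/^href="//; s/"$//' \
  | grep '^myapp-.*\.tar\.gz$' \
  | sort -V \
  | tail -n 1)                       # sort -V = version sort, so 1.10 > 1.2

echo "$latest"                       # -> myapp-1.10.tar.gz
# To actually download it (uncomment and adjust for the real site):
# wget -c "https://example.com/downloads/$latest"
```

The same idea works with Perl or any tool that can apply a regular expression to the page source; the point is that the wildcard matching happens on the link text, not inside HTML itself.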
mr. M01510 & guide Loadstone-GPS Lat: 62.38718, lon: 25.64672
hkp://wwwkeys.pgp.net B784D020 fp:0C1F6A76 DC9DDD58 33838B5D 0E769600 B7840D02
Linux for blind general discussion wrote:
> Subject: web page problem
> Date: Sun, 23 Jan 2022 07:04:56
> From: Linux for blind general discussion <blinux-list at redhat.com>
> To: blinux-list at redhat.com
> Is it possible, when a file's version is updated on its web page, to
> get the URLs of the new versions so they can be downloaded? HTML
> doesn't support wildcards, so this can't be done with wget alone. I'd
> like to be able to do this with a script if at all possible. I know
> Perl handles wildcards well, but I don't know whether it can handle a
> job like this.
> If a file is on a web page, it can be checked using wget with the
> --spider option followed by the URL.
> Since a successful check returns exit status 0, you can append && and
> a wget -bc URL to download the file if it exists.
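The quoted check-then-download chain can be wrapped in a small helper. The URL in the example call is a made-up placeholder, so nothing is fetched unless you call the function yourself; the last few lines just demonstrate the && behavior the chain relies on.

```shell
#!/bin/sh
# Sketch of the check-then-download chain from the quoted message.
fetch_if_exists() {
  # --spider asks the server whether the file exists without downloading;
  # it exits 0 on success, so && gates the real download on that check.
  # -b backgrounds the download (logging to wget-log), -c resumes a
  # partial file instead of restarting it.
  wget -q --spider "$1" && wget -bc "$1"
}

# Example call (placeholder URL, so it is commented out):
# fetch_if_exists "https://example.com/files/archive.tar.gz"

# The same && logic with plain commands: the second command runs only
# if the first exits 0.
true && echo "file exists, downloading"
false && echo "never printed"
echo "done"
```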
> Once the file is downloading, the command
> wc -l wget-log && grep -i saved wget-log && rm wget-log
> run every so often shows the growing size of wget-log, and at the end
> it shows the saved file name and then removes wget-log. The magic is
> in &&, which only runs the next command when the previous one
> succeeds.
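Instead of rerunning that one-liner by hand, the same check can sit in a loop. This is a sketch assuming a wget -b download was started in the current directory, so wget-log exists there; if there is no wget-log the loop simply exits at once.

```shell
#!/bin/sh
# Watch a background wget's log until the download completes.
while [ -f wget-log ]; do
  wc -l wget-log                      # growing line count = still going
  if grep -qi saved wget-log; then
    grep -i saved wget-log            # show the final "saved" line
    rm wget-log                       # log gone, so the loop ends
  fi
  sleep 5
done
```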