[ltp] patch rollup for 2.6.20

Richard Neill linux-thinkpad@linux-thinkpad.org
Thu, 12 Apr 2007 13:27:10 +0100


Hendrik Baecker wrote:
> Guillermo Juárez wrote:
>> Is there any quick way to download all the required files? I mean with
>> wget. I don't want to be clicking and saving one by one.
> Just a static list, but it should do the trick:
> 

If you have a file containing a list of URLs, one per line (and none 
containing embedded spaces), do:

for file in $(cat url_list.txt); do wget "$file"; done


[In general, you can solve this sort of problem with the "read" 
builtin; you may need to set IFS to a newline (or to empty) so that 
each whole line is read intact. If there are embedded spaces, you have 
to quote the variable carefully. Here, though, you can simply rely on 
bash splitting the list on whitespace.]
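To illustrate the "read" approach: a minimal sketch of a line-at-a-time 
loop that survives embedded spaces (the URLs below are made-up 
placeholders, and echo stands in for the real wget call):

    #!/bin/sh
    # IFS= stops read from trimming leading/trailing whitespace,
    # and -r stops it from mangling backslashes; quoting "$url"
    # keeps embedded spaces in one argument.
    while IFS= read -r url; do
        echo "fetching: $url"    # in real use: wget "$url"
    done <<'EOF'
    http://example.com/file one.tar.gz
    http://example.com/file_two.tar.gz
    EOF

(For the simple no-spaces case, note that wget can also read the list 
directly: wget -i url_list.txt.)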

Aside: bash scripting can be arcane, but it really makes life easier in 
some cases.

Brief intro:  http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html
Detailed: http://tldp.org/LDP/abs/html/



Regards,

Richard