How can I download files (that are listed in a text file) using wget
or some other automatic way?
Sample file list:
www.example.com/1.pdf
www.example.com/2.pdf
www.example.com/3.pdf
wget has a built-in flag for this:

wget -i your_list

where your_list is a file containing URLs delimited by line breaks. You can find this kind of thing by reading man wget.
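As a minimal sketch (the file name and URLs are placeholders), the whole workflow is two steps; the wget call itself is left commented out here since it needs network access:

```shell
# Step 1: put the URLs in a file, one per line.
cat > your_list <<'EOF'
https://www.example.com/1.pdf
https://www.example.com/2.pdf
https://www.example.com/3.pdf
EOF

# Step 2: hand the file to wget (uncomment to actually download):
#   wget -i your_list

# Sanity check: the list should contain one URL per line.
wc -l < your_list
```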
Get them in parallel with

cat urlfile | parallel --gnu "wget {}"

By default it will run as many processes as you have cores; you can probably ramp this up another 10x if you really want to pull them down quickly, by adding -j 20 after parallel.

parallel has a built-in flag --arg-file (-a) that will use an input file as the source, so you can avoid the cat |. Or simply:

parallel --gnu wget < urlfile

where urlfile is your list file.
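A minimal sketch of the --arg-file route, assuming a hypothetical urls.txt. The real parallel invocation is shown as a comment (it needs GNU parallel installed plus network access); an xargs dry run with echo stands in so you can see what would be executed:

```shell
# Build a URL list, one per line (hypothetical example URLs).
printf '%s\n' \
  https://www.example.com/1.pdf \
  https://www.example.com/2.pdf \
  https://www.example.com/3.pdf > urls.txt

# Real command (requires GNU parallel and network access):
#   parallel --gnu -j 20 -a urls.txt wget
# Dry run: print each wget command instead of running it.
xargs -n1 echo wget < urls.txt
```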
I saw Florian Diesch's answer. I got it to work by including the parameters -bqc in the command; all downloads started in parallel in the background:

wget -bqc -i links.txt

-b: Background. Go to background immediately after start.
-q: Quiet. Turn off wget's output.
-c: Continue. Continue getting a partially-downloaded file.

Here links.txt is the file containing the links.
I just tested this:
It works for me. The links inside the txt file must each be on a separate line.
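If your links happen to sit on one line separated by spaces, they can be converted into the one-URL-per-line format that wget -i expects. A small sketch using POSIX tr (file names and URLs are hypothetical):

```shell
# one-line.txt holds space-separated URLs (hypothetical input).
printf '%s %s %s\n' \
  https://www.example.com/1.pdf \
  https://www.example.com/2.pdf \
  https://www.example.com/3.pdf > one-line.txt

# Replace spaces with newlines so each URL sits on its own line.
tr ' ' '\n' < one-line.txt > links.txt

# links.txt is now ready for: wget -i links.txt
cat links.txt
```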