I'd like to download a directory containing some source code from an FTP server. Initially, I did this:
wget -r ftp://path/to/src
Unfortunately, the directory is itself the result of an SVN checkout, so there are lots of .svn directories, and crawling over them would take much longer. Is it possible to exclude those .svn directories?
I'd like to answer this a bit more broadly, because this question turns up via search engines:
--exclude-directories=list expects absolute paths [1]. This means that with host.org/fu/bar/ you have to write --exclude-directories=/fu/bar. This can be a problem if you always want to exclude a folder with a specific name, no matter where exactly it is (for example, a 'thumbs' folder).
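A minimal sketch of the absolute-path variant, using host.org/fu/bar from the example above (the host and paths are illustrative, not a real server):

```shell
# Recursively download ftp://host.org/fu/ but skip the bar subtree.
# Note: the excluded path is absolute, i.e. relative to the host root,
# not to the starting URL.
wget -r --no-parent --exclude-directories=/fu/bar "ftp://host.org/fu/"
```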
For the name-based case we can use --reject-regex [2], like this: --reject-regex="/thumbs/". Since this is a regex rather than a comma-separated list, we can exclude multiple folders via regex1|regex2|regex3, e.g. --reject-regex="/thumbs/|/css/". Keep in mind that certain characters like . have a special meaning in regex and need to be escaped to be matched literally as part of a folder name: "/\.svn/".
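Putting the pieces together, a sketch for the original question might look like this (host.org and the paths are placeholders; --reject-regex matches against the full URL):

```shell
# Skip any directory named 'thumbs' or 'css', wherever it appears.
wget -r --no-parent --reject-regex="/thumbs/|/css/" "http://host.org/"

# Same idea for the .svn directories from the question; the dot is
# escaped so the pattern matches the literal folder name.
wget -r --reject-regex="/\.svn/" "ftp://host.org/path/to/src/"
```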