I have a folder that contains files for a static website like:
/site/index.html
/site/css/css.css
/site/js/js.js
/site/images/...
If I update something on my laptop, I want a single command to send the files off to my Ubuntu server. I don't want to set up FTP on it if I don't have to; could scp handle this?
The command

scp -r source user@target:dest

will walk all subdirectories of source and copy them. However, scp behaves like cp and always copies a file, even if it is identical on both source and destination. [See here for a workaround.]

As this is a static website, you are most likely only making updates, not re-creating the whole thing, so you will probably find things move along faster if you use rsync over ssh instead of scp. Probably something like...to get started. If you are doing this across a LAN, I would personally use the options -avW for rsync instead.
Rsync also gives you the ability to duplicate deletions from your source: if you remove a file from your tree, you can run rsync as above with the --delete flag, and it will remove the same file from the destination side.

scp has a recursive flag that will do what you want.
scp -r /base/directory user@server:/to/location
from man scp
scp -r and rsync -r are the most reliable ways to get what you want, as others have noted.
You can also use sshfs to 'mount' it as if it were a local drive:
sshfs user@host:/site /mnt/mountpoint
(However you're probably better off working locally and deploying with rsync. Just another tool to be aware of.)
Use the command below