I use my laptop with Ubuntu-desktop to do all my work, but I also have a low-end desktop over at my office just sitting there. I've decided I'm going to install Ubuntu-server on it and use it to mirror my entire laptop home folder, to make things easier when I decide to format my laptop's hard-disk.
Whenever I'm at work, both machines are connected to a network and communicate easily (and high-speed) via ssh. When I'm not at work, the desktop is still accessible via ssh. Ideally, the syncing would take place automatically in the background, whenever I change something. It only needs to be one way: the changes I make on the laptop have to be synced over to the server, but the inverse is not necessary.
I know there's software for this out there, my question is: What software can I use to achieve the above objectives and also take full advantage of local-network speeds when I'm at work? Since I'll sometimes deal with large files, the syncing process needs to realise that the two computers are sharing a local network, and then take advantage of that (instead of always syncing through the internet).
Just to be clear, over-the-network syncing is actually more important to me here than over-the-internet syncing. Ideally the software would check if the former is available and, if not, fall back to the latter; but if that's not possible, the first case is my priority.
Hope this isn't too long. Thanks in advance.
If you can only connect to your office server via SSH, then your best choice is to use rsync: it can use SSH as a transport protocol and uses a smart algorithm to speed up the transfer of large files by sending only the changed blocks.
Since you only need one-way synchronization, just set up passwordless SSH authentication from your laptop to the office server, and then you can start with a command as simple as:
You can refine this by adding --include and --exclude options to narrow down the list of files/directories that you want to synchronize. For instance, transferring program settings ("dot files") can be risky if the two computers do not run the same OS (same version). My suggestion is to start by excluding all "dot files" (so, use --exclude=".[a-z]*" — note that exclude patterns are matched relative to the transfer root, not as absolute paths) and then selectively add the configuration directories of programs that can safely be shared (this has to be decided on a program-by-program basis). In addition, the web browser cache and $HOME/.cache can always be excluded. See the "FILTER RULES" section in the rsync man page for a detailed discussion of the include/exclude rule syntax.
However, rsync does not have a "continuous-operation" mode, so you will have to run it periodically from your crontab.
I would suggest using rdiff-backup over rsync. It's pretty much just rsync++.
Unlike rsync, which is just a 1-to-1 mirror that transfers diffs, rdiff-backup has a history mechanism. So if you screw up and delete something important, you can revert back to a week ago and get it back.
On the Ubuntu server you just need to have SSH running and to apt-get install rdiff-backup; install the same package on the client. I would run the backup either manually or via a crontab.
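The crontab entry on the laptop could look something like this (a sketch: the hostname, user, and paths are placeholders):

```
# m h dom mon dow  command   (edit with: crontab -e)
# Hourly one-way backup of the home folder to the office server over SSH;
# backupSrv.penguins.org and the paths are placeholders for your own setup.
0 * * * * rdiff-backup /home/me user@backupSrv.penguins.org::backup/home
```

The host::path form tells rdiff-backup to reach the remote repository over SSH.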
If you open up port 22 to the internet, you should be able to run the same script either way; just have your hostname resolve to the local IP when you're inside the LAN.
i.e., backupSrv.penguins.org would resolve to the external IP 2.3.4.5, but inside your LAN to 192.168.1.253.
There is no complete, pre-packaged solution to your problem that I know of. You will probably have to write a small script yourself.
The syncing itself could be done by rsync, as already explained by Riccardo Murri. Rsync only transfers the changed parts of files, so it is perfect for this task.
You can use the NetworkManager dispatcher to execute scripts on connection/disconnection of a network interface. So you could write a script that checks whether you are on the correct network and then calls rsync. This way your data would be synced automatically whenever you connect to your company network.
To periodically sync your data you can use cron, as already mentioned.
You should also consider security when using rsync via SSH. To synchronize without user intervention you'll need a passwordless keyfile, and anyone who has this keyfile can gain access to the server it belongs to. I would strongly recommend encrypting your home folder to protect the data on both the notebook and the server.
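One way to limit the damage if that keyfile leaks (a sketch: rrsync is a wrapper script shipped with rsync, but its installed path varies by distribution, and the repository path here is a placeholder) is to pin the key to a single purpose in ~/.ssh/authorized_keys on the server:

```
# ~/.ssh/authorized_keys on the office server: this key may only run rrsync,
# confined to the backup directory; "restrict" disables shell access,
# port forwarding, and other features for this key.
command="/usr/bin/rrsync /srv/backup/home",restrict ssh-ed25519 AAAA... laptop-sync
```

That way a stolen laptop key can only write into the backup area, not open an interactive shell.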
Dropbox should be good in your case: it works on Linux, Mac, and Windows, with easy setup and management.
Download Dropbox: http://www.dropbox.com/help/137
Dropbox to sync *nix home folders: https://superuser.com/questions/147315/dropbox-to-sync-nix-home-folders
Syncing other folders: http://wiki.dropbox.com/TipsAndTricks/SyncOtherFolders
Have a read through. It does support syncing via symlinks, so if there are certain documents, graphics, or spreadsheets you have to sync, you can still maintain a copy of them in Dropbox, or permanently save them in the Dropbox location to save the hassle.
Try Sparkleshare: http://sparkleshare.org/
There is a client in the Ubuntu repo, and its website has instructions on how to set up a server.
The best solution for 2015 should be Seafile: https://www.seafile.com/en/download/
The Sparkleshare developers themselves say their tool is good for many small files, especially text files, but not so good for bigger and binary files.
In terms of performance, Seafile outperforms both Sparkleshare and OwnCloud.
In addition, it is far less resource-hungry than OwnCloud, so an instance can run without any problems on a Banana Pi or Raspberry Pi as a 24/7 machine. (I would recommend a Banana Pi, though: because of the Gigabit network and the SATA interface it was a good choice for a file server, at least for me.)
I sync my home folders across three computers. I exclude dotfiles; they are handled via a git repo and some logic around it, because, as Riccardo mentioned, there are problems in sharing dotfiles and other configuration across different computers, even when it's the same OS and release. Think of monitors.xml, for example, or a different layout for an IDE on your laptop versus your multi-screen desktop, or a program opened on both computers at once.
In addition to syncing, I back up this data via borgbackup. It's a very nice new backup program which does deduplication, is very fast after the initial backup, and has a lot of features many tools are lacking.
It's the perfect setup for me.