My work tends to involve using SSH to connect to various machines and then using vim to edit files on them. The problem is that I constantly have to copy my .vimrc file around; it's very annoying to open vim and not have any settings. Is it possible to carry my vim settings with me from machine to machine without manually copying them everywhere?
Instead of bringing .vimrc to each server you need to work on, why not edit the remote files from your local vim:
In vim/gvim, run:
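Something along these lines, using netrw's scp:// syntax; user, host, and path are placeholders:

    :e scp://user@remotehost//path/to/file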
or start vim like this:
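For example (same placeholders as above):

    vim scp://user@remotehost//path/to/file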
This opens the file seemingly in place (it actually copies the file locally), and when you save, it sends the edited file back to the server for you.
It asks for an ssh password, but this can be streamlined via ssh keys.
As others have mentioned, the only drawback of this method is that you don't get path/file completion as you would when working directly on the machine.
For more info, check out the following tutorial.
I feel your pain. I have all my ~/.*rc files under version control (Subversion); it has worked great since I started doing this in 1998, using CVS back then. One way to do it is to check out all your rc files like this while standing in your home directory:
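A minimal sketch of such a checkout, assuming your rc files live in a repository at a URL like the hypothetical one below:

    cd ~
    # --force lets the checkout proceed even though the directory already contains files
    svn checkout --force svn+ssh://svn.example.com/repos/dotfiles .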
This way the config files will also be synced and updated across the various computers when you run svn update.
You could make a bash script to copy it automatically every time you log in, like this:
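A minimal sketch, assuming the script takes user@host as its only argument:

    #!/bin/sh
    # usage: ssh_vim user@host
    # copy the local .vimrc into the remote home directory, then open the session
    scp ~/.vimrc "$1": && ssh "$1"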
You can call it ssh_vim, for instance. It's not an ideal solution but will solve your problem.
You can improve it to check first whether the file is already there. If you are not always running ssh from the same machine, you could change the script to fetch the file via scp from another machine.
EDIT1
On a related note, you could also mount the remote machine's filesystem with sshfs. That way you benefit from your environment and tools (not only .vimrc) and you have shell completion (which you don't have using scp://).
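A sketch, with the host name and mount point as placeholders:

    mkdir -p ~/remote
    sshfs user@remotehost:/home/user ~/remote
    vim ~/remote/some/file        # edit with your local vim and .vimrc
    fusermount -u ~/remote        # unmount when done (use 'umount' on macOS/BSD)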
EDIT2
I just found out that you can source your .vimrc file using scp://, like this:
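Roughly like this, with your user, reference host, and home path as placeholders:

    :source scp://you@your.reference.host//home/you/.vimrc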
This works from the vim command line, but at the moment I don't know how to automate it. It doesn't seem to work with the '-u' switch, from inside the .vimrc, or with $VIMINIT.
EDIT3
I found it! You can do this to start vim with a .vimrc taken from your host of reference:
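With the same placeholders as above:

    vim -c 'source scp://you@your.reference.host//home/you/.vimrc'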
Option '-c' executes the command right after launching vim.
You can create an alias in your shell of choice to avoid typing. In bash it would be like this:
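For instance (the alias name here is arbitrary):

    alias myvim="vim -c 'source scp://you@your.reference.host//home/you/.vimrc'"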
If you're using public-key authentication, you can use this in your ~/.ssh/config (sketched below). I like it better than the script trick suggested above since it doesn't mess with the invocation of the ssh command (when specifying extra parameters, etc.).
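A plausible sketch of such an entry, assuming the idea is to push your local .vimrc on every connection; PermitLocalCommand, LocalCommand, and the %d/%r/%h tokens are standard ssh_config features, but the exact command here is an assumption:

    Host *
        PermitLocalCommand yes
        # copy the local .vimrc to the remote home directory after connecting;
        # PermitLocalCommand=no on the inner scp avoids recursing through this rule
        LocalCommand scp -q -o PermitLocalCommand=no %d/.vimrc %r@%h: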
A couple of solutions:
1) Create an NFS share for your home folder and map it in multiple locations.
2) Create a small script to push your .vimrc to the server you are connecting to with an identity/key file. It could look something like this (pseudocode):
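A rough sketch; the script name and key path are assumptions:

    #!/bin/sh
    # usage: push_vimrc user@host   (hypothetical name)
    # push the local .vimrc with a dedicated key, then start the session
    scp -i ~/.ssh/vimrc_key ~/.vimrc "$1":
    ssh -i ~/.ssh/vimrc_key "$1"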
The exact same answer as sunny256, but using git instead of Subversion.
Keep one main branch with the files that are common to all computers, and have one branch for each new computer.
That way you can have almost the same files on most computers and still not get too confused.
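One possible shape of that branch-per-machine workflow, with a hypothetical repository URL and branch names:

    git clone git@example.com:you/dotfiles.git
    cd dotfiles
    git checkout -b laptop main     # one branch per computer
    # ...commit machine-specific tweaks on this branch...
    git merge main                  # fold in updates to the common files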
I know this is an old thread, but one way I do it is with sshfs, which mounts the remote file system over FUSE. The local vim does all of the editing, so there is no reason to copy .vimrc around.
This does have the downside that another terminal has to be open for any commands that need running on the remote server, but for editing I find this way the best.
It also has the added benefit of being able to use the system clipboard.
I'm using https://github.com/andsens/homeshick to manage my dotfiles, and storing them on github.
Homeshick is written in 100% bash and helps you manage "castles", which are just git repos that contain a /home/ directory. It has commands to move existing dotfiles into the repo and replace them with symlinks, and to symlink all the files in the repo into your home directory on a new machine.
So the general idea is keep your dotfiles in a version control system, and symlink to them from the real path. This way your repo doesn't need to start from your home dir and contain a ton of files you don't ever want to add.
If you are like me and have many development machines (virtual machines as well) for various reasons, you can combine ssh keys, a smart .bash_profile, and an RCS of your choice.
I would second using nfs/samba/sshfs. One drawback is that if you don't have network access all the time, you can't reach what you need (flying, no wifi, firewalls, routing issues, etc.). The machines that I keep in sync are not all reachable at the same time, but I still want to share information between them.
The following is how I went about it, borrowing many ideas from the Internet.
Your .bash_profile could have something like this:
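Something along these lines; the location of the shell_ssh_agent file is an assumption:

    # start or reuse an ssh-agent for this login
    if [ -f "$HOME/bin/shell_ssh_agent" ]; then
        . "$HOME/bin/shell_ssh_agent"
    fi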
I got this from a couple of places but can't find a link to it now. The shell_ssh_agent file:
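The widely circulated version of that snippet looks roughly like this (the environment-file path is the usual convention, not something from the original post):

    # reuse a single ssh-agent across logins on this machine
    SSH_ENV="$HOME/.ssh/environment"

    start_agent () {
        ssh-agent | sed 's/^echo/#echo/' > "$SSH_ENV"
        chmod 600 "$SSH_ENV"
        . "$SSH_ENV" > /dev/null
        ssh-add
    }

    if [ -f "$SSH_ENV" ]; then
        . "$SSH_ENV" > /dev/null
        # restart the agent if the recorded PID is no longer alive
        ps -p "$SSH_AGENT_PID" > /dev/null 2>&1 || start_agent
    else
        start_agent
    fi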
Now, on first login, you set up your keys. Log out and back in, and life gets easier.
Put all your scripts in an RCS; this makes keeping development machines in sync easier. I use git. Authentication with git is via ssh, so ssh keys help here too. Note that at this point you could have used something like nfs instead. I would still favor an RCS, for a reason I mention below.
The use case is straightforward: change a file on one machine, commit and push it to the RCS, then pull it down on the others the next time you log in.
Something I want to try next is to wrap the initial login/setup in a makefile that I copy to the new machine. The makefile can then do the job of setting up your keys, RCS, etc. Obviously there is some overhead here, but if you end up setting up a lot of machines it pays off.
I use a makefile that has a list of all the servers I log on to. When I make a change on my local machine, 'make' runs automatically from that makefile and updates all the servers with any changes or plugins.
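A minimal sketch of what such a makefile might look like, with hypothetical host names (recipe lines must start with a tab):

    # hypothetical list of servers to keep in sync
    SERVERS = dev1 dev2 staging

    push:
    	for s in $(SERVERS); do \
    		scp -r ~/.vimrc ~/.vim "$$s": ; \
    	done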