I work for a software company. My department is responsible for (among other things) building and distributing VMware virtual machines to members of our sales team, who then launch them using VMware Player to run their product demonstrations for clients.
Lately, it occurred to me that the way we update and distribute these VMs is all kinds of wrong. Here's our process for updating the "Demo VM":
- Download a fresh copy of the VM (~35 GB) from the central server
- Set it to persistent mode, then launch it and make changes such as upgrading products to the latest versions and updating licenses
- Once the changes are done, shut it down, set it back to non-persistent mode, and upload the whole thing (~35 GB) back to the central server into a new folder with an incremented version number
- Whoever needs the latest version then downloads it from the file server (35 GB × X)
Not only does this eat up a lot of network bandwidth, but downloading 35 GB can be time-consuming, especially for people in our remote offices who don't have the luxury of intranet speeds.
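To put rough numbers on it (a quick back-of-the-envelope in Python; the 10-user count and the 50 Mbps remote link speed are made-up assumptions, not measurements):

```python
# Back-of-the-envelope transfer cost for one release cycle.
# All figures are illustrative assumptions, not measurements.

VM_SIZE_GB = 35     # size of the full VM folder
NUM_USERS = 10      # hypothetical number of salespeople who download it
REMOTE_MBPS = 50    # hypothetical remote-office downlink, in megabits/s

# One download + one upload by whoever does the update,
# plus one download per user who needs the new version.
total_gb = VM_SIZE_GB * (2 + NUM_USERS)
print(f"Data moved per release: {total_gb} GB")            # 420 GB

# How long a single remote download takes at the assumed link speed.
seconds = VM_SIZE_GB * 8 * 1000 / REMOTE_MBPS              # GB -> Mb -> s
print(f"One remote download: {seconds / 3600:.1f} hours")  # ~1.6 hours
```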
My question: is there a better way of managing the update and distribution of virtual machines that need to be run locally on the users' machines?
The reason I started questioning our current method is that, when a virtual machine is updated, only a small portion of the data (in the VMEM and virtual disk image files) actually changes, right? So instead of copying the entire VM folder, there should be a way to upload/download only the deltas, so to speak, similar to how version control systems like Git work. I actually tried using Git for this, but it turns out Git is terrible at managing huge files. So I figured I'd ask here.
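To illustrate what I mean by "only the deltas": conceptually, you hash each file in fixed-size blocks and transfer just the blocks whose hashes differ between versions. Here's a minimal Python sketch of that idea (the file paths and the 4 MB block size are arbitrary assumptions, not anything we actually run):

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB per block; the size is an arbitrary choice


def block_hashes(path):
    """SHA-256 digest of each fixed-size block of a file."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes


def changed_blocks(old_path, new_path):
    """Indices of blocks that differ between two versions of a file."""
    old, new = block_hashes(old_path), block_hashes(new_path)
    longest = max(len(old), len(new))
    return [i for i in range(longest)
            if i >= len(old) or i >= len(new) or old[i] != new[i]]


# Hypothetical usage: compare two versions of a virtual disk file.
# Only the listed blocks would need to travel over the network.
if __name__ == "__main__":
    dirty = changed_blocks("demo-v1/disk.vmdk", "demo-v2/disk.vmdk")
    print(f"{len(dirty)} blocks of {BLOCK_SIZE >> 20} MB each to transfer")
```

The catch with fixed blocks is that a single byte inserted near the start of a file shifts every block after it, which is why rsync-style tools use a rolling checksum to match blocks at arbitrary offsets. Virtual disk images mostly change in place, though, so even this naive approach should move far less than the full 35 GB.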