I'm working on a project where I'll have to deploy a large number of servers with particular software on them. At the moment I'm able to use the standard Ubuntu repositories, but the version of the software in those repos (v1.0) is considerably older than the current version available from the developers (v1.5.6). That leaves me downloading the source and building it locally on each server I intend to send out the door. It takes about an hour to build this one piece of software, so I got to thinking that there must be a better way of doing things.
In Googling for a solution, I see that there are a number of possibilities out there, but I wanted to get others' take on whether or not I'm going down the right path. If I understand this correctly, I can create a package and then place it either on a PPA (which I believe would be publicly available) or on a private repo that I could stand up myself. Either option would be fine, really; this isn't my software, I'm merely compiling it and making it available for myself and others to use.
It's my understanding, then, that as long as all my servers have the same Ubuntu version and architecture, I could point them at this new repository and use aptitude to install the software without going through the pains (and time) of compiling.
Is this generally correct? Am I over-simplifying? Is there a better way of achieving the same result?
(Cross post from here)
Yes, building a .deb package once is how you install a single build on many hosts.
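As a rough sketch of that one-time build, `checkinstall` is the quickest way to turn a source build into a .deb without writing full Debian packaging (the package name and version below are placeholders for your software):

```shell
# On a single build host matching your servers' Ubuntu release/architecture.
sudo apt-get install checkinstall

# Usual source build, but checkinstall replaces "make install" and
# produces a .deb in the current directory instead of installing.
./configure && make
sudo checkinstall --pkgname=mysoftware --pkgversion=1.5.6 --install=no
```

For long-term maintenance a proper `debian/` directory built with `debuild`/`dpkg-buildpackage` is cleaner, but checkinstall is enough to stop rebuilding on every server.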
There are many ways to distribute and install this package, including maintaining your own private apt repository.
If the software is open source, then yes, you can create a public PPA on Launchpad and use that as your apt repo. Any distribution welcomes new package maintainers.
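The PPA route, sketched below, assumes you've created a Launchpad account and PPA, registered a GPG key there, and prepared a debianized source tree (a `debian/` directory); the PPA and package names are placeholders:

```shell
# Build and sign a source package (Launchpad builds the binaries for you).
debuild -S -sa

# Upload the source package to your PPA.
dput ppa:myuser/mysoftware ../mysoftware_1.5.6-1_source.changes

# On each server, once Launchpad has finished building:
sudo add-apt-repository ppa:myuser/mysoftware
sudo apt-get update && sudo apt-get install mysoftware
```

Note that a PPA accepts only source uploads, so unlike the private-repo approach you can't just upload a checkinstall-produced binary .deb; you need real Debian source packaging.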