We've got an application at work that just sits and does a whole bunch of iterative processing on data files to run simulations. It's an "old" Win32 application that isn't multi-processor aware, so our new(ish) multi-core computers and workstations mostly sit idle while running it.
However, since it's installed by a typical Windows InstallShield installer, I can't seem to install and run multiple copies of the application.
The work can be split up manually before processing, enabling the work to be distributed across multiple machines, but we still can't take advantage of multiple core CPUs. The results can be joined back together after processing to make a complete simulation.
Is there a product out there that would let me "compartmentalize" an installation (or four) so I can take advantage of a multi-core CPU? I had thought of using Microsoft SoftGrid, but I believe that still depends on a remote server to do the heavy lifting (please correct me if I'm wrong).
Furthermore, is there a way I can distribute the workload off the one machine? So an input could be split into 50 chunks, handed out to 50 machines, and worked on?
All without really changing the initial application?
In a perfect world, I'd get the application to take advantage of a DesktopGrid (BOINC), but like most "mission critical corporate applications", the need is there, but the money is not.
Thank you in advance (and sorry if this isn't appropriate for serverfault).
Long story short: if the process isn't multi-processor aware or distributed-computing aware, then no, you can't make it so yourself.
You could use simple virtualisation (Virtual PC, VMware's desktop products) to run several copies 'side by side' on the same hardware, but this will not be "multi-processor aware" or "distributed" computing; it'll be several completely separate processes working on different work queues. It's hard to tell from your post, but I think that may well be all you're trying to do anyway.
You may get better help on Stack Overflow. Basically, what you want to do is install N copies of the software and write a "front end" that takes the workload, chops it into N pieces, and hands them to the isolated "back-end" copies of the program (automating the manual division you described).
The actual implementation (the separate installations and the front end) depends on how your software is written. Also, if the finished work needs to be re-integrated (you have four chunks that need to become one piece), the custom front end needs to be smart enough to do that too...
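Since you said the work can already be split and rejoined by hand, the front end can be a fairly dumb script. Here's a minimal sketch of the idea in Python; everything in it is an assumption (the `sim.exe` name, the `sim.exe <input> <output>` command line, the per-copy install paths, and line-based splitting), so treat it as a shape to adapt, not working code for your application:

```python
import subprocess
from pathlib import Path

# Hypothetical: assumes four isolated copies of the simulator exist
# (separate installs, VMs, etc.), each runnable as: sim.exe <input> <output>
COPIES = [Path(f"C:/sims/copy{i}/sim.exe") for i in range(4)]

def split_input(infile: Path, n: int) -> list[Path]:
    """Split the input file into n chunks, one per worker (line-based here)."""
    lines = infile.read_text().splitlines(keepends=True)
    chunk_size = -(-len(lines) // n)  # ceiling division
    chunks = []
    for i in range(n):
        part = infile.with_suffix(f".part{i}")
        part.write_text("".join(lines[i * chunk_size:(i + 1) * chunk_size]))
        chunks.append(part)
    return chunks

def run_all(chunks: list[Path]) -> list[Path]:
    """Launch one simulator copy per chunk concurrently and wait for all."""
    procs, outputs = [], []
    for exe, chunk in zip(COPIES, chunks):
        out = chunk.with_suffix(".out")
        procs.append(subprocess.Popen([str(exe), str(chunk), str(out)]))
        outputs.append(out)
    for p in procs:
        p.wait()
    return outputs

def join_outputs(outputs: list[Path], result: Path) -> None:
    """Concatenate the per-chunk results back into one file."""
    result.write_text("".join(o.read_text() for o in outputs))
```

The same split/join pair also covers your 50-machine case: instead of `subprocess.Popen` on local copies, you'd copy each chunk to a machine (network share, psexec, whatever you have) and collect the outputs afterwards.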
You've said you can't install more than one copy of the software, but can you run more than one copy? You shouldn't need to do the former to do the latter, I would think. If something in the software prevents multiple running copies, you might be able to hack up the EXE with a resource editor to remove or bypass that limitation.