I love the smell of new machines in the morning.
I'm automating a machine creation workflow that involves several separate systems across my infrastructure, some of which involve 15-year-old Perl scripts on Solaris hosts, PXE-booting Linux systems, and PowerShell on Windows Server 2008.
I can script each of the individual parts, and integrating the Linux and Unix automation is fairly straightforward, but I'm at a loss as to how to reliably tie the PowerShell scripts into the rest of the process.
I would prefer if the process began on a Linux host, since I imagine that it will end up as a web application living on an Apache server, but if it needs to begin on Windows, I am hesitantly okay with that.
I would ideally like something along the lines of psexec for Linux to run against Windows, but the answer in that direction appears to be Cygwin, and as much as I appreciate all of the hard work they put in, it has never felt right, if you know what I mean. It's great for a desktop and gives a lot of functionality, but I feel like Windows servers should be treated like Windows servers and not bastardized Unix machines (which, incidentally, is my argument against OS X servers too, and they're actually Unix). Anyway, I don't want to go with Cygwin unless that's the last and only option.
So I guess what I'm asking is if there is a way to execute jobs on Windows machines from Linux. Without Cygwin. I'm open to ideas and suggestions, including "Look idiot, everyone uses Cygwin, so suck it up and deal with it". Thanks in advance!
I've spent hours pounding on this very problem and it eventually came down to two viable options (there are a lot of non-viable options out there):
With the second option you're stuck fighting your way through the GNU/POSIX abstraction layer to get at the actual Windows bits, which restricts what you're able to do with it.
The first option is essentially a web-based abstraction layer that you write yourself on top of a full native-stack Windows install. If you're willing to put in the work, the Linux master server only has to make a bunch of curl calls to do what needs doing. This works best when the scripts are fire-and-forget, though; building a call-back system is a lot more effort.
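To make the "bunch of curl calls" idea concrete, here's a minimal sketch of the Linux-side caller. The endpoint layout (`/run/<script>` on an admin host) and all the host/script names are assumptions — this presumes you've stood up some small web handler on the Windows side that maps those URLs to PowerShell invocations:

```python
# Sketch of the Linux-side "fire and forget" caller. The /run/<script>
# URL scheme and host names are hypothetical; the Windows end would be
# whatever handler you write to launch the named PowerShell script.
from urllib.parse import urlencode

def build_job_url(host, script, params=None):
    """Build the URL the master server would hit (with curl, urllib, etc.)."""
    query = ("?" + urlencode(params)) if params else ""
    return "https://%s/run/%s%s" % (host, script, query)

# Example: kick off VM provisioning on an assumed Windows admin host.
url = build_job_url("win-admin01.example.com", "new-vm.ps1",
                    {"name": "web07", "template": "2008r2"})
```

From there, the actual call is just `curl "$url"` or `urllib.request.urlopen(url)`; the hard part, as noted above, is getting results back.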
You can also buy cross-platform scheduling or workflow-automation software that can kick off native scripts on many hosts based on previous actions, or even their returned results. Large enterprises use software like Tivoli, UC4, and Espresso (now CA dSeries) for exactly this, and I've used it at large enterprises that needed to do this sort of thing. FYI, these often have native support for things like Oracle jobs, which should give you an idea of the price tag you might be looking at.
(In my past job, they also used Cygwin anyway, so that they could use the same Perl scripts without modification when workloads moved between platforms. Lots of fun.)
You could also try to build your own, as @sysadmin1138 suggests; that would be a fun project, and might even end up robust enough to be usable and not get you paged at 2 AM when the financial exports fail on the first try.
I would use the PowerShell Web Access feature introduced in PowerShell 3.0. It gives you a PowerShell console in a web browser, so you can drive your PowerShell scripts from a Linux host.
PowerShell Server lets you SSH into a Windows server and get a PowerShell console. I haven't used it beyond the free trial, but my informal use convinced me it was a fairly reliable product.
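If you go the SSH route, the Linux side reduces to an ordinary `ssh` invocation whose remote shell happens to be PowerShell. A small sketch (host, user, and the service check are all made up for illustration):

```python
# Sketch: building the ssh command line to run a PowerShell one-liner on a
# Windows host that has an SSH server (such as PowerShell Server) listening.
def run_remote_ps(host, command, user="provision"):
    """Return the argv list you'd hand to subprocess.call().
    The remote shell is PowerShell, so the command passes through as-is."""
    return ["ssh", "%s@%s" % (user, host), command]

argv = run_remote_ps("win-admin01.example.com",
                     "Get-Service W32Time | Select-Object Status")
# subprocess.call(argv) would run it for real; the remote exit code
# propagates back as ssh's exit status, which is handy for orchestration.
```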
How gross do you want to feel afterwards, because there's always telnet :)
Seriously though, why do you need the Linux server to call the PowerShell script at all? Can you redesign your workflow so that the Linux server simply delivers the correct boot.wim image via TFTP to a PXE-booted host? I've had good luck in the past keeping a Windows image with different answer files on a Windows file server and delivering a custom WinPE boot image using tftpd from a Linux host. Then you can have the answer file call the correct PowerShell script, and you don't have to deal with cross-platform yuckiness like Cygwin.
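The Linux side of that design boils down to "given this PXE-booting host, which boot image does it get?" A tiny sketch of that selection logic — the MAC-to-role inventory, the image filenames, and the idea that each image's answer file calls a role-specific PowerShell script are all assumptions for illustration:

```python
# Sketch: map a PXE-booting host's MAC address to the WinPE boot image
# (and hence the answer file / PowerShell script) it should receive.
ROLE_IMAGES = {
    "web": "boot-web.wim",  # its answer file would call web-server.ps1
    "db":  "boot-db.wim",   # its answer file would call db-server.ps1
}

def image_for(mac, inventory):
    """Pick the boot image for a host, falling back to a generic WinPE."""
    role = inventory.get(mac.lower(), "generic")
    return ROLE_IMAGES.get(role, "boot-generic.wim")

inventory = {"00:25:b3:aa:bb:cc": "web"}
```

In practice this lookup would live in whatever generates your tftpd/PXE config, so the orchestration stays entirely on the Linux host.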
You could use something like NRPE to remotely execute the PowerShell script on the Windows host. You might want to modify your PowerShell scripts to return the exit codes NRPE expects, but there's no reason you couldn't call check_nrpe from your scripts on your Linux host.
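The exit-code convention and the Linux-side call look roughly like this. The NRPE command name and arguments are hypothetical (they'd come from your NSClient++/NRPE config on the Windows host); the four state codes are the standard Nagios ones:

```python
# Sketch: NRPE expects Nagios-style exit codes, so your PowerShell scripts
# would exit with one of these, and the Linux side calls check_nrpe.
NRPE_OK, NRPE_WARNING, NRPE_CRITICAL, NRPE_UNKNOWN = 0, 1, 2, 3

def check_nrpe_argv(host, command, args=()):
    """Argument list for the check_nrpe binary from the Nagios plugins.
    'command' must match a command defined in the remote NRPE config."""
    argv = ["check_nrpe", "-H", host, "-c", command]
    if args:
        argv += ["-a", *args]
    return argv

argv = check_nrpe_argv("winhost01", "run_provision", ("web07",))
```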
On the topic of counter-intuitive hacks, have you considered abusing Continuous Integration software as a cross-platform orchestration tool?
Install the CI master wherever is most convenient and install the agent on your Windows box. Then configure a job to execute your PowerShell script on the Windows agent (either directly invoking it via the Windows batch command config, or using a plugin if you want to write/keep your script inside the CI app), and trigger the job remotely via curl or similar.
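The "trigger remotely via curl" step, sketched out. This assumes a Jenkins-style remote-trigger URL (`/job/<name>/buildWithParameters` with an auth token); the server, job name, token, and parameter are all placeholders:

```python
# Sketch: building a Jenkins-style remote-trigger URL. Other CI servers
# have similar remote-trigger endpoints; check your server's docs.
from urllib.parse import urlencode

def jenkins_trigger_url(base, job, token, params=None):
    """URL to kick off a parameterized build; hit it with curl or urllib."""
    q = {"token": token}
    if params:
        q.update(params)
    return "%s/job/%s/buildWithParameters?%s" % (base, job, urlencode(q))

url = jenkins_trigger_url("https://ci.example.com", "provision-windows",
                          "s3cret", {"HOSTNAME": "web07"})
```

A nice side effect of abusing CI this way: you get job history, console logs, and retry behavior for free.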
I work in a large enterprise where this issue is common. For the processes we currently support, our approach is to have the Unix systems make web calls to an "admin" Windows server running ColdFusion on IIS. We have classes and functions that are triggered by GET requests and use the `cfexecute` directive to launch specific PowerShell scripts. It's ugly, but it works. We're looking at the PowerShell v3 web service features to migrate away from having ColdFusion act as a middleman.