Can a Windows cluster ever be cost-competitive, or should I just install Linux and Mono?
My application is embarrassingly parallel: it requires virtually no coordination or messaging between processors, and the work can be divided evenly between machines so they all finish at roughly the same time. The time to divide the problem and combine the results is small compared to the compute time, so I don't mind if something like WCF (binary SOAP) is used to distribute the work rather than some cluster-specific operating system, which I assume would be more efficient.
It's a question of moving the existing Windows/C# implementation, which is too slow on a single computer, over to a cluster.
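For concreteness, here's a minimal sketch of the kind of split I have in mind; Split and the range handling are hypothetical placeholders for my actual job, and each range would be shipped to one machine (e.g. as a WCF call):

    // Hypothetical sketch (not my real code): divide N work items into
    // near-equal ranges, one per machine, so all machines finish together.
    using System;
    using System.Collections.Generic;

    class Partitioner
    {
        // Yields [start, end) ranges covering [0, total); sizes differ by at most 1.
        static IEnumerable<Tuple<int, int>> Split(int total, int machines)
        {
            int baseSize = total / machines, remainder = total % machines, start = 0;
            for (int i = 0; i < machines; i++)
            {
                int size = baseSize + (i < remainder ? 1 : 0);
                yield return Tuple.Create(start, start + size);
                start += size;
            }
        }

        static void Main()
        {
            // Each range would be sent to one machine; here we just print them.
            foreach (var r in Split(1000003, 8))
                Console.WriteLine("[{0}, {1})", r.Item1, r.Item2);
        }
    }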
Opinions please...
It sounds like a problem that would work well in Azure.
Use one web role where you submit the job. The web role splits the job into parts and places them in a queue.
At the back end you have worker roles that take the parts from the queue and do the work, as sketched below.
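Here's a minimal sketch of that fan-out, assuming the legacy Microsoft.WindowsAzure.StorageClient library (method names such as CreateIfNotExist vary between SDK versions, and the connection string and message payload are placeholders):

    // Sketch only: the web role enqueues one message per part;
    // each worker role instance runs WorkerLoop() and drains the queue.
    using System;
    using System.Threading;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class QueueFanOut
    {
        static CloudQueue GetQueue()
        {
            var account = CloudStorageAccount.Parse("<storage connection string>"); // placeholder
            var queue = account.CreateCloudQueueClient().GetQueueReference("workitems");
            queue.CreateIfNotExist(); // CreateIfNotExists() in later SDK versions
            return queue;
        }

        // Web role side: split the job and enqueue the parts.
        static void EnqueueParts(int partCount)
        {
            var queue = GetQueue();
            for (int i = 0; i < partCount; i++)
                queue.AddMessage(new CloudQueueMessage("part:" + i));
        }

        // Worker role side: poll, process, delete on success.
        static void WorkerLoop()
        {
            var queue = GetQueue();
            while (true)
            {
                CloudQueueMessage msg = queue.GetMessage();
                if (msg == null) { Thread.Sleep(1000); continue; }
                DoWork(msg.AsString);     // the embarrassingly parallel piece
                queue.DeleteMessage(msg); // only delete after the work succeeds
            }
        }

        static void DoWork(string part) { /* compute kernel goes here */ }
    }

Because a message only leaves the queue once a worker deletes it, a worker that crashes mid-part simply lets the message reappear for another worker, so you can add instances to scale throughput without any coordination between them.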
You do not have to invest in hardware, and you only pay for what you use. You could, for example, run 100 machines for an hour; that costs roughly the same as running one machine for 100 hours, but the job finishes 100 times sooner.
If you're looking at the acquisition and setup cost, then the answer is always going to be no.
Windows servers cost more than Linux servers pretty much across the board, and as the servers you use get more advanced, the premium you pay for Windows over the cost of the hardware goes up as well: tools like SQL Server get really expensive when you start to scale up. The Windows culture is also more of a for-pay marketplace, so additional tools you may need for some side task are more likely to cost real money if they're written for Windows than for Linux, regardless of the quality of the program.
And when you're paying more for every server, every tool, and every license, there is no break-even point after which the cost is suddenly less.
That said, running C# under Mono on a cluster of Linux servers is probably going to require some real expertise when it comes to performance tuning, and Linux experts tend to be a little rarer and more expensive than their Windows counterparts. That goes double for admins with experience in something like Mono, which is not exactly the favorite player in the Linux community.
Running C# on Windows, on the other hand, is what the Windows server ecosystem is all about. There are entire courses, certifications, and communities dedicated to that one concept, so finding a trained, experienced, and certified administrator is surprisingly easy and inexpensive by comparison.
For what it's worth, I prefer to code in C# using ASP.NET MVC2, but I run my C# code under Mono on Linux. I manage both Windows and Linux servers for a living, and Windows servers are dramatically less reliable and more time-intensive to maintain. That's not FUD; it's just my personal experience.