Responsibility for IIS and web applications has been a tricky issue in the shops I've worked in over the years.
On one hand, IIS is a service built into the server (by and large) and is typically the responsibility of the server administrators to maintain and configure. When an issue comes up, they know what needs to happen, or can at least diagnose to the point where they say, "Something is wrong with the web app" and have the developer debug their code.
However, each web application on the server is unique, with nuances that can make diagnosis complex depending on the issue at hand.
On the other hand, each web application is unique in many ways and has specific issues that need to be dealt with, and the developer is the person who knows the most about the application. If the web.config file needs to be modified for debugging, or IIS starts giving the web application grief, the developer should be able to work out whether the issue lies with IIS or with the application itself and fix it accordingly.
However, allowing a developer to go in and tweak IIS on their own becomes a serious issue, because some settings/optimizations can seriously muck up server performance and stability.
So where does the balance lie? Should the server admins be IIS gurus and handle all of those issues while I simply hand over the site files for deployment, or should the developer assume responsibility for the server and IIS issues and deal with them accordingly?
Sounds like what you really need is someone with expertise on both sides of the fence.
In my experience (with smaller-sized companies), the IT/sysadmin staff doesn't have the time, interest, or webapp-specific knowledge to properly maintain IIS setups. They'll take things as far as the operating system and hand off IIS to me, the developer.
Obviously, I need to be "more than just a coder" to make this work properly; I have to be aware of system-level issues (security and whatnot). I've been doing low-level systems management for years so I'm confident with this sort of task (in fact, I've taught professional sysadmins a few things over the years). However, not every developer has this capability.
Still, from what I've seen, there are more developers with sysadmin skills than there are sysadmins with (webapp) development skills.
As always, YMMV.
I would personally not want a developer to mess around with IIS, especially if it might cause problems with another application, leaving yet another developer to troubleshoot, and so on.
If there are IIS problems, have the SysAdmin look into it, and if there is a problem with a particular app, send it back to the dev. If the dev has an issue, bring it up to the SysAdmin, who can then attempt to make an informed decision as to whether to make any changes and figure out how it will affect everyone.
We (the sysadmins) treat our developers just as we would a 3rd party vendor - when they want us to deploy an app, they have to provide documentation if they expect it to be supported. This includes common troubleshooting routines and a support escalation path (uptime requirements combined with a documented developer responsibility in the case of an unacceptable outage).
It's obviously not black and white, but it's done a lot to ease the tension between devs and admins. The devs now realize that they have to provide software of a quality inversely proportional to their willingness to be paged after hours, and the admins now have tools and docs to go through without feeling on the hook for tools they didn't create.
So, in your scenario, that would mean the devs create their app on their own IIS server and then provide the software and documentation for the admins to install on the production server.
Answer: find one person and anoint them "WSA" (Web Server Administrator). They could be an admin or a developer; it really doesn't matter. But they need to immerse themselves in both aspects of the job, and the rest of the team (on both sides) needs to respect their expertise.
It's no different than how DBAs straddle the line between IT/dev. Given the importance of web servers in an organization with a web-based product, I think this is a critical, and often overlooked, role.
Since the web is still young (compared to databases), it's hard to recruit this individual. You will most likely need to grow/groom someone into the role.
With new utilities like the Web Deployment Tool (which will become the standard built-in way to publish a web app starting in Visual Studio 2010), Microsoft seems to be heading down the path of letting developers, or at least installation engineers, choose things like IIS settings (certs, app pool settings, etc.). Those settings get built into the MSDeploy package and are applied to the IIS server automatically when the package is deployed.
Seems like a reasonable compromise. Developers don't go manually mucking with settings on the live production servers, and sysadmins don't need the webapp-specific knowledge. And yet the desired IIS settings are clearly visible to any sysadmin who wants to understand what is going to happen before the package gets installed.
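To make that concrete, here's a rough sketch of the workflow, assuming MSDeploy is installed on both boxes. The site name "Default Web Site/MySite" and the package path are just placeholders, and the exact providers you use may vary with your setup:

    rem On the dev box: package the app's content plus its IIS settings
    msdeploy -verb:sync -source:iisApp="Default Web Site/MySite" -dest:package=C:\packages\MySite.zip

    rem On the production server: the admin can dump the package contents to review it...
    msdeploy -verb:dump -source:package=C:\packages\MySite.zip

    rem ...or do a dry run with -whatif to see exactly what would change
    msdeploy -verb:sync -source:package=C:\packages\MySite.zip -dest:auto -whatif

    rem If the preview looks right, run the same sync without -whatif to deploy
    msdeploy -verb:sync -source:package=C:\packages\MySite.zip -dest:auto

The nice part of this split is that the dry run gives the admin a veto point: nothing touches the server until they've seen the list of changes.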