Responsibility for IIS and web applications has been a tricky issue in the shops I've worked in over time.
On one hand, IIS is (by and large) a service built into the server and is typically the responsibility of the server administrators to maintain and configure. When an issue comes up, they know what needs to happen, or can at least diagnose it to the point where they say, "Something is wrong with the web app" and hand it to the developer to debug.
However, each web application on the server is unique, and its nuances can make those issues complex to diagnose.
On the other hand, each web application has specific issues that need to be dealt with, and the developer is the person who knows the most about the application. If the web.config file needs to be modified for debugging, or IIS starts giving the application grief, the developer should be able to pinpoint whether the problem lies with IIS or with the application itself and fix it accordingly.
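As a concrete (if hypothetical) sketch of the kind of web.config change I mean, turning on debug compilation and surfacing detailed errors is often all that's needed for troubleshooting; whether detailed errors are acceptable in a given environment is an assumption on my part:

    <!-- Hypothetical web.config fragment: enable debug builds and show full error pages.
         Both settings are for troubleshooting only and should be reverted afterwards. -->
    <configuration>
      <system.web>
        <compilation debug="true" />
        <customErrors mode="Off" />
      </system.web>
    </configuration>

A change this small sits entirely inside the application's own folder, yet it still affects how the app behaves under IIS, which is exactly where the responsibility question gets murky.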
However, allowing a developer to go in and tweak IIS on their own is a serious concern, because some settings/optimizations can seriously hurt server performance and stability.
So where does the balance lie? Should the server admins be IIS gurus who handle all of those issues while I simply hand over the site files at deployment, or should the developer take responsibility for the server and IIS issues and deal with them accordingly?