As I understand it, everything from IIS 7 onward can simultaneously support 32-bit and 64-bit applications... So why is "Enable 32 bit applications" still an option in IIS? Why would it not just automatically support both?
Several third party components still have no 64-bit option and in that case a 32-bit environment is required in order for those to function.
Take Excel, for example. If you're writing a web app that dumps its output to Excel sheets and you need the Microsoft Office Interop components, you're forced to run in a 32-bit environment. Likewise, if you have a web app that uses ADO to read or write Excel files, you're also going to be working in 32-bit, since the classic Jet OLE DB provider only exists as a 32-bit component.
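If you do hit that situation, the switch is per application pool rather than machine-wide. A minimal sketch using the built-in appcmd tool, assuming a pool named "MyAppPool" (substitute your own pool name):

    %windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /enable32BitAppOnWin64:true

That makes only that pool's worker process run as 32-bit (under WOW64); other pools on the same server keep running 64-bit, which is how IIS supports both side by side.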
In response to the last part about why it doesn't automatically support both (and in conjunction with DJ Pon3's answer), Microsoft is actively pushing for pure 64-bit environments. With that in mind, it follows that 32-bit support would be disabled by default, which makes my example above rather humorous.
Why would you want your application framework / environment to support things you know you won't use? Every time you avoid running unneeded code you potentially improve performance by using fewer resources, you improve robustness because less code means less potential for errors, and you improve security by presenting a smaller attack surface.