I'm hosting two web apps in IIS 7.5 that must share the ASP.NET Forms Authentication cookie. Since I'm not using a web farm and don't want the machine key visible in Web.config, I've set it to auto-generate with the following setting:
<machineKey validationKey="AutoGenerate" decryptionKey="AutoGenerate" />
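For context, here is roughly how that setting sits in each app's Web.config. The forms element is illustrative (the loginUrl and timeout are assumptions); the cookie name must match across both apps, though the default .ASPXAUTH already does:

<system.web>
  <authentication mode="Forms">
    <!-- Same cookie name in both apps; .ASPXAUTH is the default anyway -->
    <forms name=".ASPXAUTH" loginUrl="~/Login.aspx" timeout="30" />
  </authentication>
  <machineKey validationKey="AutoGenerate" decryptionKey="AutoGenerate" />
</system.web>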
According to the documentation, this should cause IIS to auto-generate the same machine key for the two web apps. On my Windows 7 and Windows 8 developer machines, this works as expected. Either web app can authenticate the user, and both apps recognize the same auth cookie.
But when I move this setup to production on Windows Server 2008 R2, the two apps do not share the authentication cookie. When WebAppA authenticates the user, WebAppB cannot decrypt the auth cookie and forces the user to re-authenticate. When the user does, the new auth cookie is not valid for WebAppA.
I've confirmed that the apps are using separate machine keys by explicitly setting the machine key in both apps to the same value:
<machineKey validationKey="F801AB..." decryptionKey="ADB3C1..." />
When I do this, the authentication works as it does on my developer machines.
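Aside: for anyone wanting to generate such a key pair, here is a small C# sketch using RNGCryptoServiceProvider. The byte lengths assume the common HMACSHA1 validation / AES decryption configuration; they are not values from my actual config:

using System;
using System.Security.Cryptography;

class MachineKeyGen
{
    static void Main()
    {
        // 64 random bytes (128 hex chars) for validationKey, 32 bytes for decryptionKey
        Console.WriteLine("validationKey=\"" + RandomHex(64) + "\"");
        Console.WriteLine("decryptionKey=\"" + RandomHex(32) + "\"");
    }

    static string RandomHex(int byteCount)
    {
        var bytes = new byte[byteCount];
        using (var rng = new RNGCryptoServiceProvider())
        {
            rng.GetBytes(bytes);
        }
        // BitConverter produces "AB-CD-..."; strip the hyphens for the config attribute
        return BitConverter.ToString(bytes).Replace("-", "");
    }
}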
What difference between my development and production environments might be causing this?
When IIS auto-generates a machine key, the key is stored under HKEY_CURRENT_USER for the worker process identity. To share an auto-generated machine key, therefore, the web apps must run as the same user. Details are in this blog post.
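To see which key a given identity holds, you can inspect that registry hive; treat this as a sketch, since the exact value name (AutoGenKeyV4 for .NET 4) is from memory. Run it as the app pool identity, or browse HKEY_USERS\<SID> for that account:

reg query "HKCU\Software\Microsoft\ASP.NET\4.0.30319.0" /v AutoGenKeyV4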
In my case, the two web apps were running in separate app pools under different identities, so they generated different machine keys. On my developer machines, both were running under the same identity.
Putting the two apps in the same app pool (or running both app pools under the same identity) made them use the same auto-generated machine key.
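For reference, moving both apps into one shared pool can be scripted with appcmd; the site, app, and pool names below are placeholders:

%windir%\system32\inetsrv\appcmd set app "Default Web Site/WebAppA" /applicationPool:"SharedPool"
%windir%\system32\inetsrv\appcmd set app "Default Web Site/WebAppB" /applicationPool:"SharedPool"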