We have a server in our office that has been made remotely accessible by assigning it a dedicated IP address. A major government organization is connected to it 24x7, and the server processes concurrent requests from all over the state.
This server was initially configured for .NET Framework 2.0. We want to install a new application that is built on .NET 3.5 or 4. Can both versions of the .NET Framework, along with Crystal Reports, work on the same server?
What additional configuration is needed?
I don't recall any configuration being needed on a server after installing the .NET frameworks (there are 7 of them now, BTW). Having said that, an app is targeted at a specific framework version, not a specific version of VS. It's possible they used VS2010 and targeted the 2.0 framework, etc.
When you deploy the apps, you may have to tell IIS which version of the .NET Framework to use because, if I recall correctly, it always seemed to default to version 2 of .NET.
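On IIS 7 and later you can do this per application pool with `appcmd`. A sketch, assuming hypothetical names `NewAppPool` and `Default Web Site/newapp` (substitute your own):

```shell
REM Create a separate app pool for the new app and pin it to the v4.0 CLR.
%windir%\system32\inetsrv\appcmd add apppool /name:"NewAppPool" /managedRuntimeVersion:v4.0

REM Assign the new application to that pool; existing 2.0 apps keep their own pool.
%windir%\system32\inetsrv\appcmd set app "Default Web Site/newapp" /applicationPool:"NewAppPool"
```

Note that 3.5 apps actually run on the v2.0 CLR, so a 3.5 app would use `managedRuntimeVersion:v2.0`, while 4.0 (and later) apps use `v4.0`.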
BUT Test, test, test before you push this to your production server and ask the developer if you need to install a later framework for that app!
Edited to add: @TomTom (below in comments) is correct too: when you deploy the app(s) you can set the version of .NET to use per application/app pool. But as I mentioned above, ask the developer what version they targeted, because it's quite possible they targeted version 2.0 and still used VS2010.
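One quick way to check what an ASP.NET app targets is its web.config: apps built for 4.0 typically declare it explicitly, while 2.0-targeted apps simply omit the attribute. A hypothetical fragment:

```xml
<!-- web.config fragment of an app targeting .NET Framework 4.0 -->
<configuration>
  <system.web>
    <compilation targetFramework="4.0" />
  </system.web>
</configuration>
```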
TL;DR: I've had all versions of .NET happily running side-by-side, but TEST it in your environment.
Why are you installing Visual Studio on servers to run your applications? If you compile them properly and install just the necessary .NET Framework runtimes, they will run fine, without the overhead and licensing cost of full VS licenses. You can have two versions of VS installed simultaneously, but this is just a bad, bad, bad way of doing what you want.
Yes they can. No configuration of the .NET Framework should be required.