The biggest benefit of virtualization is usually said to be improved server utilization.
But why do I need virtualization for that?
Say I have N physical servers that are lightly used. Why don't I just combine all the apps from those N servers onto one physical server? That way I don't incur the performance penalty of virtualization.
What does virtualization buy me in this case?
Virtualization can be great for separating those applications.
Maybe your applications can't all be installed on the same server, or maybe for security purposes you don't want them on the same server: if one gets hacked, only that one gets hacked.
Maybe you host applications for other people and want to give each person their own individual “machine.”
Maybe you have 10 of the same server and only need one most of the time, but sometimes when your load gets higher you need a few more. This way you don't need to keep nine extra physical machines around; everything stays consolidated on one host (or a few), and you just start additional VMs when demand spikes.
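As a minimal sketch of that pattern (assuming libvirt/KVM with the libvirt Python bindings; the standby VM names and the load threshold here are hypothetical), you can keep the extra guests defined but shut off and start one only when load demands it:

```python
import libvirt  # libvirt Python bindings (pip install libvirt-python)
import os

# Hypothetical pool of pre-defined but shut-off guests; names are made up.
STANDBY_VMS = ["web-02", "web-03", "web-04"]

def start_standby_vm(conn: libvirt.virConnect) -> None:
    """Start the first standby VM that isn't already running."""
    for name in STANDBY_VMS:
        dom = conn.lookupByName(name)
        if not dom.isActive():
            dom.create()  # boots a defined-but-stopped domain
            print(f"started {name}")
            return
    print("no standby VMs left")

if __name__ == "__main__":
    # Connect to the local hypervisor; the URI depends on your setup.
    conn = libvirt.open("qemu:///system")
    # Crude trigger: 1-minute load average over an arbitrary threshold.
    if os.getloadavg()[0] > 4.0:
        start_standby_vm(conn)
    conn.close()
```

In a real deployment you'd likely hand this off to proper monitoring or an orchestration layer, but the point stands: spare capacity is a config entry, not idle hardware.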
There are many reasons to use virtualization (and probably just as many reasons not to).
The ability to run two separate things that have different requirements and dependencies. It is particularly valuable when those requirements conflict. For example, let's say you have some old enterprisey app that only runs on Windows 2000 Server, but you also have a shiny new application that requires Windows Server 2008, the .NET Framework, and so on. Those two applications cannot run within the same OS, but with VMs they can run on the same hardware.
It is useful if your various services have different maintenance cycles. With lots of VMs you can update/restart your secondary DNS server VM without much impact. But if that DNS server is also your file server, print server, mail server, and so on, then scheduling that update will be a lot more difficult.
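A hedged sketch of that kind of maintenance, again assuming libvirt with the Python bindings (the guest name dns-secondary is made up): only this one VM restarts, while the file, print, and mail VMs on the same host keep running.

```python
import libvirt

# Hypothetical guest name; only this VM is affected by the reboot,
# so every other service consolidated on the host stays up.
conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("dns-secondary")
dom.reboot(0)  # graceful reboot of this one guest only
conn.close()
```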
IMHO, the biggest benefit from virtualization is isolation.
Your idea of just putting all N sets of applications together has all sorts of risks. What if one of them suddenly needs a patch that requires a reboot? On a single server, patching that one app means rebooting all the others too; with multiple VMs, you reboot just the one. What if one app runs best on RHEL 5.2, while another likes SUSE 10.0? What if one app absolutely requires version X of Oracle, and another absolutely can't use version X? (We've all seen this kind of thing!)
Being able to use one physical system to run N application sets, where those apps have no chance (or at least a highly unlikely one) of interfering with each other, is often a huge win.
In the past, apps were siloed so that if, say, app "A" needed a server reboot, app "B" would not be penalized. That said, more and more applications are becoming resilient and isolated, so server reboots are less likely to be required. It's a mindset problem that's hard to overcome.
There's a perceived security benefit from running a virtualized environment.
There's a perception that, having failed since the 1960s to create a secure OS, we can all of a sudden create secure virtual machines, where one virtualized environment cannot interfere with another. This is nonsense, of course. We can't even create CPUs that completely protect two running processes from each other.
More complexity and more lines of code simply means more bugs.
When you want a clean environment for compiling or testing, a virtual machine is handy for avoiding additional hardware purchases, but that's really the extent of it. In many cases, properly designed applications could be run exactly as you describe, each under its own user ID, with the OS deciding which resources each one can access.
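A minimal sketch of that non-virtualized alternative, assuming a Unix host and Python 3.9+ (the account names and start commands are hypothetical): each application runs under its own unprivileged user ID, and ordinary kernel permission checks provide the isolation.

```python
import subprocess

# Hypothetical app accounts and start commands; each app gets its own
# unprivileged user, so filesystem permissions and process ownership
# keep them out of each other's way without a hypervisor.
APPS = {
    "appa": ["/opt/appa/bin/start"],
    "appb": ["/opt/appb/bin/start"],
}

# Must be run as root so each child can drop to its own account.
for user, cmd in APPS.items():
    # Python 3.9+: setuid/setgid to this account before exec
    subprocess.Popen(cmd, user=user, group=user)
```

That buys you far weaker isolation than a VM (shared kernel, shared OS version, shared reboots), which is exactly the trade-off the other answers describe.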