Recently we ran into a problem where one of our Linux-based virtual machines was really slow due to a chronic shortage of "entropy".
I'm wondering if Windows virtual machines would suffer from the same problem. (A Google search gave me no relevant hits, but I could be using the wrong search terms.)
The documentation for the Windows cryptographic APIs (CryptGenRandom and its CNG successor BCryptGenRandom) does not suggest that the calls for generating a key or generating random data can fail or block due to insufficient entropy; the system RNG gathers entropy from a variety of internal sources and never makes the caller wait. So, no, Windows does not suffer from the problem you're describing.
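For illustration, here's a minimal C sketch of requesting random bytes through CNG. BCryptGenRandom can fail for bad arguments, but there is no status code meaning "not enough entropy". (The buffer size and build command are just my choices for the example.)

```c
/* Minimal sketch: asking the Windows CNG API for random bytes.
   There is no "out of entropy" error -- the call never blocks.
   Build (MSVC): cl randdemo.c /link bcrypt.lib */
#include <windows.h>
#include <bcrypt.h>
#include <stdio.h>

int main(void)
{
    UCHAR buffer[32];   /* enough bytes for a 256-bit key */
    ULONG i;
    NTSTATUS status = BCryptGenRandom(
        NULL,                             /* no algorithm handle needed... */
        buffer,
        sizeof(buffer),
        BCRYPT_USE_SYSTEM_PREFERRED_RNG); /* ...when using this flag */

    if (status != 0) {                    /* 0 == STATUS_SUCCESS */
        fprintf(stderr, "BCryptGenRandom failed: 0x%08lx\n",
                (unsigned long)status);
        return 1;
    }

    for (i = 0; i < sizeof(buffer); i++)
        printf("%02x", buffer[i]);
    printf("\n");
    return 0;
}
```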
This may mean that in a virtual machine, or some other environment starved of external entropy, some cryptographic functions are not as secure as is desirable. However, I've never seen any analysis of this. I'm inclined to think that modern computers are complicated enough for their internal entropy sources to be adequate, and that Linux is just being overly cautious - but I'm not a cryptographer, so my opinion doesn't really count!
Yes: like any other system, a virtual machine can run short of entropy during operations that require a lot of real randomness. This can happen on any system that needs lots of randomness but doesn't have a good source, and VMs are particularly prone to it because they see far fewer of the hardware events (disk timings, keyboard and mouse input) that the kernel harvests for entropy. You can check whether this is what's biting you with the sketch below.
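On a Linux guest the kernel exposes its running estimate of the entropy pool through procfs; you could equally just `cat` the file from a shell. A minimal C sketch, assuming a standard procfs mount (and note this is only the kernel's estimate, not a hard measurement):

```c
/* Minimal sketch, Linux-specific: print the kernel's running
   estimate of the entropy pool, in bits. On older kernels a
   value near zero means reads from /dev/random may block. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("/proc/sys/kernel/random/entropy_avail", "r");
    int bits;

    if (f == NULL) {
        perror("fopen");
        return 1;
    }
    if (fscanf(f, "%d", &bits) == 1)
        printf("entropy available: %d bits\n", bits);
    fclose(f);
    return 0;
}
```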
If you are doing something that requires lots of random numbers, I'd suggest some kind of physical random number generator.