So, I'm trying to tackle a rather odd issue. I'm fairly new to virtualization (the most I've managed is isolating some applications with Docker and VirtualBox, which isn't hard to do). I'm asking more about how the X display server works than for help with the virtualization itself, though that would be appreciated too.
I'm attempting to set up a host machine that uses an X display server provided by a virtualized guest. The host machine is going to run two virtualized Linux systems.
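To make sure I have the X client/server roles straight, here is my rough understanding as commands (the guest address 192.168.122.10 is a hypothetical example, and I'm assuming an Xorg server in the guest with TCP listening enabled):

```shell
# On the guest that runs the X *server* (hypothetical address 192.168.122.10),
# TCP listening has to be turned on, since Xorg disables it by default, e.g.:
#   X :0 -listen tcp
#   xhost +192.168.122.1      # loosen access control for the host's address
# On the host machine, applications are X *clients*; pointing DISPLAY at the
# guest makes them render on the guest's X server:
export DISPLAY=192.168.122.10:0
echo "$DISPLAY"   # any client started now (e.g. xterm) would use the guest's display
```

If that picture is right, the "server" is wherever the display actually happens, and the host machine's programs are just clients connecting over the network.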
The first Linux system is planned to host an X display server. Setting that up should be fairly simple in the end, but I'm looking for answers that go a bit deeper than "this works". While planning it, though, I realized another issue: I intend to run somewhat graphically intensive applications on this virtualized system. Which machine should the video card be dedicated to?
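From what I've read, dedicating the card to the guest would mean PCI passthrough (VFIO), which on the host boils down to kernel parameters roughly like the following (the PCI IDs here are hypothetical placeholders for the card's actual vendor:device pairs):

```
# Host kernel command-line fragment for GPU passthrough (IDs are examples only):
intel_iommu=on iommu=pt vfio-pci.ids=10de:1b80,10de:10f0
```

My uncertainty is whether that is even the right direction here, or whether the card should stay with whichever side runs the X server.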
The second system isn't going to be nearly as special, but it has a rather odd network conflict (not a bug; it's just how I have to set things up). It won't run an X server or anything else graphical.
My ultimate question: would I need to dedicate actual hardware to both instances, just to the "host" of the X server (the virtualized system), or just to the "client" of the X server (the host machine)? I'd also like to ask how I might accomplish this, though that's a little outside the scope of the question.