Setting up a development machine, I was wondering how I should connect the displays. The machine has two monitors and two graphics cards (2x GeForce 9800 GTX+), each with two DVI ports. Is it better to connect both monitors to a single card, or one monitor to each card? Is one configuration definitively better than the other, and if not, what are the benefits and drawbacks of each?
You don't say which OS or whether the two cards are in the same type of slot.
On Windows, a large virtual desktop spanning multiple monitors can be set up no matter which way you connect them in your situation. I don't have any multi-monitor experience on Linux, and I've forgotten what I knew about OS X.
Regardless of the OS, I would think that some operations that span the two monitors would have better performance if they stay on one card and don't involve the bus or the OS. So I would recommend that you use one card until you get more monitors.
However, performance depends on the particular combination of factors in a specific configuration: the OS, drivers, slot type(s), bus speed and so on all interact. The only way to tell for sure is to run a well-designed benchmark. I would find something that does a lot of OpenGL rendering and displays its frame rate, run it so that it spans the two monitors (if it will let you), and try it in each potential configuration.
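As a rough sketch of that kind of test on Linux, assuming the glxgears demo from mesa-utils and two 1920x1200 panels side by side (adjust the geometry to your actual layout):

    # Stretch an FPS-reporting OpenGL window across both monitors,
    # note the frame rate, then re-cable and repeat.
    glxgears -geometry 3840x1200+0+0

Any OpenGL application that reports its frame rate would do just as well.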
I can't be sure on this, but one monitor to each card would be the logical answer, distributing the load across both cards. I should think it makes minimal performance difference, but that's the way I'd do it.
One of the guys in the office has three monitors on his desk: two out of one card, one out of the other. Multi-monitor handling has more to do with your graphics driver and OS support. If you're creating one big virtual desktop split down the middle, I believe that has to be done on a single card. If instead you're putting two different desktops virtually right next to each other, either way will work. Both approaches will give you that great dual-head feel; one will give you new centered windows opening in the middle of one monitor, the other will give you new centered windows bisected by your monitor split. I know which way I'd go.
For simple desktop work without heavy 3D action, I don't think it matters much. If you put both monitors on a single card, you'd have to set up CrossFire/SLI to get any use out of the second card, so one monitor on each may be the simpler configuration.
Multiple monitors are pretty standard for a NOC environment, and my experience setting them up and configuring them under Linux has varied wildly. Some of the newer support under Ubuntu Linux can be quite fantastic; it makes installing the proprietary nVidia drivers very simple indeed. Invoke 'nvidia-settings' as root afterward and you'll spy a very simple way to configure the right orientation for your monitors, which works perfectly for two monitors on similar cards.
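A rough sketch of that route, with the caveat that the package name below is an assumption and varies by Ubuntu release (the Restricted Drivers manager will pick the right one for you):

    # Install the proprietary driver, then arrange the monitors from the
    # nVidia control panel running as root.
    sudo apt-get install nvidia-glx-new   # or use the Restricted Drivers manager
    sudo nvidia-settings                  # "X Server Display Configuration" page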
Going above 'dual head' is moderately complex under Linux at the moment, though dead simple under Windows, sadly. You'll probably be looking at using TwinView on each 'Device' within xorg.conf and then tying them together with Xinerama. This may hurt performance.
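To give an idea of the shape of that xorg.conf, here is a sketch only, not a drop-in config: one Screen per card, each running TwinView, glued together with Xinerama. The BusID values and MetaModes resolutions are placeholders; check yours with lspci and your monitors' native modes.

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        BusID       "PCI:1:0:0"      # placeholder, see lspci
    EndSection

    Section "Device"
        Identifier  "nvidia1"
        Driver      "nvidia"
        BusID       "PCI:2:0:0"      # placeholder, see lspci
    EndSection

    Section "Screen"
        Identifier  "Screen0"
        Device      "nvidia0"
        Option      "TwinView"  "true"
        Option      "MetaModes" "1680x1050,1680x1050"
    EndSection

    Section "Screen"
        Identifier  "Screen1"
        Device      "nvidia1"
        Option      "TwinView"  "true"
        Option      "MetaModes" "1680x1050,1680x1050"
    EndSection

    Section "ServerLayout"
        Identifier  "Multihead"
        Screen      0 "Screen0" 0 0
        Screen      1 "Screen1" RightOf "Screen0"
        Option      "Xinerama" "on"
    EndSection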
Best suggestion is what I put in my first paragraph. With lots of testing :-)
I would go with the single card to reduce noise and power consumption. I am sure a single 9800 can drive any practical resolution, at least for everyday computing and software development.
I've got three monitors myself. The optimal way is to have one monitor per video card. Just be careful what your motherboard supports: some only support ATI cards, others only nVidia.