First post, but I've been lurking here for a while.
I have been using Linux since Red Hat 7, and I've recently upgraded to Ubuntu 16.04.
It's the first Linux release I've run where the GPU is actually properly supported. No black screen of death.
My question is:
If the GPU is so much more powerful than the Intel processor (a Core i7 in my case), why can it not handle all of the graphics duties with aplomb?
I am not a gamer; I just use the PC for video, email, GIMP. You get the idea.
If I set the graphics to Nvidia, everything is laggy, especially VLC.
But regardless, the performance hit the PC takes is really noticeable. GNOME is slow, and so is Cinnamon.
On my Raspberry Pi Model B, video playback is smooth as silk (with omxplayer).
Is it because VLC and MPlayer are not specifically compiled to use the Nvidia card while certain games are? It seems a shame to be wasting all of this processing power. Or am I doing something wrong?
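In case it's relevant, here is roughly how I've been trying to check whether hardware video decoding is even available (vdpauinfo and vainfo come from packages of the same name, video.mp4 stands in for one of my own files, and I may well be holding the options wrong):

sudo apt-get install vdpauinfo vainfo
vdpauinfo | head                                # VDPAU decode support (Nvidia/nouveau side)
vainfo                                          # VA-API decode support (Intel side)
mplayer -vo vdpau -vc ffh264vdpau, video.mp4    # ask MPlayer to decode H.264 on the GPU via VDPAU
vlc --avcodec-hw=vaapi video.mp4                # ask VLC to use a hardware decoder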
I'm now just using the X.Org X server and all is good. Even the Intel chip runs the graphics faster and with less lag.
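For reference, this is how I've been checking which GPU is actually selected and doing the rendering (prime-select comes with the nvidia-prime package, glxinfo with mesa-utils):

sudo prime-select query                  # which profile nvidia-prime thinks is active
glxinfo | grep "OpenGL renderer"         # which GPU is actually rendering the desktop
lspci -k | grep -EA3 "VGA|3D"            # which kernel driver is bound to each card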
What is going on? I've hunted around for the answer; apologies if this question has been asked before.
Thanks a lot
Here is the output of lspci -vv | awk '/ VGA /{do{print; getline}while($0!="")}'
00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) (prog-if 00 [VGA controller])
Subsystem: Lenovo 2nd Generation Core Processor Family Integrated Graphics Controller
Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
Status: Cap+ 66MHz- UDF- FastB2B+ ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
Latency: 0
Interrupt: pin A routed to IRQ 35
Region 0: Memory at f1400000 (64-bit, non-prefetchable) [size=4M]
Region 2: Memory at e0000000 (64-bit, prefetchable) [size=256M]
Region 4: I/O ports at 3000 [size=64]
Expansion ROM at <unassigned> [disabled]
Capabilities: <access denied>
Kernel driver in use: i915
Kernel modules: i915
01:00.0 VGA compatible controller: NVIDIA Corporation GF108M [GeForce GT 555M] (rev a1) (prog-if 00 [VGA controller])
Subsystem: Lenovo GF108M [GeForce GT 555M]
Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx+
Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
Latency: 0, Cache Line Size: 64 bytes
Interrupt: pin A routed to IRQ 34
Region 0: Memory at f0000000 (32-bit, non-prefetchable) [size=16M]
Region 1: Memory at c0000000 (64-bit, prefetchable) [size=256M]
Region 3: Memory at d0000000 (64-bit, prefetchable) [size=32M]
Region 5: I/O ports at 2000 [size=128]
Expansion ROM at f1000000 [disabled] [size=512K]
Capabilities: <access denied>
Kernel driver in use: nouveau
Kernel modules: nvidiafb, nouveau
If you are using a laptop with Optimus technology, and you almost certainly are, nvidia-prime is an absolute ... of a driver (just a personal opinion). It doesn't support vertical sync, which is where the image tearing comes from; NVIDIA's own documentation says as much. I suggest you use the Bumblebee driver instead. It will cost you some performance, but since you say you are not a gamer, you won't notice it. I used it for years with the same generation of CPU and GPU as yours (a second-gen Core CPU and a Fermi GPU) and it worked flawlessly the whole time.
It may take some time to set up though, especially the first time. For starters, refer to the Bumblebee wiki; there is also a rough sketch of the commands below.
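If it helps, these are roughly the commands I used on a similar setup (package names are from the Ubuntu repositories as I remember them; treat the wiki as the authoritative source):

sudo apt-get install bumblebee bumblebee-nvidia primus
sudo usermod -a -G bumblebee $USER    # add yourself to the bumblebee group, then log out and back in
optirun glxgears -info                # quick check that the discrete GPU wakes up and renders
optirun vlc                           # run a single application on the Nvidia card; everything else stays on Intel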