Let's assume:
I have thousands of domains on the same Apache server.
Each domain lives in its own folder under the server's public_html document root, so it can be reached either as "www.somedomain.com" or as "www.serverdomain.com/somedomain_folder".
Each domain hosts a website that needs a certain script.php (identical for every domain).
From a coding point of view, it's obvious that a single shared script.php is better: when I update it with new features/bug fixes etc., I only need to update one file on the server and the change applies to all domains.
But from a server point of view? If I use a single shared script, all domains will access it at the same time; will the server run slower than if each domain called its own copy?
I'd say it is best to put that script in your server's PHP include path and use it in all your websites.
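For instance (a sketch only; the /var/www/shared path is just a placeholder for wherever you keep the shared copy), each site can pull the script in via the include path:

```php
<?php
// Assumption: the shared copy lives in /var/www/shared (placeholder path).
// Either set it once in php.ini:
//   include_path = ".:/var/www/shared"
// or append it at runtime before including the script:
set_include_path(get_include_path() . PATH_SEPARATOR . '/var/www/shared');

// Every site then includes the same single file:
require_once 'script.php';
```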
I don't think maintaining a copy for each domain will result in any speed improvement. If you DO notice a slowdown when using a common script, try one of the PHP/Apache caching modules to speed things up.
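OPcache is one such module (bundled with PHP since 5.5). A minimal sketch of the relevant php.ini settings, with purely illustrative values:

```ini
; Minimal OPcache sketch (php.ini); values are illustrative, tune them for your server.
zend_extension=opcache.so
opcache.enable=1
opcache.memory_consumption=128       ; MB of shared memory for compiled scripts
opcache.max_accelerated_files=10000  ; would need to be much higher if every domain kept its own copy
opcache.validate_timestamps=1        ; keep checking files on disk for changes...
opcache.revalidate_freq=60           ; ...but at most once every 60 seconds
```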
And yes, as you said, you can update it in one place and all websites will use the updated script. That convenience alone outweighs most of the alternatives.
I think it's the other way round. It shouldn't be slower. The question is, "will running it shared make it faster"? As to that point - theoretically, yes. But the practical difference may be indiscernible.
The point is: if there's only one script for all users (the idea behind shared libraries), it's not only easier to maintain, as pointed out earlier, but the built-in caching mechanisms should also be more efficient (at least as I understand it). One file to read is easy to cache in memory. Multiple identical files (not sym-/hardlinked, so different inodes) are more problematic. Or am I underestimating Linux's capabilities here? Please correct me if I'm wrong - my knowledge of the internals is not that deep - but I think separate copies would result in independent seeks and buffering.
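One way to get the best of both worlds (a sketch; all paths are hypothetical) is to keep a single real file and symlink it into each domain folder, so every domain appears to have its own script.php while the filesystem only ever sees one inode:

```sh
# Keep one real copy (placeholder path)
mkdir -p /var/www/shared
mv /var/www/public_html/somedomain_folder/script.php /var/www/shared/script.php

# Point each domain folder at it
ln -s /var/www/shared/script.php /var/www/public_html/somedomain_folder/script.php
ln -s /var/www/shared/script.php /var/www/public_html/otherdomain_folder/script.php
```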
Another point: opcode caches. I'm not sure how it looks in every implementation, but in most cases multiple identical scripts result in unnecessary duplicate cache entries. There is, however, one potential problem with shared scripts: when caching gets too aggressive (like caching external configuration files between different users). I recently ran into this kind of behaviour with XCache and hardlinks. With proper configuration and a non-buggy implementation it's not a problem, though.
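To make the duplicate-entries point concrete, here is a sketch that assumes OPcache (rather than XCache) simply because it ships with current PHP; it just dumps what the opcode cache is holding:

```php
<?php
// Sketch assuming OPcache is enabled; XCache's admin page shows the same idea.
$status = opcache_get_status(true);   // true = include per-script details
if ($status !== false && isset($status['scripts'])) {
    // Cached scripts are keyed by path, so thousands of identical per-domain
    // copies become thousands of separate entries, while one script included
    // from a single shared path shows up only once.
    echo count($status['scripts']) . " scripts currently cached\n";
    foreach ($status['scripts'] as $path => $info) {
        if (basename($path) === 'script.php') {
            echo "$path (hits: {$info['hits']})\n";
        }
    }
}
```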
So, my conclusion:
If it's feasible, I would use a shared installation both for convenience and for performance. In some situations the performance difference will be negligible, though, so in the end it's a matter of what fits your situation better.
In my experience, using a shared directory of common modules won't slow your server down; if it did, you would have to keep a separate copy of every PHP library for every domain.