Assuming that components such as DLLs and OCXs are installed and registered on the clients, it is possible to place the .exe and its related files on the file server. To start the application, each client's desktop has a shortcut that runs the .exe from \\fileserver\AppPath.
Would you agree with such a layout? What if 'the client' is a terminal-server and 'the clients' means a terminal-server-farm?
IF the application is as simple as a single .exe file (perhaps with a bunch of support files that can also reside in the same directory) and it doesn't need a proper setup on the client, this can be done and works well; it also has the added benefit of not having to worry about deploying/maintaining/updating the application on each client.
Of course, if client computers lose network connectivity and/or the file server goes down, the application will be unusable.
Separation of roles is an important concept to apply here. If you have a network share that hosts user documents, you can apply certain backup, availability, and security approaches to that share depending on things like the kinds of files it hosts and what times during the day users may need to access it. You also know that if you have to take the server that hosts it offline during the day for maintenance, you just need to notify your users to close any open documents for the duration.
Now consider that you've added some software into that file share, which users run directly from the share. Suddenly, you're backing up program data alongside user file data. You now have extra complications if you need to take the server down for maintenance (what happens if the server goes offline while the app is running?). This is one example of many where your needs for administering the program and your needs for administering the rest of the file share may differ or conflict. You can't always predict these kinds of conflicts ahead of time, but it's one of those things that more often than not leads to administrative headaches.
So this is why you should separate functions and roles out wherever feasible, if their characteristics are different or ever likely to diverge in the future. It's more elegant and supportable, with fewer nasty surprises.
For a real-world example: At a company I worked at previously, we had a general file & print server, and we ran Lotus Notes/Domino for groupware. The Lotus Notes installation for all users was hosted off a file share on the file & print server, and run directly from it. I believe this was originally done with the intention of being able to upgrade Notes once, and have all clients automatically update. Maybe at one point this worked.
The reality of the situation, though, was that a single network blip, maybe once a week, would kick every Notes user out of email and generate a lock file on the share which had to be manually deleted by an admin. People really notice when 'email is down'. The software also loaded slowly, especially first thing in the morning when 150 users were concurrently trying to load off a single .exe. To top it all off, Notes upgrades still required visiting each PC. The net benefit ended up being zero or negative, although I presume it looked like a great way to save some time originally.
As to your specific issue... what are you actually trying to achieve by doing this? If your .exe is one that's being created in-house and updated often, and your devs just want a quicker way of publishing their updates... be careful. Swapping out an EXE while users are still accessing it can cause headaches and data inconsistencies. Also, loading apps on a terminal server should be done in a particular manner: use the change user /install command on the TS before installing, then change user /execute when the application is installed and ready to run. Bypassing this process may lead to unpredictable results.
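For reference, the install-mode workflow mentioned above looks like this on the terminal server, run from an administrative command prompt (the installer path is a placeholder):

```
REM Switch the terminal server into install mode before running any setup program
change user /install

REM ... run the application's installer here, e.g. \\fileserver\AppPath\setup.exe ...

REM Switch back to execute mode once installation is finished
change user /execute

REM You can check which mode the server is currently in at any time
change user /query
```

Install mode ensures per-user application settings are captured so they can be replicated correctly for every user who logs on to the terminal server.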
I have no problems with having executables on the file server. They don't execute there, they execute on the client, so pose no greater risk than any other file that might be on that server. Those files are after all simply being served as files, not as executable programs.
When it comes to terminal servers it's a different matter. The executables then run on the server, not the client. However, as terminal servers were designed for exactly that role there will be no issues beyond ensuring that any applications that need to be installed, as opposed to simply being copied, are installed using the correct methodology.
Depending on how the application has been built, this can be a lot trickier than it seems; something that is just a single .exe may need some help to be allowed to run from a non-local drive. See this Stack Overflow question on .NET Code Access Security for a discussion of one type of problem you may face.
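To illustrate the .NET case: versions of the .NET Framework prior to 3.5 SP1 placed assemblies launched from a UNC path in the Intranet zone, so they ran with reduced trust and often failed outright. A common workaround (a sketch only; the share path and Framework version directory are placeholders you would adjust) was to grant full trust to the share with caspol on each client:

```
REM Grant FullTrust at machine level to assemblies loaded from the share.
REM caspol.exe ships with the .NET Framework; pick the version directory that matches your runtime.
%windir%\Microsoft.NET\Framework\v2.0.50727\caspol.exe -m -ag 1.2 -url "file://\\fileserver\AppPath\*" FullTrust
```

.NET 3.5 SP1 and later relaxed this behavior for executables launched directly from a share, but any clients on older runtimes will still need a policy change like the above.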
There are some very capable application streaming solutions that will deliver the same ease-of-management benefits to you (Citrix XenApp can do this, as can VMware ThinApp, among others). These provide manageable solutions for complex applications, but at a cost. For simple apps that are well behaved, your solution can be convenient, but you will have to be careful about that "well behaved" part.