I was executing the script I found on this site to mark all my tables and stored procedures for recompilation (I changed the original script so it also marks the stored procedures).
I have a Windows Forms (.NET 3.5) application that uses a lot of data tables, and after executing this script the time it took to open a data-heavy form went from 8 seconds to just 4 seconds.
Do I have to run such a script regularly? I thought SQL Server would take care of compiling procedures and such.
BTW: Should this question be on SO? I thought the DBAs would be here on SF.
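For reference, a script like that is usually just a loop over `sp_recompile`; a minimal sketch of what I assume it does (I don't have the original in front of me):

```sql
-- Mark every user table and stored procedure for recompilation.
-- Running sp_recompile on a table also flags all procedures and
-- triggers that reference it; running it per procedure covers the rest.
DECLARE @name sysname;
DECLARE obj_cursor CURSOR FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
    FROM sys.objects
    WHERE type IN ('U', 'P');  -- U = user table, P = stored procedure
OPEN obj_cursor;
FETCH NEXT FROM obj_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_recompile @name;
    FETCH NEXT FROM obj_cursor INTO @name;
END
CLOSE obj_cursor;
DEALLOCATE obj_cursor;
```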
From Microsoft:
"As a database is changed by such actions as adding indexes or changing data in indexed columns, the original query plans used to access its tables should be optimized again by recompiling them. This optimization happens automatically the first time a stored procedure is run after Microsoft SQL Server is restarted. It also occurs if an underlying table used by the stored procedure changes. But if a new index is added from which the stored procedure might benefit, optimization does not happen until the next time the stored procedure is run after Microsoft SQL Server is restarted. In this situation, it can be useful to force the stored procedure to recompile the next time it executes
Another reason to force a stored procedure to recompile is to counteract, when necessary, the "parameter sniffing" behavior of stored procedure compilation. When SQL Server executes stored procedures, any parameter values used by the procedure when it compiles are included as part of generating the query plan. If these values represent the typical ones with which the procedure is called subsequently, then the stored procedure benefits from the query plan each time it compiles and executes. If not, performance may suffer."
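The mechanisms that quote alludes to look like this in T-SQL (the procedure and parameter names here are made up for illustration):

```sql
-- One-off: flag an object so its cached plans are thrown away
-- and regenerated on the next execution.
EXEC sp_recompile N'dbo.GetOrdersByCustomer';

-- Every call: WITH RECOMPILE compiles a fresh plan on each
-- execution, so no single sniffed parameter value gets baked
-- into a cached plan.
CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerId int
WITH RECOMPILE
AS
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
```

`WITH RECOMPILE` trades plan-cache reuse for a plan tailored to each call's parameter values, so it only pays off where sniffing actually hurts.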
Consider that the word "compile" is used differently in SQL Server than in, say, a normal application. In a normal app the code is compiled to a machine-language-ready state so it doesn't have to be interpreted at run time. In SQL Server, "compile" means determining the best execution plan for the query. Since the data is potentially always changing, SQL Server may have to make new determinations based on the indexes that are available, the quality of the statistics, and so on, all of which can affect the execution plan. As a result, whenever a stored procedure is executed, SQL Server checks whether it can re-use an existing execution plan for that query or whether it needs to generate a new one. Of course, if no plan exists, it'll generate a new one.
This, BTW, is not very different from what .NET apps do, if you think about it. .NET apps are compiled to an intermediate language, and by default are only compiled down to machine code the first time they run.
When the SP is compiled, its execution plan is fixed. It cannot take advantage of new indexes, or of different strategies based on table sizes. Recompiling allows it to re-evaluate the execution conditions.
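You can actually watch this plan reuse happening via the plan-cache DMVs; a quick sketch (requires `VIEW SERVER STATE` permission):

```sql
-- usecounts climbing on the same plan = SQL Server re-used the
-- existing execution plan rather than compiling a new one.
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.objtype = 'Proc'
ORDER BY cp.usecounts DESC;
```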
I'm not vastly experienced here, but a rule of thumb is to let simple SPs recompile each time and to suppress recompilation only for the more complex cases, where compilation is expensive.