We have an MSSQL database in which all the primary keys are GUIDs (uniqueidentifiers). The GUIDs are produced on the client (in C#), and we are considering changing the client to generate sequential (comb) GUIDs instead of just using Guid.NewGuid(), to improve database performance.
If we do this, how will this affect installations that already have data with "random" GUIDs as clustered PKs? Can anything be done (short of changing all the PK values) to rebuild the indexes to avoid further fragmentation and bad insert performance?
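For context, this is roughly the kind of comb generator we have in mind, loosely following Jimmy Nilsson's COMB idea: keep the random part of the GUID but overwrite the last six bytes with a timestamp, since SQL Server sorts uniqueidentifiers by those bytes first. `CombGuid`/`NewComb` are made-up names for illustration, and the ~6.5 ms timestamp granularity is an assumption of this sketch, not a requirement:

```csharp
using System;

public static class CombGuid
{
    // Illustrative helper, not part of the .NET framework.
    // Replaces the last six bytes of a random GUID with a UTC timestamp.
    // SQL Server compares uniqueidentifiers starting from those six bytes,
    // so GUIDs generated close together in time also sort close together.
    public static Guid NewComb()
    {
        byte[] bytes = Guid.NewGuid().ToByteArray();

        // Drop the low 16 bits of Ticks (~6.5 ms resolution) so the
        // remaining value fits comfortably in 48 bits for centuries.
        long t = DateTime.UtcNow.Ticks >> 16;

        // Write the timestamp into positions 10..15, most significant
        // byte first, so later GUIDs compare as "greater" in SQL Server.
        for (int i = 0; i < 6; i++)
        {
            bytes[10 + i] = (byte)(t >> (8 * (5 - i)));
        }
        return new Guid(bytes);
    }
}
```

The random bytes at the front are untouched, so collision resistance stays close to that of Guid.NewGuid(); only the sort position is made time-ordered.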
Please give explicit and detailed answers if you can; I am a C# developer at heart and not all that familiar with the intricacies of SQL Server.
Thanks!
Quick comment: GUID + clustered index? That sounds bad. A clustered index means the table's rows are physically stored in the order of the index key, so inserting random key values keeps landing rows in the middle of existing pages, forcing page splits. Change the PK to a nonclustered index and insert performance will improve.
Sequential GUIDs have problems of their own: depending on how they are generated, they aren't necessarily guaranteed unique, and if you scale generation out to multiple machines, the sequences can potentially collide.
Try dropping your clustered index and recreating the same index as nonclustered; that will almost certainly fix the problem. Generating the random GUID is NOT a performance problem in itself: Guid.NewGuid() takes essentially no time, and plenty of other things will slow you down long before that call does.
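In T-SQL, the change described above looks roughly like this (the table, column, and constraint names are made up for illustration; also note that dropping a clustered index rebuilds all nonclustered indexes on the table, so do this in a maintenance window):

```sql
-- Hypothetical names: adjust dbo.Orders / PK_Orders / OrderId to your schema.
ALTER TABLE dbo.Orders DROP CONSTRAINT PK_Orders;   -- removes the clustered PK

ALTER TABLE dbo.Orders ADD CONSTRAINT PK_Orders
    PRIMARY KEY NONCLUSTERED (OrderId);             -- same key, nonclustered

-- The table is now a heap; random GUID inserts no longer cause
-- clustered-index page splits on the primary key.
```

Existing installations don't need their PK values changed for this; only the index structure changes.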
Clustered indexes aren't the right choice for everything.