Every 2 to 3 months the client sends us a giant SQL script to update their web database with changes. The script fails with an insufficient-memory error when we run it via SQL Query Analyzer or SQL Server Management Studio.
Error message:

Server: Msg 701, Level 17, State 1, Line 64302
There is insufficient system memory to run this query.
Same problem in both SQL 2000 and SQL 2005.
The script contains a series of DELETE and INSERT statements for multiple tables.
The file can be as large as 57,973 KB.
There are over 12,000 objects in the database, not to mention lookup tables and join tables. The media table has 65,720 records and another table has 97,799 records.
We have no way of changing the SQL export we receive from the client.
There really isn't much to be done here besides (a) increasing your available RAM or (b) shrinking the size of the query. If this is all happening inside one giant transaction you could really be in for a world of pain, especially if the data sets being changed are large (many columns, BLOBs, etc.).
This sounds like a case where you need to sit down with your client and discuss alternatives that slim down the change sets you're loading - perhaps a monthly, weekly, or daily update instead of one every 2-3 months?
This is a server-side out-of-memory error. Given that the error mentions Line 64302, I would guess that the script is a single batch (no GO delimiters), which could explain the problem. It should be fairly trivial to split the received script into several batches: simply insert

GO

where appropriate.

Have you tried executing the script through the osql command-line utility? That should eliminate much of the overhead of using SSMS and would at least isolate whether it is a database problem or a problem with the way SSMS handles the query.
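Since you can't change the export itself, one option is to preprocess the file before running it. Here is a minimal sketch of inserting GO delimiters with a small Python script. It assumes, hypothetically, that each DELETE/INSERT statement starts on its own line, as simple exports often do; a script with multi-line statements or string literals containing those keywords would need a real T-SQL-aware splitter.

```python
def add_go_delimiters(script: str, batch_size: int = 1000) -> str:
    """Insert a GO batch separator after every `batch_size` statements.

    Assumes each top-level DELETE/INSERT statement begins on its own line,
    which may not hold for every client export.
    """
    out_lines = []
    statement_count = 0
    for line in script.splitlines():
        out_lines.append(line)
        if line.lstrip().upper().startswith(("INSERT", "DELETE")):
            statement_count += 1
            if statement_count % batch_size == 0:
                out_lines.append("GO")  # end the current batch here
    return "\n".join(out_lines) + "\n"


if __name__ == "__main__":
    # Tiny demo: 5 INSERTs split into batches of 2 statements each.
    demo = "\n".join(f"INSERT INTO media VALUES ({i});" for i in range(5))
    print(add_go_delimiters(demo, batch_size=2))
```

The rewritten file can then be fed to osql (or sqlcmd), which executes each GO-delimited batch separately, so the server never has to compile the whole 57 MB script as one plan.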
Just a thought.