
One of our customers has just upgraded to a new server.

The first time a particular stored procedure is executed it takes over three minutes to run. Subsequent runs take less than a second.

This leads me to believe that the initial three minutes are primarily spent compiling the execution plan. Subsequent runs then just use the cached plan and run almost instantly.
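
One way to confirm this would be to clear the plan cache on a test server and time a cold run with SET STATISTICS TIME ON (a minimal sketch; dbo.MyProcedure is a placeholder for the real procedure, and DBCC FREEPROCCACHE should only be run where flushing every cached plan is acceptable):

-- Flush all cached plans so the next run has to compile from scratch.
-- Test servers only: this affects every plan on the instance.
DBCC FREEPROCCACHE;

SET STATISTICS TIME ON;

-- "SQL Server parse and compile time" in the Messages output shows the
-- compile cost separately from the execution cost.
EXEC dbo.MyProcedure;  -- placeholder name

SET STATISTICS TIME OFF;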

On our test databases it is taking about 5 seconds to calculate the plan for the same procedure.

I don't see anything terrible in the plan itself - although I don't imagine it's relevant, as the plan shows how long the query takes to run, not how long it takes to compile.

The server has 16 cores and 24 GB of memory. There is no heavy CPU or memory load on it.

What could be causing such a slow plan compilation on this particular database only?

What steps can I take to find the cause of the problem?

Edit

So I have managed to access the server and have run the query with SET SHOWPLAN_XML ON.

I can confirm that the CompileTime for the query accounts for 99% of the query execution time. The StatementOptmEarlyAbortReason is "TimeOut"; on our test server, with a copy of their database, the reason is "MemoryLimitExceeded".
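
For reference, these attributes can also be read straight out of the plan cache with XQuery rather than opening each plan by hand (a sketch; it scans every cached plan, so filter it further on a busy server):

WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    qp.query_plan.value('(//QueryPlan/@CompileTime)[1]', 'int') AS compile_time_ms,
    qp.query_plan.value('(//QueryPlan/@CompileMemory)[1]', 'int') AS compile_memory_kb,
    qp.query_plan.value('(//StmtSimple/@StatementOptmEarlyAbortReason)[1]', 'varchar(50)') AS early_abort_reason
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE qp.query_plan.exist('//StmtSimple[@StatementOptmEarlyAbortReason]') = 1;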

Mongus Pong
  • Are you sure it's the plan compilation that's taking a long time, not just reading data into cache from disk? – Mark Storey-Smith Feb 01 '12 at 11:10
  • Compile time does show up in the plan XML itself. Have you identified that it is definitely compile time and not something else, e.g. a synchronous auto-update of statistics? What version of SQL Server are you on? – Martin Smith Feb 01 '12 at 11:14
  • @MarkStorey-Smith Well, I'm not really sure about anything. I do know that running DBCC FREEPROCCACHE, which clears out the plan cache, takes the next query time back up to 3 minutes. – Mongus Pong Feb 01 '12 at 11:16
  • @MartinSmith Ah yes very useful, I have been using the visual query plan. Yes the compile time is roughly the entire execution time of the query. We are on SQL Server 2008 R2. – Mongus Pong Feb 01 '12 at 11:27
  • @MongusPong - If you create a statistics only copy of the production database can you reproduce this in your test environment? What does the plan say for StatementOptmEarlyAbortReason? – Martin Smith Feb 01 '12 at 11:29
  • @MartinSmith I have managed to get a full copy. Execution now takes only 39 seconds. The procedure is encrypted, so to get the plan I had to alter it to remove the encryption - this didn't change the execution time significantly. StatementOptmEarlyAbortReason is "MemoryLimitExceeded". – Mongus Pong Feb 01 '12 at 12:12
  • @MongusPong - MemoryLimitExceeded occurs in your test environment, production or both? What is CompileMemory in both? – Martin Smith Feb 01 '12 at 12:14
  • @MartinSmith - test environment. I hope to have a direct look at the production database shortly. – Mongus Pong Feb 01 '12 at 12:16
  • @MongusPong - Presumably that means that in your test environment compile time would take longer too except it runs out of memory first. What does the query look like? – Martin Smith Feb 01 '12 at 15:06

1 Answer


I hate to answer my own question, especially since I have had so much help from others to work out the solution, but here goes.

The problem was due to some dud statistics in the database. Looking at the execution plan, the optimiser was expecting 11.5 TB of data to be returned from the query; in reality it was receiving 87 KB. I now know that a huge mismatch between expected and actual rows returned is a sign that the statistics are out of date.
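
Before updating anything, you can check how stale the statistics actually are (a minimal sketch; run it in the affected database):

-- When were the statistics on each user table last updated?
SELECT
    OBJECT_NAME(s.object_id) AS table_name,
    s.name AS stats_name,
    STATS_DATE(s.object_id, s.stats_id) AS last_updated
FROM sys.stats AS s
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY last_updated;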

Simply running

exec sp_updatestats

forces the database to update the statistics for all tables.
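
If updating every table is too heavy to run in production, a more targeted alternative is to rebuild the statistics on just the offending table (dbo.MyTable is a placeholder):

-- Rebuild all statistics on one table, sampling every row.
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;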

This has taken the query execution time down from 3 minutes to 6 seconds. Everyone's a winner!

Thanks for all the help guys. :0)

Mongus Pong