
I have a small database which basically runs a couple of reporting SPs. Data is extracted from an OLTP database and refreshed every day, and most tables in this database get truncated every day.

The logic in one of the SPs uses a lot of update statements. I can see the transaction log grow by many GBs when it runs. The database is in simple recovery mode. So my diagnosis is that all the update statements are making it slow.

My question is: will changing the setting to delayed durability have an effect on the speed of the SP?

xhr489
  • Have you looked into memory-optimized tables with schema-only durability, since you don't want the data, i.e. it gets truncated anyway? Also, batch your updates – Kin Shah Feb 12 '21 at 21:22
  • @KinShah, by batching updates do you mean to put them in an explicit transaction? – xhr489 Feb 12 '21 at 21:24
  • Yes, batch them with an explicit transaction https://dba.stackexchange.com/a/43256/8783 e.g. you can choose a batch size of 5K or whatever works for you – Kin Shah Feb 12 '21 at 22:12
  • @KinShah, how do I measure the size of the batch? 5K of what? – xhr489 Feb 12 '21 at 22:26
  • @KinShah, ah, you mean in a loop? – xhr489 Feb 12 '21 at 22:30

1 Answer


My question is that will changing the setting to delayed durability have an effect on the speed of the SP?

If the stored procedure makes many small updates without an explicit transaction, then delayed durability might speed it up. But batching the updates into larger transactions would be a better idea.
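As a sketch of how the two ideas combine, the following T-SQL enables delayed durability at the database level and then runs updates in fixed-size batches. The table `dbo.Report` and its `Processed` column are hypothetical stand-ins for whatever the SP actually touches; the 5000-row batch size is just the example figure from the comments.

```sql
-- Allow delayed-durable commits in the current database (SQL Server 2014+).
ALTER DATABASE CURRENT SET DELAYED_DURABILITY = ALLOWED;

-- Batch the updates so each transaction stays small and the log
-- can be reused between commits (simple recovery model).
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Hypothetical update: process up to 5000 unprocessed rows per batch.
    UPDATE TOP (5000) r
    SET    r.Processed = 1
    FROM   dbo.Report AS r
    WHERE  r.Processed = 0;

    SET @rows = @@ROWCOUNT;

    -- Delayed durability: the commit returns without waiting
    -- for the log flush to disk.
    COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
END;
```

Note that delayed durability trades durability for speed: if the server crashes before the log buffer is flushed, the last committed transactions can be lost. For a reporting database that is fully rebuilt from the OLTP source every day, that trade-off is usually acceptable.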

David Browne - Microsoft