4

I have three SQL Server instances on a cloud server: A, B, and C. The server has 32GB of installed physical memory (RAM). The size of A (data + log files) is 44GB and the size of B (data + log files) is 41GB. We can ignore C, since it has no databases in it and its max memory is set to 500MB.

Both instances A and B have max server memory set to the default value of 2147483647 MB. Although we have had no issues, this is definitely not practical.

After some research, based on database activity, I decided that instance A's memory allocation should be set to 20GB and B's to 10GB, for a total of 30GB of the 32GB of physical memory in the system. The screenshot below shows the actual memory consumption. I am afraid that if I set max memory it will slow down response times. Is this an optimal max server memory configuration for databases of this size, and how can I simulate the change before applying it?

[Screenshot: actual memory consumption of instances A and B]

Raidenlee
  • 153
  • 2
  • 12

4 Answers

4

A general recommendation by Jonathan Kehayias (from SQLSkills):

reserve 1 GB of RAM for the OS, 1 GB for each 4 GB of RAM installed from 4–16 GB, and then 1 GB for every 8 GB RAM installed above 16 GB RAM.
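For the 32GB server in the question, that guideline works out to roughly 6GB reserved for the OS and about 26GB left to split between the instances. A minimal T-SQL sketch of the arithmetic (illustrative only; the variable names are mine and not part of any tool):

    -- Rough arithmetic for the SQLSkills guideline; adjust @totalGB for your server.
    DECLARE @totalGB int = 32;

    DECLARE @osReserveGB int =
          1                                                              -- 1 GB base for the OS
        + (CASE WHEN @totalGB > 16 THEN 12 ELSE @totalGB - 4 END) / 4    -- 1 GB per 4 GB installed from 4-16 GB
        + (CASE WHEN @totalGB > 16 THEN (@totalGB - 16) / 8 ELSE 0 END); -- 1 GB per 8 GB installed above 16 GB

    SELECT @osReserveGB            AS os_reserve_gb,    -- 6 on a 32 GB server
           @totalGB - @osReserveGB AS sql_max_total_gb; -- 26 GB to share across all SQL instances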

You can use the PowerShell dbatools command Test-DbaMaxMemory to calculate this for you. dbatools also has Set-DbaMaxMemory to set the max memory for you across all your servers.

I would run Test-DbaMaxMemory to check the recommendations first and, once you are comfortable, use Set-DbaMaxMemory to apply the setting across all your servers.
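If you would rather apply the value by hand instead of through dbatools, the setting those cmdlets manage is plain sp_configure. A minimal sketch, using the 20GB figure from the question for instance A purely as an example (the value is specified in MB):

    -- Run against each instance; 'max server memory' is an advanced option, set in MB.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 20480;  -- 20 GB, the example figure for instance A
    RECONFIGURE;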

Kin Shah
  • 62,225
  • 6
  • 120
  • 236
3

What you will see is that less data can be cached and, thus, some queries will have degraded performance. The default value essentially means that SQL Server will use whatever memory it can get. Once its usage goes up, it won't come back down until the service is restarted, because it reserves all that RAM (it caches data in it).
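If you want to confirm what an instance is configured to right now, a quick check against the standard catalog view (nothing here is specific to your setup):

    -- 2147483647 in value_in_use means the instance is still on the unlimited default.
    SELECT name, value_in_use
    FROM sys.configurations
    WHERE name = 'max server memory (MB)';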

This is my favorite resource for determining how much RAM a server really needs. My only issue with it is that it doesn't really take into account smaller VM-based machines and just assumes that you CAN throw 128GB+ at a server.

https://www.brentozar.com/archive/2014/05/much-memory-sql-server-need/
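In the same spirit as that article, a rough way to see how much of each database is actually sitting in the buffer pool right now is the buffer descriptor DMV; a minimal sketch (it can be slow on servers with very large buffer pools):

    -- Buffer pool usage per database, in MB (each cached page is 8 KB).
    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*) * 8 / 1024  AS cached_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY cached_mb DESC;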

A general rule of thumb is to dedicate 10% of RAM, or 8GB, whichever is more, to the OS and auxiliary processes, with the rest dedicated to SQL Server (through the max server memory setting). On your 32GB server that works out to 8GB reserved and 24GB for SQL Server.

Wes
  • 1,126
  • 1
  • 10
  • 31
  • I have my fingers crossed when I apply the memory change; hopefully there will be no decrease in performance. I will check out the resource you provided. – Raidenlee May 09 '17 at 18:40
  • Do I need to add a disclaimer to my above comment? :D – Wes May 09 '17 at 19:09
  • 1
    No need, it's obvious when you decrease memory for SQL servers there would be a drawback of some sort :p – Raidenlee May 09 '17 at 19:12
2

In my opinion you're not leaving nearly enough for the operating system: on a 32GB system, I'd probably leave 4GB or so. Remember that local logins, SQL jobs and SSIS packages, and 3rd-party backup software and the like all use memory outside the space allocated to the SQL engine itself. If I know memory-hogging SSIS packages run regularly, I'll leave even more memory for the OS.

How you split the remainder between the three instances is up to you. There is no (serious) risk of impact from changing these on the fly; they don't even require a SQL restart to take effect.

Don't worry about a minor difference in your planned max size vs what you see in your graph there. SQL is a hog, and will use as much memory as you give it. If you reduce the max, it will have to release some memory, but it's just giving up a portion of its local data cache. It may have to go back to disk slightly sooner for some queries, but I doubt you'll notice a difference.

EDIT: After some discussion in the comments, it doesn't appear a SQL restart is necessary, even when the MaxMem is reduced below the currently used amount.
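If you want to watch that release happen after lowering the setting, comparing Target and Total Server Memory is a simple check; a minimal sketch against the standard performance-counter DMV (Total should drift down toward Target over time rather than dropping instantly):

    -- Same values perfmon shows for the Memory Manager object.
    SELECT counter_name,
           cntr_value / 1024 AS value_mb
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN ('Target Server Memory (KB)', 'Total Server Memory (KB)');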

BradC
  • 9,964
  • 8
  • 48
  • 88
  • 1
    @Wes You're talking about requiring a SQL restart to take effect? I've never had to do so, but I also wasn't watching the "used memory" counter particularly closely. I'd love to see some documentation on that. – BradC May 08 '17 at 19:52
  • 1
    Not true, can be done online. Remember, SQL uses the cache and if fully utilized you might see perf issues, lower PLE or BCHR. – Henrico Bekker May 08 '17 at 19:54
  • I'd have to look around for something to confirm but... if SQL is taking 28GB RAM, and you reduce the allocation down to, say, 20GB, a service restart would be required BECAUSE SQL Server will never let go of consumed memory space. This post is in agreement, but is not an official doc: https://social.msdn.microsoft.com/Forums/en-US/7e245bab-b694-4db1-b148-bb31d8b20cea/sql-server-memory-setting-restart-required?forum=sqltools – Wes May 08 '17 at 19:55
  • 1
    https://www.sqlskills.com/blogs/glenn/eight-different-ways-to-clear-the-sql-server-plan-cache/ – Henrico Bekker May 08 '17 at 19:59
  • 1
    I stand corrected... It WILL reduce, but may take a long time to do so: https://dba.stackexchange.com/a/121483/15787 – Wes May 08 '17 at 19:59
  • 1
    @HBK I suppose there are multiple methods then, to reduce it... Naturally over a longer period of time, clearing things like plan cache or buffer cache manually (through SQL commands), or restarting the instance service. – Wes May 08 '17 at 20:01
  • 1
    @Wes Interestingly, the DOC that Pradeep links to in that answer seems to contradict him: "When SQL Server is using memory dynamically, it queries the system periodically to determine the amount of free memory.... If less memory is free, SQL Server releases memory to the OS." – BradC May 08 '17 at 20:02
  • 1
    @BradC Yeah, I saw that too after. Paul White has a good answer too, about it: https://dba.stackexchange.com/a/115277/15787 – Wes May 08 '17 at 20:03
  • 1
    Cool. Good initial question, I've added the link you found to my answer, so as to render this comment thread redundant. – BradC May 08 '17 at 20:07
  • Thanks for the replies. I will apply the SQL memory change but without a restart so I can monitor the memory use for a month or two. Memory allocation: 14GB for both instances and 8GB left for the system. The last thing I want is for users to complain that the application is running really slowly. – Raidenlee May 09 '17 at 18:38
  • @Rocklee 14GB x 2 + 8GB = 36GB?? 8GB seems like more than the OS should really need, unless you really know you have some memory-hogging SSIS packages or something. – BradC May 09 '17 at 18:50
  • @BradC It was supposed to be 4GB, not 8GB. Nope, we ensured that SSIS and other reporting packages were not installed on the production server. – Raidenlee May 09 '17 at 18:54
0

Have you done any analysis to see how much memory each instance actually requires? Avoid using Task Manager to determine SQL Server memory requirements; there are proper perfmon metrics for SQL Server memory usage.
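For example, the same counters perfmon exposes, such as Page Life Expectancy and Memory Grants Pending, can be queried from inside SQL Server; a minimal sketch using the standard DMV:

    -- Low Page Life Expectancy or a non-zero Memory Grants Pending value
    -- suggests the instance is under memory pressure.
    SELECT RTRIM([object_name]) AS [object],
           RTRIM(counter_name)  AS counter,
           cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name IN ('Page life expectancy', 'Memory Grants Pending');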

Henrico Bekker
  • 840
  • 4
  • 10