An Opportunity I Couldn’t Refuse

When I started working as an independent consultant two years ago, I had this idea that I could work mostly in the winter and take lots of time off in the summer. Well, things don’t always work out as planned. So far, I’ve had much more client work during the summer months.

During the winter, I don’t waste time, though. I always have presentations to prepare, blog posts and newsletters to write, and Pluralsight courses to record. I stay productive no matter what.

In addition to client work this summer, I have been working on several projects. The book I started writing in January will come back to me for one last review in a few days, and I am tech editing another book. I just started working on my third Pluralsight course. I also volunteer for the LaunchCode organization as a CoderGirl mentor and serve as co-leader of the PASS Women in Technology Virtual Chapter.

A couple of weeks ago, I looked at everything I had going on, including my speaking schedule. I have cut back on SQL Saturdays this year due to time constraints and travel expenses, but I still have two coming up, as well as PASS Summit and Dev Up. I made the difficult decision not to accept any additional speaking engagements through mid-November, and maybe through the end of the year.

Again, things don’t always work out as planned. Just days after my decision, I received a message from PASS HQ telling me that I had been chosen as a speaker for 24 Hours of PASS: Summit Preview. This was completely unexpected and an honor I just couldn’t pass up. OK, one more presentation it is.

My 24HOP session, Indexing for Beginners, will explain clustered and nonclustered indexes and why the right indexes make queries run faster. Be sure to check it out, along with all the other great sessions.

Memory is the Air that SQL Server Breathes

[Image: Hot air balloons landing in Cappadocia’s Goreme National Park, Turkey.]

Many, many companies that use SQL Server don’t have anyone on staff who really understands it. They often think of SQL Server as just the place where the data is stored and don’t understand how it should be configured or maintained. The teams at these companies are fantastic at what they do, but they just don’t have a DBA. Maybe they don’t even need one full time, but having someone like me on a part-time basis can really help.

You may have seen that I give a presentation called the “Top 5 Mistakes You Are Probably Making with SQL Server.” I have finally decided that there is one mistake that outweighs the others: insufficient RAM.

In order for SQL Server to read or update data, the data must first be in memory, in an area called the buffer pool (you’ll also see it called the buffer cache, or just the buffer). SQL Server does not work directly with the data in the files on disk. Once pages of data are in the buffer, they can be reused by multiple queries. This means the data doesn’t have to be retrieved from disk every time it’s needed, which decreases the amount of I/O work required.
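If you are curious about what is sitting in the buffer at any moment, the sys.dm_os_buffer_descriptors DMV exposes it. Here is a minimal sketch that totals the buffered pages per database (each page is 8 KB):

    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*) * 8 / 1024 AS buffer_mb  -- 8 KB pages converted to MB
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY buffer_mb DESC;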

You may have seen this yourself when selecting all the rows of a large table twice. The second time, the query can run much faster because the data does not have to be copied from disk into the buffer. If you then run a query against a different large table, it may force the pages from the first table out to make room. If there is not enough memory, pages have to be read from disk more frequently, causing your queries to be slow.
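You can watch this happen with SET STATISTICS IO. In this sketch, dbo.BigTable is a hypothetical large table; assuming its pages are not already cached, the first run reports physical (or read-ahead) reads, while the second reports only logical reads because the pages are already in the buffer:

    SET STATISTICS IO ON;

    SELECT * FROM dbo.BigTable;  -- first run: pages are pulled from disk
    SELECT * FROM dbo.BigTable;  -- second run: served from the buffer

    SET STATISTICS IO OFF;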

When I have my first meeting with a new customer, they usually tell me: "Our application is slow. It’s been getting worse over time. We think the problem is with SQL Server, but we don’t know where to look." One of the first things I check is how long data pages can live in the buffer. I check this by looking at a performance counter called Page Life Expectancy, or PLE, which is the number of seconds a page is expected to last in memory.

Unfortunately, the threshold Microsoft published for this counter is 300. On today’s servers, the value should be much higher. I recently read a post from Jonathan Kehayias (@SQLPoolBoy) who said the minimum value should be 300 times the number of GB in the data cache divided by 4, in other words (data cache GB / 4) x 300. For example, a server with a 64 GB data cache should have a minimum PLE of 4,800 seconds, not 300. To make it easier, just use the number of GB installed on the server or, if you have it configured properly, the max memory setting of the instance. Obviously, both of those numbers will be higher than the data cache size, but they are close enough. If you have multiple instances on a server, those instances have to share the RAM, so keep that in mind as well.
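You can check PLE yourself by querying the DMVs. Here is a minimal sketch that reads the counter and compares it to the guideline above, using the instance’s max server memory setting as the stand-in for the data cache size:

    -- Read current PLE and compare it to (memory GB / 4) * 300
    DECLARE @maxMemGB DECIMAL(10, 2);

    SELECT @maxMemGB = CAST(value_in_use AS INT) / 1024.0
    FROM sys.configurations
    WHERE name = 'max server memory (MB)';

    SELECT cntr_value AS ple_seconds,
           CAST(@maxMemGB / 4 * 300 AS INT) AS suggested_minimum_seconds
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy'
      AND [object_name] LIKE '%Buffer Manager%';

On a NUMA server you will also find a per-node PLE counter under Buffer Node; the Buffer Manager value shown here is the overall number.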

I like to review the PLE values in a graph to see what is happening over time. (Take a look at this article by Linchpin People partner Mike Walsh (@mike_walsh) for an easy way to collect the data and create the graph by using the PAL tool.) If I see PLE bottoming out or zigzagging during production hours, I know there is a problem with insufficient RAM. Occasional dips below the value calculated above are fine, but we don’t want PLE to stay there. Here is an actual graph from a customer, generated with PAL, for a server with 16 GB of RAM over an eight-hour period.

[Graph: Page Life Expectancy on the customer’s 16 GB server over eight hours.]
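The PAL approach in Mike’s article relies on Performance Monitor logs. If you would rather collect the data from inside SQL Server, here is a rough sketch of an alternative: dbo.PleHistory is a hypothetical table, and the INSERT would run from a SQL Server Agent job every few minutes so the results can be graphed later.

    CREATE TABLE dbo.PleHistory (
        sample_time DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
        ple_seconds BIGINT NOT NULL
    );

    -- Schedule this statement in an Agent job
    INSERT INTO dbo.PleHistory (ple_seconds)
    SELECT cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy'
      AND [object_name] LIKE '%Buffer Manager%';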

How can this problem be solved? There are really two aspects to the solution. The first is usually easy: add more RAM. With adequate RAM on the server, data pages can live in the buffer longer, which decreases the I/O requirements. Performance can improve immediately.
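One caveat after the RAM is installed: make sure the instance is actually allowed to use it. A quick sketch, with a hypothetical value for a dedicated server with 32 GB of RAM:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Leave headroom for the OS; 28 GB out of 32 GB is a hypothetical example
    EXEC sp_configure 'max server memory (MB)', 28672;
    RECONFIGURE;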

The second part of the solution is more difficult and sometimes not possible: query and index tuning. Optimizing indexes and queries reduces the number of pages each query must touch, which decreases both I/O and memory requirements.
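As a simple illustration (the table and index here are hypothetical), a covering nonclustered index lets a query read a handful of narrow index pages instead of scanning every page of the table:

    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID)
    INCLUDE (OrderDate, TotalDue);

    -- This query is now covered: it can be answered entirely from the index
    SELECT CustomerID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = 42;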

The next time you hear someone complaining about how much memory SQL Server needs, just remember that memory is like air to SQL Server.

Nebraska Code() Conference Slides and Code

I am busy presenting at Nebraska Code() Conference this week. I will post slides and code from my sessions here.

Wednesday: T-SQL Window Function Deep Dive
Thursday: The Top 5 Mistakes You are Probably Making with SQL Server
Friday: Introduction to Window Functions
