Wednesday, March 25, 2009

Simple Database Tuning for SQL Server

Simple Database Tuning for SQL Server 2000:
When you Google for database slowness issues, you get enough hits to give you grey hair reading through them all. After going through a number of those, and through some great SQL Server books, here is another addition to that lock of grey hairs!
Most of the articles I have read online assume that you have already taken care of the basic stuff and are now hunting to fine-tune your installation further. But I am not going to focus on “fine” tuning; rather, on the basics.

I agree that SQL Profiler is a very powerful tool, and I am a big fan of it myself. But before you go near it, you should have some basics covered. So here goes nothing:

1. Primary Keys are a must: You will be surprised how many “underground” installations are out there that don’t even follow this basic rule. Then these self-taught DBAs run the SQL Tuning Advisor tool, which asks them to build a non-clustered index on the very column that is supposed to be the primary key! So, please, make sure ALL your tables have primary keys. (FYI: a table without a clustered index is technically called a heap; since a primary key constraint creates a clustered index by default, a table with no primary key is usually a heap.)
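A quick way to find the offenders, as a sketch against the SQL Server 2000 system tables:

```sql
-- List user tables that have no primary key defined
SELECT [name] AS TableName
FROM sysobjects
WHERE xtype = 'U'                                      -- user tables only
  AND OBJECTPROPERTY(id, 'TableHasPrimaryKey') = 0
ORDER BY [name]
```

Any table this returns is a candidate for step 3 below.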

2. 80-20 Rule: 80% of the database problems lie with 20% of the tables! SQL Server is a very robust piece of software and will take abuse from bad design, lack of indexing etc. up to a limit. Then it just grinds to an agonizing halt. To spare it from this fate, run the following set of commands:

DBCC UPDATEUSAGE('{your database name here}')

CREATE TABLE #DbTableSizes
(DbTableSizesId INT IDENTITY (1,1)
, TableName varchar(100)
, [Rows] BIGINT
, Reserved varchar(50)
, Data varchar(50)
, IndexSize varchar(50)
, Unused varchar(50))

INSERT INTO #DbTableSizes (TableName, [Rows], Reserved, Data, IndexSize, Unused)
EXEC sp_MSforeachtable @command1='EXEC sp_spaceused ''?'''

SELECT * FROM #DbTableSizes ORDER BY [Rows] DESC

DROP TABLE #DbTableSizes

This will give you a list of the tables most likely to be in the 20% that are causing problems. Those with the most rows are the ones to focus on. If you have tables with wide rows (meaning a large number of columns), but not really a huge number of rows, focus on those as well.

3. Now start targeting tables in this list, one by one. Make sure these have primary keys defined.
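Adding a primary key is a one-liner; the table and column names below are just placeholders. Note that the chosen column must be NOT NULL and contain unique values:

```sql
-- Hypothetical example: make OrderId the (clustered) primary key
ALTER TABLE dbo.Orders
ADD CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderId)
```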

4. Using the following query, find out which stored procedures use these tables. You can also use the Action -> All tasks -> Display Dependencies menu option for this.
SELECT DISTINCT OBJECT_NAME(id)
FROM syscomments
WHERE [text] LIKE '%{table name here}%'
AND OBJECTPROPERTY(id, 'IsProcedure') = 1

5. Now, from within each of the stored procedures and functions, get the queries that are using the table under review. You should create a list of all unique WHERE clauses used on this table.

6. For each of those WHERE clauses, look at the query plan to check that none of the following evils is present:
a. Table Scans
b. Clustered Index Scans
c. Parallelism, unless you know you have multiple CPUs on your server
d. Bookmark Lookups, unless you know exactly why this is happening
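A lightweight way to inspect a plan without actually executing the query is SHOWPLAN_TEXT; the query below is a hypothetical stand-in for whichever WHERE clause you are reviewing:

```sql
-- Show the estimated plan as text instead of running the query
SET SHOWPLAN_TEXT ON
GO
-- Hypothetical query under review
SELECT OrderId, OrderDate
FROM dbo.Orders
WHERE CustomerId = 42
GO
SET SHOWPLAN_TEXT OFF
GO
```

In the output, watch for the operators named above: "Table Scan", "Clustered Index Scan", "Parallelism", and "Bookmark Lookup".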

7. Modify/Build indexes on the table so that you rid the query plans of the above evils.
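As a sketch (table, column, and index names are all hypothetical): if a frequent WHERE clause filters on a CustomerId column, a non-clustered index on that column lets the optimizer replace a table scan with an index seek:

```sql
-- Hypothetical example: support WHERE CustomerId = @x with an index seek
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
```

After creating the index, re-check the query plan to confirm the scan is gone.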

This sounds like a very tedious process, and it is. But the benefits are pretty big: you will see a significant improvement in database performance.
If you want to take it a step further, try to build a script that does all the data collection (Steps 1 through 6) for you, and review the report periodically. (Maybe I will work on that as a side project.)

Advanced Database Tuning for SQL Server 2000:
Once you have taken care of the basics, let us suppose you still find that the database performance is slow. I will point you to a very interesting methodology by Itzik Ben-Gan in chapter 3 of the book Inside Microsoft SQL Server 2005: T-SQL Querying.
I have used most of the techniques in this chapter and they have helped me to dig much deeper into the slowness/high CPU usage issue that we experienced with one of our production installations.