How can I prove that the following T-SQL is bad for performance?
Problem
I have recently inherited a codebase with a large amount of Stored Procedures. The system they are supporting is encountering numerous performance problems which I am looking in to.
A number of the Stored Procedures have a pattern like this:

- Create a temp table.

- Build up a dynamic SQL query to insert a bunch of records, e.g.

      DECLARE @sql VARCHAR(MAX)
      SET @sql = 'INSERT INTO @tempTable
                  SELECT SomeColumn, SomeColumn2, SomeColumn3, etc FROM MyTable'
      IF @someParam = [SomeValue]
          SET @sql = @sql + ' WHERE SomeColumn = [SomeValue]';
      IF @someOtherParam = [SomeOtherValue]
          SET @sql = @sql + ' WHERE SomeOtherColumn = [SomeOtherValue]';

- Execute this dynamic SQL:

      EXEC(@sql);

- Select from the temp table and bring in a bunch of additional information to return to the client:

      SELECT
          ...
      FROM
          @tempTable
          INNER JOIN ...

My immediate thoughts are:
- There is dynamic SQL, so no cached plans, meaning plans are generated every time.

- There is an INSERT ... SELECT pattern, so table locking is more likely to be an issue.
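A side note on the sketch above: a table variable like @tempTable declared in the outer procedure is not visible inside the separate batch run by EXEC(@sql), so the pattern as literally written would fail; presumably the real procedures use a #temp table (or declare the variable inside the dynamic string). A minimal sketch of the #temp-table form, using the assumed column names from the question:

```sql
-- A #temp table created in the procedure's scope IS visible to the
-- dynamic batch, unlike a @table variable.
CREATE TABLE #tempTable (SomeColumn INT, SomeColumn2 INT, SomeColumn3 INT);

DECLARE @sql VARCHAR(MAX);
SET @sql = 'INSERT INTO #tempTable (SomeColumn, SomeColumn2, SomeColumn3)
            SELECT SomeColumn, SomeColumn2, SomeColumn3 FROM MyTable';

EXEC (@sql);  -- the dynamic batch inserts into the outer #tempTable

SELECT t.SomeColumn, t.SomeColumn2
FROM #tempTable AS t
INNER JOIN SomeOtherTable AS o   -- hypothetical joined table
    ON o.SomeColumn = t.SomeColumn;
```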
I have re-written some of the Stored Procedures in this way instead:
SELECT
...
FROM
MyTable
INNER JOIN ...
WHERE
(
@someParam != SomeValue
OR
SomeColumn = SomeValue
)
AND
(
@someOtherParam != SomeOtherValue
OR
SomeOtherColumn = SomeOtherValue
)

From comparing execution plans and client statistics in SQL Server Management Studio, I have not sped the Stored Procedures up, so I am apprehensive about suggesting wholesale re-writes of all Stored Procedures.
I am trying to set up some profiling of a live customer scenario, but as yet have been unable to prove my thoughts.
Can anyone offer any confirmation of the theory behind my thoughts, or any better ways of proving my suspicions?
The problem is I have read that dynamic SQL is not always a closed case - i.e. it depends on how it is used. My understanding of lock
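For comparison, a variant often suggested for exactly this catch-all shape (not part of the original post; column and parameter names are the assumed ones from above) keeps the static OR form but adds OPTION (RECOMPILE), so a plan is compiled for the actual parameter values on each call:

```sql
SELECT
    ...
FROM
    MyTable
WHERE
    (@someParam != 'SomeValue' OR SomeColumn = 'SomeValue')
    AND
    (@someOtherParam != 'SomeOtherValue' OR SomeOtherColumn = 'SomeOtherValue')
OPTION (RECOMPILE);  -- optimizer can fold known parameter values into the plan
```

With RECOMPILE the optimizer can still use indexes on the filtered columns, at the cost of a compilation on every execution.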
Solution
There is dynamic SQL, so no cached plans, meaning plans are generated every time

Not necessarily true. Dynamic SQL can (and does) use cached plans just as well as static SQL. For dynamic search conditions, resolving to dynamic SQL is often the right answer. See Dynamic Search Conditions in T-SQL for more details.

There is an INSERT ... SELECT pattern, so table locking is more likely to be an issue.

Not necessarily true, especially with a @tempTable.
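Whether dynamic SQL gets plan reuse depends on how it is executed: if the generated string is parameterized and run through sp_executesql, each distinct query text gets its own cached, reusable plan. A sketch under the assumed names and parameter types (INT is a guess) from the question:

```sql
DECLARE @sql NVARCHAR(MAX) = N'SELECT SomeColumn, SomeColumn2, SomeColumn3
                               FROM MyTable
                               WHERE 1 = 1';

IF @someParam IS NOT NULL
    SET @sql += N' AND SomeColumn = @p1';
IF @someOtherParam IS NOT NULL
    SET @sql += N' AND SomeOtherColumn = @p2';

-- Each distinct @sql text gets one cached plan, reused across calls,
-- because the values travel as parameters rather than inlined literals.
EXEC sys.sp_executesql
    @sql,
    N'@p1 INT, @p2 INT',
    @p1 = @someParam,
    @p2 = @someOtherParam;
```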
I have re-written some of the Stored Procedures in this way instead

Using multiple OR conditions like that is an anti-pattern. You are forcing the query optimizer to come up with a plan that works for any value of all those parameters. Usually the only solution is a scan, ignoring any index. The original code was better.

Can anyone offer ... any better ways of proving my suspicions?

Yes. Measure. Use a methodology like Waits and Queues. Don't rely on your intuition. Find the bottlenecks and address them accordingly.
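To put numbers behind the Waits and Queues approach, a common starting point is the sys.dm_os_wait_stats DMV, which accumulates wait time per wait type since the last restart (or since the stats were cleared). The exclusion list below is a typical, not exhaustive, set of benign background waits:

```sql
-- Top wait types on the instance, ignoring some common benign waits
SELECT TOP (10)
    wait_type,
    wait_time_ms,
    waiting_tasks_count,
    wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'BROKER_TASK_STOP',
                        N'SQLTRACE_BUFFER_FLUSH', N'LAZYWRITER_SLEEP',
                        N'WAITFOR', N'REQUEST_FOR_DEADLOCK_SEARCH')
ORDER BY wait_time_ms DESC;
```

High LCK_M_* waits would point at the suspected locking problem; high CXPACKET or PAGEIOLATCH_* waits would point elsewhere, which is the whole value of measuring before rewriting.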
Context
StackExchange Database Administrators Q#49947, answer score: 17