
Could an update on all statistics separately cause bad performance?


Problem

We've been using Ola Hallengren's maintenance scripts for index optimization on our OLTP databases for years (we are on SQL Server 2008 R2).
Two weeks ago, however, one of the DBAs here created a maintenance job that updates statistics separately; it runs weekly, before the index optimization job.
Since this job was created we have been getting performance issues on a handful of stored procedures. I am not too familiar with statistics maintenance, so I am wondering: is it really necessary to run a separate weekly job to update all statistics?

The DBA here thought this was needed, but the badly performing stored procedures appear the morning after the maintenance jobs run, and I then have to recompile the same stored procedures once or twice a day for the next few days before the issue stops on its own.
My theory points to the earlier statistics updates, but I am having trouble finding the reason why. Don't statistics get updated after a certain number of changes, or during an index rebuild, anyway? Why would recalculating them separately cause an issue?

Would anyone be able to offer some clarity as to why this would happen to the same stored procedures every time statistics get updated?

This is the command being used for the statistics job:

EXECUTE dbo.IndexOptimize 
    @Databases = 'DB_PRO', 
    @FragmentationLow = NULL, 
    @FragmentationMedium = NULL, 
    @FragmentationHigh = NULL, 
    @UpdateStatistics = 'ALL', 
    @OnlyModifiedStatistics = 'Y' , 
    @TimeLimit = 3600, 
    @LogToTable = N'Y';


Then the Index Maintenance job runs with the following :

sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[IndexOptimize] 
    @Databases = 'DB_PRO', @TimeLimit = 10800, @LogToTable = 'Y'" -b

Solution

Question: I am not too familiar with stat maintenance, so I am wondering, is it really necessary to run a weekly job separately to update all statistics?

Yes, you need to maintain your statistics if you have large tables. Here is how the auto update statistics option works:

Auto Update Statistics Option: statistics are checked before query compilation or before executing a cached query plan. Statistics are considered out-of-date when:

- There was a data change on an empty table.

- The number of rows in the table was 500 or less at the time of statistics creation, and the column modification counter of the leading column of the statistics object has changed by more than 500 since then.

- The table had more than 500 rows when the statistics were gathered, and the column modification counter of the leading column of the statistics object has changed by more than 500 + 20% of the number of rows in the table when the statistics were gathered.

- A table in tempdb with less than 6 rows had at least 6 row modifications.
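
As an illustration (a sketch, not part of the original answer): on SQL Server 2008 R2 SP2 and later, sys.dm_db_stats_properties exposes the modification counter per statistics object, so you can compare it against the 500 + 20% threshold yourself.

-- Sketch: how close is each statistics object to the auto-update threshold?
-- Requires SQL Server 2008 R2 SP2 or later for sys.dm_db_stats_properties.
SELECT OBJECT_NAME(s.object_id) AS table_name,
       s.name                   AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter,
       500 + 0.20 * sp.rows     AS auto_update_threshold
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE sp.modification_counter > 0
ORDER BY sp.modification_counter DESC;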

Question: Don't statistics get updated after a certain amount of changes or during an index rebuild anyway?

Rebuilding an index, for example by using ALTER INDEX ... REBUILD, will also update index statistics with the equivalent of using WITH FULLSCAN, unless the table is partitioned, in which case the statistics are only sampled (applies to SQL Server 2012 and later). Rebuilding indexes does not update column statistics.

Reorganizing an index, for example using ALTER INDEX ... REORGANIZE, does not update any statistics.

Question: Why would recalculating them separately cause an issue?

It might be because your sample rate is too low and the statistics are not accurate. Use FULLSCAN to see if that helps. It might also be that the cardinality estimator misinterprets the statistics (this can happen in SQL Server 2014); if that is the case, try trace flag 9481 to fall back to the legacy cardinality estimator in SQL Server 2014.
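
For example (dbo.SomeTable is a placeholder, and QUERYTRACEON requires sysadmin permissions or a plan guide):

-- Rebuild the suspect statistics with a full scan instead of a sample:
UPDATE STATISTICS dbo.SomeTable WITH FULLSCAN;

-- On SQL Server 2014, a single query can fall back to the legacy
-- cardinality estimator via trace flag 9481:
SELECT COUNT(*) FROM dbo.SomeTable OPTION (QUERYTRACEON 9481);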

The StatisticsSample parameter of Ola Hallengren's script gives you this:


Indicate, as a percentage, how much of a table is gathered when
updating statistics. A value of 100 is equivalent to a full scan. If
no value is specified, then SQL Server automatically computes the
required sample.
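
Applied to the job above, forcing a full scan would look like this (a sketch; only the @StatisticsSample line is new relative to the original command):

EXECUTE dbo.IndexOptimize
    @Databases = 'DB_PRO',
    @FragmentationLow = NULL,
    @FragmentationMedium = NULL,
    @FragmentationHigh = NULL,
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y',
    @StatisticsSample = 100,   -- 100 is equivalent to WITH FULLSCAN
    @TimeLimit = 3600,
    @LogToTable = 'Y';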

Context

StackExchange Database Administrators Q#184096, answer score: 5
