Huge Heap Table and table compression on SQL Server 2016
Problem
My database size is huge, and recently I noticed that a new table added few months ago is the culprit.
Here is the table script.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Entry_tracker](
[S.Number] [int] IDENTITY(1,1) NOT NULL,
[EntryId] [varchar](50) NOT NULL,
[EventNumber] [varchar](18) NOT NULL,
[Data] [varbinary](max) NOT NULL,
[TrackDateTime] [datetime] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
ALTER TABLE [dbo].[Entry_tracker] ADD CONSTRAINT [DF_Entry_tracker_TrackDateTime] DEFAULT (getdate()) FOR [TrackDateTime]
GO

I collected information on this table and learned that it has about 16 million rows and a total size of 1.4 TB.
Still, I think that the table size is huge for the number of records.
This table is not queried by the application; it just stores different versions of the same entries from another table.
I checked the fragmentation information: average fragmentation is 0, with 93% average space used.
Since I am using SQL Server 2016, I thought I could use table compression, so I tried sp_estimate_data_compression_savings to estimate the possible space savings. However, the results show no savings: size_with_current_compression_setting(KB) is equal to size_with_requested_compression_setting(KB).

Can anyone direct me in finding the issue with this table?
It is really important, as this is taking a lot of disk space.
Appreciate your help.
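For reference, the estimate described above can be reproduced with a call along these lines (a sketch; the schema, table name, and heap index_id of 0 are taken from the table script shown):

```sql
-- Estimate PAGE compression savings for the heap (index_id = 0).
-- Note: ROW/PAGE compression only applies to in-row data; varbinary(max)
-- values stored off-row as LOB data are not compressed, which can explain
-- an estimate that shows no savings on a table like this one.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'Entry_tracker',
    @index_id         = 0,          -- 0 = heap (this table has no clustered index)
    @partition_number = NULL,       -- NULL = all partitions
    @data_compression = 'PAGE';
```

The procedure samples the table into tempdb and reports current vs. requested sizes, so it needs to be run against the live server.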
Solution
"Still, I think that the table size is huge for the number of records."

Why? What makes you say this?

"It just stores different versions of the same entries from another table."

So it holds versions, which could be large or small, especially given:

[Data] varbinary(max) NOT NULL

Have you checked DATALENGTH() on the rows to see if some large rows in there are eating space? Based on the data and what the table does, the size seems reasonable to me. Versioning is rarely small unless you store only the changed bytes and use algorithms to work out which bytes changed and whether each version builds on the previous one.
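A check along these lines (a sketch, using the table and column names from the question) would surface oversized rows and the average payload size:

```sql
-- Largest versions stored in [Data], to spot rows eating space.
SELECT TOP (20)
    [S.Number],
    [EntryId],
    DATALENGTH([Data]) AS data_bytes
FROM dbo.Entry_tracker
ORDER BY DATALENGTH([Data]) DESC;

-- Average payload size across the table.
SELECT COUNT(*) AS row_count,
       AVG(CAST(DATALENGTH([Data]) AS bigint)) AS avg_data_bytes
FROM dbo.Entry_tracker;
```

For scale: 1.4 TB spread over about 16 million rows works out to roughly 85-90 KB per row on average, which is consistent with each row carrying a sizeable binary version rather than anything being wrong with the table.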
Context
StackExchange Database Administrators Q#240590, answer score: 6