SQL Server table with 4,000,000 rows is 40GB
Problem
I have been trying to figure out why our production database has grown so much recently, and I have found the cause to be an audit table. We are going to archive the table to fix the problem, but I just wanted to check whether this size looks normal for a large SQL Server table like this.
It has 4,000,000 rows with the following structure...
Name Type Nullable
AuditLogId bigint no
UserName nvarchar no
TimeOfChange datetime no
ObjectName nvarchar no
ChangeName nvarchar no
RecordId int yes
OriginalValues nvarchar yes
ResultingValues nvarchar no

With a primary key on the AuditLogId:
ALTER TABLE [dbo].[AuditLog] ADD CONSTRAINT [PK_AuditLog] PRIMARY KEY CLUSTERED
(
[AuditLogId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 80) ON [PRIMARY]
GO

There is also a foreign key to the record that was updated on RecordId.
The nvarchar fields seem to have up to 12,000 characters with an average of around 8,000 characters.
The table is currently 40 GB in size; does that seem right? I can provide more information on the table if that question can't be answered with what I have posted above.
Thanks
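(As a sanity check, not from the original post: the reported size and the column-length statistics can both be verified directly. Table and column names are taken from the question above.)

```sql
-- Confirm the allocated size reported for the table.
EXEC sp_spaceused N'dbo.AuditLog';

-- Verify the length stats for the large nvarchar columns.
-- DATALENGTH returns bytes; nvarchar stores 2 bytes per character,
-- so 8,000 characters shows up as roughly 16,000 bytes here.
SELECT
    AVG(CAST(DATALENGTH(OriginalValues)  AS bigint)) AS avg_bytes_original,
    MAX(DATALENGTH(OriginalValues))                  AS max_bytes_original,
    AVG(CAST(DATALENGTH(ResultingValues) AS bigint)) AS avg_bytes_resulting,
    MAX(DATALENGTH(ResultingValues))                 AS max_bytes_resulting
FROM dbo.AuditLog;
```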
Solution
You have 4,000,000 rows and one of the columns averages 8,000 characters (16,000 bytes, I assume).
SELECT CONVERT(bigint,4000000) * /* b */ 16000 / /*kb*/ 1024 / /*mb*/ 1024;
------
61,035

If your stats are accurate, I'd expect this table to be roughly 61 GB. (I wonder if you mean 8,000 bytes, not 8,000 characters, in which case I'd expect > 30 GB.)

Other factors include:
Other factors include:
- data in other columns I didn't account for in the above calculation
- additional indexes
- a fill factor below 100 (your clustered PK specifies FILLFACTOR = 80, leaving 20% of each leaf page empty after a rebuild), and
- fragmentation and space still occupied by deleted rows or page splits
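To see which of these factors is actually consuming the 40 GB, the allocation can be broken down per index and per allocation type. This is a sketch assuming the schema and table name from the question; values over ~4,000 bytes in the nvarchar columns will show up under ROW_OVERFLOW_DATA or LOB_DATA rather than IN_ROW_DATA.

```sql
-- Break the table's space down by index and allocation type.
SELECT
    i.name                          AS index_name,
    p.rows,
    au.type_desc                    AS allocation_type,  -- IN_ROW_DATA, ROW_OVERFLOW_DATA, LOB_DATA
    SUM(au.total_pages) * 8 / 1024  AS total_mb,         -- pages are 8 KB each
    SUM(au.used_pages)  * 8 / 1024  AS used_mb
FROM sys.indexes AS i
JOIN sys.partitions AS p
    ON p.object_id = i.object_id AND p.index_id = i.index_id
JOIN sys.allocation_units AS au
    ON au.container_id = p.partition_id
WHERE i.object_id = OBJECT_ID(N'dbo.AuditLog')
GROUP BY i.name, p.rows, au.type_desc
ORDER BY total_mb DESC;
```

A large gap between total_mb and used_mb, or a large share under the nonclustered indexes, would point at reserved-but-unused space or extra indexes rather than the row data itself.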
Context
StackExchange Database Administrators Q#247347, answer score: 17