Reclaiming disk space after removing a table field
Problem
I am running SQL Server 2008 R2. The database worked fine and fast for the last 3 years, until about 3 months ago, when we added an ntext column to a very active, heavily used table.
Now we are running out of server space because of this table's huge growth.
I have read about shrinking, but we do not want to lose the indexing of the db because it has been performing well for years, and we do not want fragmentation to grow.
We decided to delete that field and all its values.
Is there a way to delete the ntext field and all its values and release the space without removing indexing, without shrinking, and without losing db performance?
I am attaching the db size query output to show the size growth over the last 5 months.
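Before dropping the column, it helps to confirm how much of the footprint is actually this table. A minimal sketch using sp_spaceused and sys.database_files; dbo.YourBigTable is a placeholder, not a table from the original post:

```sql
-- Space used by a single table, including its LOB (ntext) pages.
-- dbo.YourBigTable is a hypothetical name; substitute your real table.
EXEC sp_spaceused @objname = N'dbo.YourBigTable', @updateusage = N'TRUE';

-- Overall data/log file sizes for the current database, to track growth.
-- size is reported in 8-KB pages, so multiply by 8 and divide by 1024 for MB.
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files;
```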
Solution
We decided to delete that field and all its values. Is there a way to delete the ntext field and all its values and release the space without removing indexing, without shrinking, and without losing db performance?
I would recommend using DBCC CLEANTABLE (from Books Online):

DBCC CLEANTABLE
(
    { database_name | database_id | 0 }
    , { table_name | table_id | view_name | view_id }
    [ , batch_size ]
)
[ WITH NO_INFOMSGS ]

DBCC CLEANTABLE reclaims space after a variable-length column is dropped. A variable-length column can be one of the following data types: varchar, nvarchar, varchar(max), nvarchar(max), varbinary, varbinary(max), text, ntext, image, sql_variant, and xml. The command does not reclaim space after a fixed-length column is dropped.

!! CAUTION !! (use a careful batch size - it is advisable to use this parameter if your table is massive):

DBCC CLEANTABLE runs as one or more transactions. If a batch size is not specified, the command processes the whole table in one transaction and the table is exclusively locked during the operation. For some large tables, the length of the single transaction and the log space required may be too much. If a batch size is specified, the command runs in a series of transactions, each including the specified number of rows. DBCC CLEANTABLE cannot be run as a transaction inside another transaction.

This operation is fully logged.

A simple repro will prove that DBCC CLEANTABLE is better than shrinking (and no worry of fragmentation :-)

-- clean up
if object_id('dbo.Test') is not null
    drop table dbo.Test;
-- create test table with ntext column that we will drop later
create table dbo.Test (
col1 int
,col2 char(25)
,col3 ntext
);
-- insert 1000 rows of test data
declare @cnt int;
set @cnt = 0;
while @cnt < 1000
begin
select @cnt = @cnt + 1;
insert dbo.Test (
col1
,col2
,col3
)
values (
@cnt
,'This is a test row # ' + CAST(@cnt as varchar(10)) + 'A'
,REPLICATE('KIN', ROUND(RAND() * @cnt, 0))
);
end

-- drop the ntext column
ALTER TABLE dbo.Test DROP COLUMN col3;

-- reclaim the space from the table
-- Note that my table only has 1000 records, so I have not used a batch size.
-- YMMV .. so find a maintenance window and choose an appropriate batch size.
-- TEST, TEST and TEST before implementing in PROD, so you know the outcome !!
DBCC CLEANTABLE('tempdb', 'dbo.Test');
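On a large, busy production table you would not want the single-transaction form above. A hedged sketch of the batched variant, assuming a 5,000-row batch size fits your maintenance window (the 5,000 figure is an illustration, not a recommendation from the original answer):

```sql
-- Process 5,000 rows per transaction, so the exclusive lock is held briefly
-- and the fully logged work is spread across many small transactions.
DBCC CLEANTABLE('tempdb', 'dbo.Test', 5000) WITH NO_INFOMSGS;

-- Confirm afterwards that the table's reserved space has dropped.
EXEC sp_spaceused @objname = N'dbo.Test';
```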
Context
StackExchange Database Administrators Q#117391, answer score: 14