Tuesday, September 15, 2009

Use batches to delete a large number of records in a table

Sometimes you need to delete a large number of records from a table. This can happen, for instance, in log tables. With a simple DELETE statement like:

DELETE FROM MyTable WHERE COL1 = 1

it can take a while when you have, for instance, 1 million records to delete. It can also result in a table lock, which has a negative impact on the performance of your application.

As of SQL Server 2005 you can delete records from a table in batches with the DELETE TOP (BatchSize) statement. This method has three advantages:
  1. It will not create one big transaction.
  2. It avoids a table lock.
  3. If the delete statement is canceled, only the current batch is rolled back. Records deleted in previous batches stay deleted.
You can use the following script to test this DELETE TOP () statement.

CREATE TABLE DEMO (COL1 INT, COL2 INT)

-- Insert one record that should survive the delete
INSERT INTO DEMO (COL1, COL2) VALUES (2, 2)

-- Insert 49,999 records that will be deleted
DECLARE @COUNTER INT
SET @COUNTER = 1

WHILE @COUNTER < 50000
BEGIN

INSERT INTO DEMO (COL1, COL2) VALUES (1, @COUNTER)
SET @COUNTER = @COUNTER + 1

END

/*
-- Show content of the table
SELECT COL1, COUNT(*) FROM DEMO GROUP BY COL1
*/

-- Delete records in batches of 1000 records

DECLARE @BatchSize INT
SET @BatchSize = 1000

WHILE @BatchSize <> 0
BEGIN

DELETE TOP (@BatchSize)
FROM DEMO
WHERE COL1 = 1

-- @@ROWCOUNT is 0 when no matching rows are left, which ends the loop
SET @BatchSize = @@ROWCOUNT

END

-- SELECT * FROM DEMO -- Now we have only 1 record left
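As a refinement (a suggestion, not part of the original script), you can pause briefly between batches with WAITFOR DELAY. This gives concurrent sessions a chance to acquire locks and do their work while the delete is running:

-- Variant: the same batched delete, with a short pause between batches
DECLARE @BatchSize INT
SET @BatchSize = 1000

WHILE @BatchSize <> 0
BEGIN

DELETE TOP (@BatchSize)
FROM DEMO
WHERE COL1 = 1

SET @BatchSize = @@ROWCOUNT

-- Give other sessions a chance to run between batches
WAITFOR DELAY '00:00:01'

END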



Comments:

Ronald Voets said...

Does this also work for UPDATE statements, so update 1 million records in batches of 1000?

André van de Graaf said...

Ronald, thanks for your question. No, unfortunately this does not work for update statements.

aap said...

For an update statement you can use another table to insert from

Unknown said...

Hi,

Thanks for your post.
I wanted to know whether we can extend the same technique when we have more than one table from which we have to delete data.
Regards,
Prabhakar

André van de Graaf said...

Prabhakar,

yes, of course you can.

Gr. André

Unknown said...

Thanks Andre.
I've tried the same but I'm currently having trouble getting it right.
Can you please help me with same?

Regards,
Prabhakar
