One thing I've done when I had to update a lot of live records, and wasn't concerned with rolling them back, is to write an update that works in small chunks. Use whatever option your SQL dialect has to limit the number of rows updated or deleted, such as TOP in T-SQL. Then stick the whole thing in a WHILE loop that runs the SQL, sleeps for a few seconds, and loops as long as there are any rows left to process.
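
A batched UPDATE using that pattern might look like the sketch below. The table, columns, and values here are hypothetical, and one thing to watch: the WHERE clause has to stop matching rows once they're updated, or the loop will never terminate.

    WHILE (1 = 1)
    BEGIN
        -- hypothetical table/columns; touch at most 10,000 rows per pass
        UPDATE TOP (10000) dbo.orders
        SET status = 'archived'
        WHERE status = 'closed';   -- updated rows no longer match, so the loop ends

        IF @@ROWCOUNT = 0 BREAK;

        WAITFOR DELAY '00:00:05';  -- brief pause so other work can get through
    END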

Here is one I used to delete all the detail records for one very large contract from a live system, slightly altered to protect the innocent. The loop exits when it runs out of records, because @@ROWCOUNT holds how many rows the last update/insert/delete affected in T-SQL.

WHILE (1 = 1)
BEGIN
    DELETE TOP (10000) FROM spot
    WHERE contract_id = 1234;

    IF @@ROWCOUNT = 0 BREAK;

    -- wait 10 seconds between batches to relieve congestion on the server
    WAITFOR DELAY '00:00:10';
END

Not exactly the same problem. I did this because deleting more than a million rows at once was clogging up the transaction log and slowing everything to a crawl. However, with a bit of string manipulation and EXEC you could use this solution to scale your method up to the really big problems.
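
As a sketch of what that could look like, here's one way to build the loop with dynamic SQL via sp_executesql. The table name and contract id are hypothetical stand-ins; note that object names can't be passed as parameters, so the table name gets spliced in with QUOTENAME while the value goes in as a real parameter.

    DECLARE @table sysname = N'spot';   -- hypothetical target table
    DECLARE @contract_id int = 1234;    -- hypothetical key to purge
    DECLARE @sql nvarchar(max);

    -- object names can't be parameterized, so splice the table name in
    -- with QUOTENAME; the id is bound as a proper parameter below
    SET @sql = N'
    WHILE (1 = 1)
    BEGIN
        DELETE TOP (10000) FROM ' + QUOTENAME(@table) + N'
        WHERE contract_id = @id;

        IF @@ROWCOUNT = 0 BREAK;

        WAITFOR DELAY ''00:00:10'';
    END';

    EXEC sp_executesql @sql, N'@id int', @id = @contract_id;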

Jay