I have a table Employee which has 4 columns: ID, Name, Salary and Designation.


In this table I have 25 lakh (2.5 million) records, and I want to update the Designation column for every row where Salary > 25k.

How can I update the table in a more efficient way than a single update query?

Writing a single update query fills the transaction log and is time-consuming.

What I have tried:

Update employee set Designation='SSA, where salary >25
Posted
Updated 30-Nov-22 18:37pm
Comments
0x01AA 28-Nov-22 14:44pm    
I think the key question here is: do you rely on the log?
If so, you have to live with it.
If not, you have to check the backup settings.
Maciej Los 28-Nov-22 14:46pm    
Your query is wrong.
0x01AA 28-Nov-22 15:04pm    
That is only a simple typo ;)
But it is a known problem in MS SQL Server that the transaction log grows when an update affects a lot of rows.

You have no choice but to write an update query.

The probable problem with the transaction log is that the drive it's on is full or very close to full. You have to do a little transaction log maintenance to fix that.

sql server transaction log full - Google Search[^]
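Before doing any maintenance, it is worth checking why the log cannot truncate and how full it actually is. A minimal diagnostic sketch using standard SQL Server views (the database name is taken from the linked question and is an assumption here):

```sql
-- Why the log cannot truncate, and which recovery model is in effect:
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM   sys.databases
WHERE  name = 'a_TEST';  -- database name assumed from the linked question

-- Log file size and percentage used, for every database on the instance:
DBCC SQLPERF (LOGSPACE);
```

If log_reuse_wait_desc shows ACTIVE_TRANSACTION, a single huge update is holding the log open, which matches the error message the OP reported.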
 
Comments
Richard Deeming 29-Nov-22 4:12am    
Based on his question last week, you're spot on: his drive is full.
The transaction log for database 'a_TEST' is full due to 'ACTIVE_TRANSACTION'[^]
Dave Kreskowiak 29-Nov-22 7:59am    
Oh joy, another one who doesn't listen. What are the odds the question will show up again some time this week?
For updating large volumes of data, there are ways to optimize:
1. Replace the update statement with a bulk-insert statement.
2. Remove the index on the column being updated, if any. Once the update completes, the index can be recreated.
3. Triggers will slow the update down dramatically; disable them if any exist.
You can also look at the execution plan of the update statement to check where it is slowing down.

Hope this helps.
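Another commonly recommended option, not in the list above, is to update in batches so that each transaction stays small and, under the SIMPLE recovery model, the log space can be reused between batches. A sketch, assuming the table name dbo.Employee, a Salary threshold of 25000 (reading the question's "25k"), and the new Designation value 'SSA' from the question:

```sql
-- Batched update: commits 10,000 rows at a time so no single
-- transaction holds the log open for millions of rows.
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) dbo.Employee
    SET    Designation = 'SSA'
    WHERE  Salary > 25000
       AND (Designation <> 'SSA' OR Designation IS NULL); -- skip rows already updated

    SET @rows = @@ROWCOUNT;  -- 0 once no qualifying rows remain, ending the loop
END
```

The guard on Designation is essential: without it, TOP (10000) would keep picking the same already-updated rows and the loop would never terminate. The batch size is a tuning knob; larger batches finish faster but hold the log longer.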
 
 
Comments
Richard Deeming 1-Dec-22 4:39am    
Bulk insert isn't going to help with an update - the clue is in the name!

Also, the problem the OP is having is that he has run out of disk space. Rather than investigating and resolving the actual problem, he keeps reposting his question in the hopes that someone will provide a magic fix that lets him update his database without using any disk space to do it.
Vert12 1-Dec-22 7:58am    
I have resolved the issue with the transaction logs; that is why I left the other question and posted a new one, to see if there is an easier way to update. I can now update without the transaction log filling up the disk, but the update query takes a long time and the transaction log still grows each time, so I was looking for a more efficient way of updating. It is your assumption that I am reposting the same question; as I see it, these are two different cases.
Moreover, my alternative solution for the update is: I create a new temp table, insert from the old table into the new temp table with my desired values, and rename the new temp table to the old table's name.
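That rebuild approach can be sketched as below. SELECT ... INTO is minimally logged under the SIMPLE or BULK_LOGGED recovery model, which is why it can beat a large in-place UPDATE. Table and column names are assumed from the question:

```sql
-- Rebuild instead of update: write every row once into a new table,
-- applying the new Designation on the way.
SELECT ID,
       Name,
       Salary,
       CASE WHEN Salary > 25000 THEN 'SSA' ELSE Designation END AS Designation
INTO   dbo.Employee_New
FROM   dbo.Employee;

-- Swap the tables. Note: SELECT INTO copies no indexes, constraints,
-- triggers or permissions -- those must be recreated on the new table.
EXEC sp_rename 'dbo.Employee', 'Employee_Old';
EXEC sp_rename 'dbo.Employee_New', 'Employee';
-- After verifying the result: DROP TABLE dbo.Employee_Old;
```

The trade-off is the extra disk space for a full second copy of the table while both exist, plus the work of recreating indexes and constraints.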

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)



CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900