Hi,

I am working on a project in the Insurance and Finance domain. It is a web-based application using EDMX (Entity Framework). The platforms we are using are ASP.NET 4.5 and SQL Server 2008. We have many screens, and there are many places from which duplicate records can get saved. We have put unique non-clustered indexes on the database tables, but we are not doing any explicit concurrency handling.

I am not sure why our technical directors have not implemented concurrency handling, but I guess it might be because of performance concerns. We have a lot of tables, and a large number of inserts, updates, and deletes are being fired.

So, if we implement concurrency handling explicitly, could we face serious performance issues?


Thank you.

Regards,
Nivedita

If concurrency handling is done properly, I believe you won't face any performance problems.

The key thing, in my opinion, is to use optimistic concurrency handling instead of locking for longer periods of time. The technique for this is to use row versions.

For starters, have a look at, for example, Data versioning in SQL Server using row versions[^].

Also, by including an RVV-like identifier in your updating DML statements, you can easily spot potential lost updates. This is something that won't be caught by unique constraints or locks when the updates are sequential :)
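To illustrate the idea: SQL Server's rowversion column type isn't available outside SQL Server, so this sketch uses Python's built-in sqlite3 module and a hypothetical policy table with a plain integer version column standing in for rowversion. The optimistic pattern is the same: every UPDATE carries the version it originally read, and a row count of zero means someone else changed the row in between (a lost update about to happen).

```python
import sqlite3

# In-memory database with a hypothetical "policy" table.
# The "version" column plays the role of SQL Server's rowversion.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE policy (id INTEGER PRIMARY KEY, premium REAL, version INTEGER)"
)
conn.execute("INSERT INTO policy VALUES (1, 100.0, 1)")

def update_premium(conn, policy_id, read_version, new_premium):
    """Optimistic update: succeeds only if the row still has the
    version we read earlier; otherwise reports a would-be lost update."""
    cur = conn.execute(
        "UPDATE policy SET premium = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_premium, policy_id, read_version),
    )
    return cur.rowcount == 1  # 0 rows touched => someone else updated first

# Two users read the same row (both see version 1)...
first = update_premium(conn, 1, 1, 120.0)   # wins, bumps version to 2
second = update_premium(conn, 1, 1, 150.0)  # stale version -> rejected
print(first, second)  # True False
```

In the real application the rejected update would be surfaced to the user ("this record was changed by someone else, please reload"), which is exactly the lost-update case that unique indexes alone cannot detect.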
 
 
Thanks for the response, Mika... I will definitely dig more into your suggestion!
 
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


