Hello All,

I am using SQL Server 2008.
I have a Windows service which is used to insert/update records in a table.
I am calling a stored procedure once for each record.
My table has a column named ID [uniqueidentifier], and my stored procedure looks something like the one below. The ID should remain the same for updated records, and for a new record I generate a NEWID().

SQL
CREATE PROC NAME_XXXXXXX
(
    @P1  nvarchar(255),
    @P2  nvarchar(255),
    @P3  nvarchar(255),
    @P4  nvarchar(255) = null,
    @P5  nvarchar(50)  = null,
    @P6  nvarchar(50)  = null,
    @P7  nvarchar(50)  = null,
    @P8  nvarchar(50)  = null,
    @P9  int           = null,
    @P10 nvarchar(50)  = null
)
AS
BEGIN
    -- Update the existing row if one matches on P1 and P3
    IF EXISTS (SELECT TOP 1 1 FROM TABLE_XYZ WHERE P1 = @P1 AND P3 = @P3)
    BEGIN
        UPDATE TABLE_XYZ
        SET P2  = @P2,
            P4  = @P4,
            P5  = @P5,
            P10 = @P10,
            P6  = @P6,
            P7  = @P7,
            P8  = @P8,
            P9  = @P9
        WHERE P1 = @P1 AND P3 = @P3
    END
    ELSE
    BEGIN
        -- Otherwise insert a new row with a freshly generated ID
        INSERT INTO TABLE_XYZ (ID, P1, P2, P4, P3, P5, P10, P6, P7, P8, P9)
        VALUES (NEWID(), @P1, @P2, @P4, @P3, @P5, @P10, @P6, @P7, @P8, @P9)
    END
END


How can I improve the performance?

Thanks,
Debata

I already answered a similar question a few months ago; see these:

Where to store 1 billion records[^]
Sql Bulk Insert/Update[^]
Running millions of inserts.[^]
ADO.NET and OPENXML to Perform Bulk Database Operations[^]
You can find more on CodeProject and Google.
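
A minimal sketch of one such bulk pattern that SQL Server 2008 supports: send the whole batch in a single call through a table-valued parameter and upsert it with one MERGE statement. The type and procedure names below (TABLE_XYZ_TYPE, NAME_XXXXXXX_BULK) are made up for illustration and are not taken from the linked answers.

SQL
-- A user-defined table type matching the columns being upserted (assumed name)
CREATE TYPE TABLE_XYZ_TYPE AS TABLE
(
    P1  nvarchar(255),
    P2  nvarchar(255),
    P3  nvarchar(255),
    P4  nvarchar(255),
    P5  nvarchar(50),
    P6  nvarchar(50),
    P7  nvarchar(50),
    P8  nvarchar(50),
    P9  int,
    P10 nvarchar(50)
);
GO

-- One round trip upserts the whole batch instead of one call per row
CREATE PROC NAME_XXXXXXX_BULK
(
    @Rows TABLE_XYZ_TYPE READONLY
)
AS
BEGIN
    MERGE TABLE_XYZ AS target
    USING @Rows AS source
        ON target.P1 = source.P1 AND target.P3 = source.P3
    WHEN MATCHED THEN
        UPDATE SET P2 = source.P2, P4 = source.P4, P5  = source.P5,
                   P6 = source.P6, P7 = source.P7, P8  = source.P8,
                   P9 = source.P9, P10 = source.P10
    WHEN NOT MATCHED THEN
        INSERT (ID, P1, P2, P3, P4, P5, P6, P7, P8, P9, P10)
        VALUES (NEWID(), source.P1, source.P2, source.P3, source.P4, source.P5,
                source.P6, source.P7, source.P8, source.P9, source.P10);
END

On the client side the batch can be bound to a single SqlParameter of SqlDbType.Structured (for example from a DataTable), so the service makes one call per batch instead of one per record.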
 
Comments
Dylan Morley 5-Aug-11 6:26am    
Yep, bulk operations will speed this up
Sergey Alexandrovich Kryukov 5-Aug-11 6:54am    
Ha, this is quite a bunch, my 5.
--SA
0) You haven't told us HOW you're calling the stored proc from your service. However, if you're using the SqlConnection/SqlCommand approach, you can improve performance by creating and opening the connection and command objects only once per update/insert cycle, rather than once per record.

C#
using System.Data;            // CommandType, SqlDbType
using System.Data.SqlClient;  // SqlConnection, SqlCommand

// allocate your objects once, and dispose them when the whole batch is done
using (SqlConnection conn = new SqlConnection(connstring))
using (SqlCommand cmd = new SqlCommand("storedproc_name", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // open the connection once for the whole batch
    conn.Open();
    // add your parameters once, specifying the type of each parameter
    cmd.Parameters.Add("@P2", SqlDbType.NVarChar, 255);
    // ...
    cmd.Parameters.Add("@P9", SqlDbType.Int);

    // loop through all your updates, reusing the same command object
    for (int i = 0; i < list.Count; i++)
    {
        // set the parameter values for this record
        cmd.Parameters["@P2"].Value = list[i].P2Value;
        // ...
        cmd.Parameters["@P9"].Value = list[i].P9Value;
        // execute non-query
        cmd.ExecuteNonQuery();
    }
}

1) Put groups of records into XML strings and pass them to the stored proc in larger batches, for example as sketched below.
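
A minimal sketch of that XML idea, assuming the batch arrives as an xml parameter shaped like <rows><row P1="..." P2="..." P3="..." P9="..."/></rows>; the procedure name and XML shape are illustrative only, and the remaining Pn columns would be handled the same way.

SQL
CREATE PROC NAME_XXXXXXX_XML
(
    @Batch xml
)
AS
BEGIN
    -- Shred the XML batch into a temporary table once
    SELECT  r.value('@P1', 'nvarchar(255)') AS P1,
            r.value('@P2', 'nvarchar(255)') AS P2,
            r.value('@P3', 'nvarchar(255)') AS P3,
            r.value('@P9', 'int')           AS P9
    INTO    #Batch
    FROM    @Batch.nodes('/rows/row') AS x(r);

    -- Update the rows that already exist
    UPDATE  t
    SET     t.P2 = b.P2, t.P9 = b.P9
    FROM    TABLE_XYZ t
    JOIN    #Batch b ON t.P1 = b.P1 AND t.P3 = b.P3;

    -- Insert the rows that do not exist yet, generating a new ID for each
    INSERT INTO TABLE_XYZ (ID, P1, P2, P3, P9)
    SELECT  NEWID(), b.P1, b.P2, b.P3, b.P9
    FROM    #Batch b
    WHERE   NOT EXISTS (SELECT 1 FROM TABLE_XYZ t WHERE t.P1 = b.P1 AND t.P3 = b.P3);
END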
 
Comments
NDebata 5-Aug-11 6:14am    
Hi John,
I am opening the connection one time only
As posted by thatraja, you can massively speed this up by using some sort of bulk operation.

However, if that is not possible for whatever reason, a possible bottleneck in your SQL is this line:

Select top 1 1 from TABLE_XYZ Where P1=@P1 and P3=@P3


How is the object TABLE_XYZ indexed? Do you have an index on P1 and P3? Is there no way you can feed the unique identifier through to your stored procedure, so you can select like this:

Select top 1 1 from TABLE_XYZ Where ID=@UniqueID


What is your clustered index on the table? Uniqueidentifiers aren't very good candidates for clustered index fields; because of their random nature they will likely cause data reorganisation during inserts. A clustered index should be based on some sort of incrementing identifier.

http://msdn.microsoft.com/en-us/library/ms190639.aspx[^]

Looking at the data types of your SQL parameters, I think you need to analyse the table and the query plan; you can probably optimise the process here. Run Query -> Display Estimated Execution Plan. Are you causing table scans?
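
A minimal sketch of the indexing suggestion (the index and column names below are made up; whether the clustered-index change is appropriate depends on the real table definition):

SQL
-- Nonclustered index to support the WHERE P1 = @P1 AND P3 = @P3 lookup
CREATE NONCLUSTERED INDEX IX_TABLE_XYZ_P1_P3
    ON TABLE_XYZ (P1, P3);

-- If the clustered index is currently on the random uniqueidentifier ID column,
-- an incrementing surrogate key is usually a better candidate, for example:
-- ALTER TABLE TABLE_XYZ ADD RowSeq int IDENTITY(1,1) NOT NULL;
-- CREATE UNIQUE CLUSTERED INDEX IX_TABLE_XYZ_RowSeq ON TABLE_XYZ (RowSeq);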
 
Comments
NDebata 7-Aug-11 22:28pm    
Thanks a lot Dylan.
