Hi,
I am inserting about 1,000,000 records into a database using stored procedures, and the inserts take hours - even though I create a new SQL connection object and close the connection after each insert. The data goes through stored procedures and can target different tables.

Rather than running the code in a SqlCommand type of process, you should really consider delegating this to an external process and running your data in with bcp (the SQL Server bulk copy utility).
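For reference, a bcp invocation for a load like this might look something like the following (the database, table, file, and server names are placeholders):

bcp MyDatabase.dbo.CallRecords in calls.dat -S myserver -T -c -b 10000

Here -c loads the file in character format, -T uses a trusted connection, and -b commits every 10,000 rows as a separate transaction, which keeps the transaction log manageable on a million-row load.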
 
Comments
fjdiewornncalwe 5-May-11 9:15am    
+5. This would be my choice.
Pete O'Hanlon 5-May-11 9:40am    
Cheers.
Ed Nutting 5-May-11 12:24pm    
+5 Seems like the best option :)
Can you possibly do this using a BI tool like SSIS? And don't create a connection for each stored procedure call - keep the connection open until you are done. Hope this helps.
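To illustrate the "keep the connection open" point, here is a minimal C# sketch; the procedure name dbo.InsertCallRecord and its @Line parameter are invented for illustration:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class Loader
{
    public static void InsertAll(string connectionString, IEnumerable<string> lines)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open(); // open once for the whole run, not once per insert
            using (var cmd = new SqlCommand("dbo.InsertCallRecord", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                SqlParameter p = cmd.Parameters.Add("@Line", SqlDbType.VarChar, 512);
                foreach (string line in lines)
                {
                    p.Value = line;        // only the parameter value changes
                    cmd.ExecuteNonQuery(); // same connection, same command object
                }
            }
        } // the connection is closed once, here
    }
}

Reusing one open connection removes the connect/authenticate/close cost from every one of the million calls.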
 
Comments
NuttingCDEF 4-May-11 18:02pm    
"Keep the connection open" - agree wholeheartedly! My 5.
Ashishmau 5-May-11 0:47am    
Agreed, my 5.
1. Why start a new connection for every insert? Yes, close / reopen connections to avoid having lots of idle connections open to your database server - but not when you know that you have a lot of work to do and there is no likelihood of the connection being idle.

2. Are you forced to use a stored procedure? If not, why not use LINQ and let it handle the mechanics of SQL - add / submit records in batches (1,000?).

3. If you have to use a stored procedure, can you rewrite it to accept batches of records? (There is a sketch of this after the list.)

Any batch / bulk updates are likely to be more efficient than large numbers of single insertions - fewer round trips to / from the server - less overhead associated with multiple small queries.

4. How big are your records / how much data has to be transferred / what sort of connectivity do you have? e.g. could the time be partly / largely down to how long it takes to transfer that much data over the internet? If so, you need to focus on reducing the volume of data, not how your INSERT queries etc. work.

5. What indices do you have defined on your table? Updating indices can slow down inserts considerably, so don't have more indices than you require - especially not a clustered index unless you really need it! Similarly with relationships - what is actually necessary? Can you redesign your database to reduce any of these overheads?

Hope some of this helps.
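On point 3: if you are on SQL Server 2008 or later, a table-valued parameter lets a single stored procedure call carry a whole batch. A rough sketch - the type, procedure, and columns below are invented for illustration:

-- Assumed server-side definitions (illustrative only):
-- CREATE TYPE dbo.CallRecordType AS TABLE (CallerId varchar(20), DurationSec int);
-- CREATE PROCEDURE dbo.InsertCallRecords @Records dbo.CallRecordType READONLY
-- AS INSERT INTO dbo.CallRecords (CallerId, DurationSec)
--    SELECT CallerId, DurationSec FROM @Records;

DataTable batch = new DataTable();
batch.Columns.Add("CallerId", typeof(string));
batch.Columns.Add("DurationSec", typeof(int));
// ... fill with, say, 1,000 parsed rows ...

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.InsertCallRecords", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter p = cmd.Parameters.AddWithValue("@Records", batch);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.CallRecordType";
    conn.Open();
    cmd.ExecuteNonQuery(); // one round trip per 1,000 rows instead of 1,000 round trips
}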
 
You don't have to use a stored procedure. You can use a DataSet/DataTable in C#. Insert several DataRows into the DataSet/DataTable and then update the DataSet (save to the database).
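A rough sketch of that approach, using SqlDataAdapter's built-in batching (the target table and columns are placeholders):

DataTable table = new DataTable();
table.Columns.Add("CallerId", typeof(string));
table.Columns.Add("DurationSec", typeof(int));
// ... add rows: table.Rows.Add("5551234", 42); etc. ...

using (var conn = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter())
{
    var insert = new SqlCommand(
        "INSERT INTO dbo.CallRecords (CallerId, DurationSec) " +
        "VALUES (@CallerId, @DurationSec)", conn);
    insert.Parameters.Add("@CallerId", SqlDbType.VarChar, 20, "CallerId");
    insert.Parameters.Add("@DurationSec", SqlDbType.Int, 0, "DurationSec");
    insert.UpdatedRowSource = UpdateRowSource.None; // required when batching
    adapter.InsertCommand = insert;
    adapter.UpdateBatchSize = 1000; // group the inserts into batches of 1,000
    adapter.Update(table);          // pushes every Added row to the database
}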
 
Comments
Wonde Tadesse 5-May-11 21:02pm    
Perfect.
I just ran 65,000 records; the execution-time statistics were:

Minutes: 10
Seconds: 8
Milliseconds: 608,108.59 (about 10 minutes 8 seconds in total)


The data cannot simply be loaded from a data file, since it is parsed from a file originally and a different insert is required depending on the current line's value. It is not just one type of record.
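Even so, the parsed lines can be bucketed by record type in memory so that each target table still gets a batched insert. A speculative sketch - the type prefix and the CreateTableFor / ParseFieldsFor helpers are hypothetical:

var buckets = new Dictionary<string, DataTable>();
foreach (string line in File.ReadLines(path))
{
    string type = line.Substring(0, 3); // hypothetical record-type prefix
    DataTable table;
    if (!buckets.TryGetValue(type, out table))
    {
        table = CreateTableFor(type);   // hypothetical: one schema per record type
        buckets[type] = table;
    }
    table.Rows.Add(ParseFieldsFor(type, line)); // hypothetical field parser
}
// Each bucket can then be written in one batch - SqlBulkCopy, a TVP, or a batched adapter.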


This is for a CDR Application.




Thanks..
 
Comments
Ed Nutting 5-May-11 12:23pm    
Again, should be a comment.
The records are call detail records... I will try batch updating and see if it works - I'll research how to write a batch update. Thanks guys.
 
Comments
Ed Nutting 5-May-11 12:22pm    
This should be a comment.
