Hello,

I need to pass a .NET collection to a stored procedure. The collection contains 1,000 records, and each record contains 2-3 child objects of other types.

All I need is to insert them all at once with a single stored-procedure call from .NET.

I have already tried a TVP (table-valued parameter), but that simply doesn't fulfil my needs.

For each and every root-level record, I have to call the stored procedure, passing the root record and its corresponding child records. This makes the whole process very slow.

That's why I am looking for a way to pass the whole hierarchical list at once, so that inside the stored procedure I can loop over the child records and insert the parent and child records together.

How can I achieve this?


What I have tried:

Just a stored procedure taking two TVPs, one for the parent and the other for its related children.
Posted
Updated 24-Dec-16 10:15am

So, what you are saying is that you have structured data that consists of parent and child records, and while the parent schema is static, the children have differing schemas.

Is your intent to shred the data and span it across multiple tables in the database using a single stored procedure? The only reliable way to do this in a single stored procedure is to serialize the data to either XML or JSON (if you are working with SQL Server 2016) and have the stored procedure shred and load the data into the various tables.
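A minimal sketch of the JSON route, assuming SQL Server 2016+. The procedure, table, and column names below (dbo.ImportConsignments, Consignment.Reference, ConsignmentLine.Sku, etc.) are invented for illustration. The useful trick is that OPENJSON exposes each array element's index in its [key] column, which correlates every parent with its children, and MERGE ... OUTPUT can map those indexes to the generated identity values:

```sql
-- Hypothetical procedure: shred a JSON payload of parents with nested child arrays.
-- Requires SQL Server 2016+ (OPENJSON) and database compatibility level 130+.
CREATE PROCEDURE dbo.ImportConsignments
    @payload NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @map TABLE (RowNo int, ConsignmentId int);

    -- Insert all parents in one set; OUTPUT maps each array index to its new
    -- identity. (MERGE with ON 1 = 0 is used because a plain INSERT ... OUTPUT
    -- cannot reference source columns such as the array index.)
    MERGE dbo.Consignment AS tgt
    USING (SELECT CAST(j.[key] AS int) AS RowNo, p.Reference
           FROM OPENJSON(@payload) AS j
           CROSS APPLY OPENJSON(j.value)
                WITH (Reference nvarchar(50)) AS p) AS src
    ON 1 = 0
    WHEN NOT MATCHED THEN
        INSERT (Reference) VALUES (src.Reference)
    OUTPUT src.RowNo, inserted.ConsignmentId INTO @map;

    -- Children rejoin their parent through the same array index.
    INSERT dbo.ConsignmentLine (ConsignmentId, Sku, Qty)
    SELECT m.ConsignmentId, l.Sku, l.Qty
    FROM OPENJSON(@payload) AS j
    CROSS APPLY OPENJSON(j.value, '$.Lines')
         WITH (Sku nvarchar(50), Qty int) AS l
    JOIN @map AS m ON m.RowNo = CAST(j.[key] AS int);

    -- ConsignmentAddress rows would be loaded the same way from '$.Addresses'.
END
```

On the .NET side the whole List&lt;Consignment&gt; is serialized once (e.g. with a JSON serializer) and passed as a single NVARCHAR(MAX) parameter, so the entire hierarchy travels in one round trip.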
 
 
I've never had performance issues with SQL Server that were not caused by something I'd done myself. I don't use table-valued parameters.

You can either iterate the list and save the objects one at a time, or do as the first answer suggested and serialize all the data into XML, then pass the XML to the stored proc. I prefer separate calls myself, but that's just me.
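For the XML route the shape is similar; here is a hedged sketch using nodes()/value(), again with invented table, column, and element names. Because XML preserves nesting, each child node can be read relative to its parent element, and children can be joined back on a natural key (here the parent's Reference) to pick up the generated identity:

```sql
-- Hypothetical procedure taking the whole hierarchy as one XML parameter.
CREATE PROCEDURE dbo.ImportConsignmentsXml
    @doc XML
AS
BEGIN
    SET NOCOUNT ON;

    -- All parents in one set-based insert.
    INSERT dbo.Consignment (Reference)
    SELECT c.value('(Reference/text())[1]', 'nvarchar(50)')
    FROM @doc.nodes('/Consignments/Consignment') AS t(c);

    -- Children: shred each parent node, join on its Reference to recover the
    -- identity, then CROSS APPLY into that parent's own Line elements.
    INSERT dbo.ConsignmentLine (ConsignmentId, Sku)
    SELECT p.ConsignmentId,
           l.value('(Sku/text())[1]', 'nvarchar(50)')
    FROM @doc.nodes('/Consignments/Consignment') AS t(c)
    JOIN dbo.Consignment AS p
      ON p.Reference = c.value('(Reference/text())[1]', 'nvarchar(50)')
    CROSS APPLY c.nodes('Lines/Line') AS u(l);
END
```

This assumes Reference is unique per Consignment; if there is no natural key, a batch row number carried in the XML serves the same purpose.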
 
 
Comments
glitteringsound 24-Dec-16 14:54pm    
Previously I used Entity Framework for saving 100K records in the database, where each record further contains child records.

My object hierarchy is:

public class Consignment
{
    // ...
    public List<ConsignmentAddress> Addresses { get; set; }
    public List<ConsignmentLine> Lines { get; set; }
}

I could expect a Consignment count of 100K, and each Consignment could contain 2 ConsignmentLines and, on average, 3 ConsignmentAddresses.

So this all makes roughly 100K + 200K + 300K = 600K records.

For inserting all these 600K records I used Entity Framework, and it was painfully slow. Then I divided the file into chunks, parsed 1,000 records at a time, and inserted them using EF, recreating the context each time. That improved things considerably, but it still doesn't fulfil my requirement (in future I could expect 500K root-level records, each with 3-4 child records in each list, which could mean 1,200K records in the database).

Now I need a fast and efficient solution.

That's why I want to send the whole hierarchy at once, so that I can iterate over the collection inside SQL and insert the parent and child records there. Being that close to the metal, it should be far faster than Entity Framework.

At the moment I am iterating the List<Consignment> collection in a .NET loop and, for each Consignment together with its children (i.e. List<ConsignmentAddress> and List<ConsignmentLine>), calling the stored procedure with three TVPs (one for the Consignment, a second for List<ConsignmentAddress>, a third for List<ConsignmentLine>). This is also very slow, because 100K root records mean 100K stored-procedure calls.

Now I need a solution which takes the whole hierarchy, or at least a chunk of it, and pushes all the data at once while keeping the whole parent-child relationship intact.
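One way to cut the 100K calls down while keeping TVPs, as a hedged sketch: send a whole chunk per call, with every child row carrying its parent's position in the batch. The TVP type names, procedure name, and properties (Reference, Sku, City) below are assumptions, not the actual schema:

```csharp
// Sketch: one stored-procedure call per batch instead of per Consignment.
// Child rows carry ParentIdx (the parent's position in the batch) so the
// procedure can re-link them after inserting the parents as a set.
var parents = new DataTable();
parents.Columns.Add("Idx", typeof(int));
parents.Columns.Add("Reference", typeof(string));

var lines = new DataTable();
lines.Columns.Add("ParentIdx", typeof(int));
lines.Columns.Add("Sku", typeof(string));

var addresses = new DataTable();
addresses.Columns.Add("ParentIdx", typeof(int));
addresses.Columns.Add("City", typeof(string));

int idx = 0;
foreach (var c in consignments)            // one batch of, say, 5-10K consignments
{
    parents.Rows.Add(idx, c.Reference);
    foreach (var l in c.Lines) lines.Rows.Add(idx, l.Sku);
    foreach (var a in c.Addresses) addresses.Rows.Add(idx, a.City);
    idx++;
}

using (var cmd = new SqlCommand("dbo.ImportConsignmentBatch", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@parents", SqlDbType.Structured)
        { TypeName = "dbo.ConsignmentTvp", Value = parents });
    cmd.Parameters.Add(new SqlParameter("@lines", SqlDbType.Structured)
        { TypeName = "dbo.ConsignmentLineTvp", Value = lines });
    cmd.Parameters.Add(new SqlParameter("@addresses", SqlDbType.Structured)
        { TypeName = "dbo.ConsignmentAddressTvp", Value = addresses });
    cmd.ExecuteNonQuery();                 // one round trip for the whole batch
}
```

Inside the procedure, Idx/ParentIdx play the correlation role: insert all parents as one set, capture the generated identity per Idx, then insert each child table joined on ParentIdx.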
 
 
There is another option you might want to consider: adding and populating your referential-integrity fields (Id, ParentId, etc.) in your POCO (or a new POCO just for the insertion process) and then performing set-based inserts for each object type. You would use the SelectMany LINQ operator to flatten the nested IEnumerable sets into one set per type.

What slows down the insert process in EF is that you are inserting parent1-child1-child2, parent2-child3-child4, etc., and SQL Server cannot optimize that interleaved pattern.

If you were to insert in set-based operations, you would insert all the parent rows, then all the first child rows, and then all the second child rows, maintaining referential integrity while letting SQL Server optimize the inserts.
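A rough sketch of that idea, with assumed property names (Id, ConsignmentId) and client-assigned keys; it is the key assignment on the client that lets every level be inserted as an independent set:

```csharp
// Sketch: assign the keys client-side (a reserved sequence range, or GUIDs),
// flatten each level with SelectMany, then do one set-based insert per level.
long nextId = 1;                           // in practice, reserve a range from a
foreach (var c in consignments)            // SQL sequence, or use Guid.NewGuid()
    c.Id = nextId++;

var allLines = consignments
    .SelectMany(c => c.Lines, (c, l) => new { ConsignmentId = c.Id, Line = l })
    .ToList();

var allAddresses = consignments
    .SelectMany(c => c.Addresses, (c, a) => new { ConsignmentId = c.Id, Address = a })
    .ToList();

// Three set-based inserts (SqlBulkCopy, or one TVP per level): all parents
// first, then allLines, then allAddresses. Referential integrity holds because
// every child row already carries its parent's ConsignmentId.
```

This also answers the "which child belongs to which parent" question below: the relationship is no longer implied by call ordering but stored explicitly in each child row.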
 
 
Comments
glitteringsound 24-Dec-16 16:15pm    
How would SQL Server know which child belongs to which parent, and how many children a specific parent has, if I save the parents and children separately? For example, Consignment1 has 4 ConsignmentLines and 3 ConsignmentAddresses, while another Consignment has 2 ConsignmentLines and 5 ConsignmentAddresses.
 
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)



CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900