Hi,

I want to load a 2 GB XML file into SQL Server using SqlBulkCopy. The execution hangs without producing any result.
Can we read specific nodes, or filter the XML as it is read, instead of loading the whole file with:
C#
ds.ReadXml(Server.MapPath("~/Data.xml"))
Is this going to work in this case?

Is there any other option for loading such a huge XML file into the database?

What I have tried:

My Code:

C#
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;


namespace WebApplication1
{
    public partial class WebForm1 : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {

        }

        protected void Unnamed1_Click(object sender, EventArgs e)
        {
            string cs = ConfigurationManager.ConnectionStrings["CS"].ConnectionString;

            using (SqlConnection con = new SqlConnection(cs))
            {
                DataSet ds = new DataSet();

                // ReadXml parses the entire 2 GB file into memory before
                // returning - this is where the request appears to hang.
                ds.ReadXml(Server.MapPath("~/Data.xml"));


                DataTable mOb = ds.Tables["managedObject"];
                DataTable mOb_p = ds.Tables["p"];


                con.Open();


                using (SqlBulkCopy sb = new SqlBulkCopy(con))
                {
                    sb.DestinationTableName = "managedObject";
                    sb.ColumnMappings.Add("managedObject_Id", "managedObject_Id");
                    sb.ColumnMappings.Add("class", "class");
                    sb.ColumnMappings.Add("version", "version");
                    sb.ColumnMappings.Add("distName", "distName");
                    sb.ColumnMappings.Add("id", "id");
                    sb.ColumnMappings.Add("cmData_Id", "cmData_Id");
                    sb.BatchSize = 500000;
                    sb.BulkCopyTimeout = 10800;
                    sb.WriteToServer(mOb);
                }

                using (SqlBulkCopy sb = new SqlBulkCopy(con))
                {
                    sb.DestinationTableName = "p";
                    sb.ColumnMappings.Add("name", "name");
                    sb.ColumnMappings.Add("p_text", "p_text");
                    sb.ColumnMappings.Add("managedObject_Id", "managedObject_Id");
                    sb.BatchSize = 500000;
                    sb.BulkCopyTimeout = 10800;
                    sb.WriteToServer(mOb_p);
                }


            }
        }
    }
}
Comments
Garth J Lancaster 15-Jul-20 5:29am    
"Can we read specific nodes or filter the reading of xml-" - is that a question ? if so, the answer is apparently not, unfortunately .. it may be the column mappings that are taking time, but I'd have to see it's performance and memory stats really to try and decide

With no details of data.xml, except that it seems to contain two different types of data, what I would try is reducing the data-set size, in a number of steps:
1) just enough data to make sure the process works in general - maybe 100 records
2) split data.xml into the two types/nodes of data, as individual files - I think this is what you're asking in your first question, but I'd go even further, so that each node in each 'split by type' file only contains as much data as it needs and no more
3) wrap the bulk copies in a transaction - one per copy ...

[Edit] with a little work, you might also try the WriteToServerAsync method - a rough sketch combining it with the transaction from point 3) follows.
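A minimal sketch of that combination, reusing the table and column names from the question; the BulkLoader class and CopyManagedObjectsAsync helper are purely illustrative names, and the connection is assumed to be open already:

C#
using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static class BulkLoader
{
    // One bulk copy per transaction, as suggested above; awaiting keeps the
    // request thread free while SQL Server does the work.
    public static async Task CopyManagedObjectsAsync(SqlConnection con, DataTable mOb)
    {
        // con must already be open at this point.
        using (SqlTransaction tx = con.BeginTransaction())
        using (SqlBulkCopy sb = new SqlBulkCopy(con, SqlBulkCopyOptions.Default, tx))
        {
            sb.DestinationTableName = "managedObject";
            sb.ColumnMappings.Add("managedObject_Id", "managedObject_Id");
            sb.ColumnMappings.Add("class", "class");
            // ... remaining mappings as in the original code ...
            sb.BatchSize = 1000;        // far smaller than 500,000
            sb.BulkCopyTimeout = 10800;

            try
            {
                await sb.WriteToServerAsync(mOb);
                tx.Commit();            // all rows for this copy, or none
            }
            catch
            {
                tx.Rollback();          // leave the target table unchanged
                throw;
            }
        }
    }
}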

1 solution

Try reducing the BatchSize to 1000 - 500,000 seems way too high to be efficient.

It will be obvious within a second or two whether it's going to work, simply by monitoring the row count in the target table.

Another option is to read a limited number of records from the file at a time (I wouldn't load more than 5,000 or so) and then use SqlBulkCopy to insert them into the database - a rough sketch of that streaming approach follows.
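An untested sketch of that chunked approach, using XmlReader so the 2 GB file is streamed rather than loaded whole. The element and attribute names ("managedObject", "class", "distName", ...) are guessed from the question's column mappings and will need adjusting to the real schema of Data.xml:

C#
using System.Data;
using System.Data.SqlClient;
using System.Xml;
using System.Xml.Linq;

public static class ChunkedXmlLoader
{
    public static void Load(string xmlPath, SqlConnection con)
    {
        // Buffer a few thousand rows at a time instead of the whole document.
        DataTable buffer = new DataTable();
        buffer.Columns.Add("class", typeof(string));
        buffer.Columns.Add("version", typeof(string));
        buffer.Columns.Add("distName", typeof(string));
        buffer.Columns.Add("id", typeof(string));

        using (SqlBulkCopy sb = new SqlBulkCopy(con))
        using (XmlReader reader = XmlReader.Create(xmlPath))
        {
            sb.DestinationTableName = "managedObject";
            foreach (DataColumn c in buffer.Columns)
                sb.ColumnMappings.Add(c.ColumnName, c.ColumnName);
            sb.BatchSize = 1000;
            sb.BulkCopyTimeout = 10800;

            reader.MoveToContent();
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "managedObject")
                {
                    // Materialise only this one element; ReadFrom also
                    // advances the reader past it.
                    XElement mo = (XElement)XNode.ReadFrom(reader);
                    buffer.Rows.Add(
                        (string)mo.Attribute("class"),
                        (string)mo.Attribute("version"),
                        (string)mo.Attribute("distName"),
                        (string)mo.Attribute("id"));

                    if (buffer.Rows.Count >= 5000)  // the "5,000 or so" chunk
                    {
                        sb.WriteToServer(buffer);
                        buffer.Clear();
                    }
                }
                else
                {
                    reader.Read();
                }
            }

            if (buffer.Rows.Count > 0)
                sb.WriteToServer(buffer);           // flush the final partial chunk
        }
    }
}

This keeps memory flat because only one managedObject element exists in memory at a time, regardless of how large the file is.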
 
