public static void BulkUpdate(string col, System.IO.StreamReader sr, string tableName)
{
    ConnStr = ConfigurationSettings.AppSettings["ConnStrAffiliate"].Trim();
    SqlConnection myConn = DAO.fetchConnection();   // note: myConn is never used

    // Build a DataTable whose columns match the tab-separated header.
    string[] value = col.Split(new char[] { '\t' }, StringSplitOptions.RemoveEmptyEntries);
    DataTable dt = new DataTable();
    DataRow row;
    foreach (string dc in value)
    {
        dt.Columns.Add(new DataColumn(dc, typeof(string)));
    }

    // Read the whole file into the DataTable, one row per line.
    while (!sr.EndOfStream)
    {
        value = sr.ReadLine().Split(new char[] { '\t' });

        // Drop the trailing empty field left by the tab at the end of each line.
        List<string> valueList = value.ToList();
        valueList.RemoveAt(valueList.Count - 1);
        value = valueList.ToArray();

        if (value.Length == dt.Columns.Count)
        {
            row = dt.NewRow();
            row.ItemArray = value;
            dt.Rows.Add(row);
        }
    }

    SqlBulkCopy bc = new SqlBulkCopy(ConnStr, SqlBulkCopyOptions.TableLock);
    bc.DestinationTableName = tableName;
    bc.BatchSize = dt.Rows.Count;
    bc.WriteToServer(dt);
    bc.Close();
}

1 solution

It's difficult to be certain without running the code, but it generates a lot of small allocations and probably gives the GC a fair amount of work: each line of text produces a string, an array of strings, a List built from that array, another array of strings, and a DataRow.
However, I suspect this is not the cause of the "hang" if it works with a 100 MB file.

Try doing it in stages: say 500 lines - bulk update, another 500 lines - bulk update, and so forth. At the moment the "hang" could be in any part of the application - even in SQL Server itself rather than in your code.
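Something along these lines would do it in stages. This is only a sketch: the names `ReadBatches` and `BulkUpdateInBatches`, the batch size of 500, and passing the connection string in as a parameter (instead of reading it from config) are all my choices, not code from the question.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class BatchImport
{
    // Read tab-delimited lines and yield them as DataTables of at most
    // batchSize rows, so no single table ever holds the whole file.
    public static IEnumerable<DataTable> ReadBatches(
        TextReader reader, string[] columns, int batchSize)
    {
        DataTable dt = NewTable(columns);
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // TrimEnd removes the trailing tab that the original code
            // stripped via the List round-trip.
            string[] fields = line.TrimEnd('\t').Split('\t');
            if (fields.Length == columns.Length)
                dt.Rows.Add(fields);

            if (dt.Rows.Count >= batchSize)
            {
                yield return dt;
                dt = NewTable(columns);
            }
        }
        if (dt.Rows.Count > 0)
            yield return dt;            // flush the final partial batch
    }

    static DataTable NewTable(string[] columns)
    {
        var dt = new DataTable();
        foreach (string c in columns)
            dt.Columns.Add(c, typeof(string));
        return dt;
    }

    // Push each batch to SQL Server, disposing as we go so the GC can
    // reclaim each chunk before the next one is read.
    public static void BulkUpdateInBatches(string col, StreamReader sr,
                                           string connStr, string tableName)
    {
        string[] columns = col.Split(new[] { '\t' },
                                     StringSplitOptions.RemoveEmptyEntries);
        foreach (DataTable batch in ReadBatches(sr, columns, 500))
        {
            using (var bc = new SqlBulkCopy(connStr, SqlBulkCopyOptions.TableLock))
            {
                bc.DestinationTableName = tableName;
                bc.WriteToServer(batch);
            }   // "using" disposes the SqlBulkCopy even if WriteToServer throws
            batch.Dispose();
        }
    }
}
```

The point is that only one small DataTable is alive at a time, so memory use stays flat no matter how big the input file is.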

You are doing this in a separate thread, aren't you?
 
 
Comments
Tirthankar Dutta 11-Feb-12 3:03am    
I am running it in the same thread. Do you have any code snippet to read the first 500 lines, insert them into the database, free the memory, and then read the next 500 lines?
Tirthankar Dutta 11-Feb-12 3:04am    
Also, the computer shows a low virtual memory warning after 10-12 minutes.
OriginalGriff 11-Feb-12 3:19am    
Yes. That's your problem!

Firstly, move the processing into a background thread - do not under any circumstances do it in the main UI thread as it will hang your PC - or at least appear to.

Second, as I suggested, do it in chunks - you need to free up used memory (via the garbage collector), which it can't do while all of your text is still loaded into DataRows that are in use. It may also pay to Dispose of items manually when you are finished with them - the arrays, List, and DataRows cannot be disposed, but the DataTable can, once you have written it to SQL Server.
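Since a snippet was asked for: a minimal sketch of moving the work onto a worker thread. `ImportWorker`, `ProcessFile`, and the batch counts are illustrative stand-ins for the real batched import loop, not code from the question; in a WinForms app of this era a `BackgroundWorker` would serve equally well.

```csharp
using System;
using System.Threading;

public static class ImportWorker
{
    // Stand-in for the real import: in the actual application this would
    // be the batched BulkUpdate loop; here it just counts rows so the
    // threading pattern is visible on its own.
    public static int ProcessFile(int batches, int batchSize)
    {
        int rows = 0;
        for (int i = 0; i < batches; i++)
            rows += batchSize;           // one SqlBulkCopy call per batch
        return rows;
    }

    // Run the import on a worker thread so the UI thread stays responsive.
    public static int RunInBackground()
    {
        int imported = 0;
        var worker = new Thread(() => imported = ProcessFile(4, 500))
        {
            IsBackground = true          // don't keep the process alive
        };
        worker.Start();
        // ... in a real app the UI keeps pumping messages here ...
        worker.Join();                   // in a real app, signal completion
                                         // (e.g. Control.Invoke) instead
        return imported;
    }
}
```

In a real form you would not `Join` (that would block the UI again); instead, marshal a "done" notification back to the UI thread when the worker finishes.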

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


