Hi,
I am writing a C# Windows Forms application that appends to a text file while a data reader loops. Consider this situation:
An Excel file receives a live broadcast, so its data is continuously updated. I want to save that data to a text file. I know some ticks will be missed because reading the file takes time, so I want to know the most efficient way.
The Excel sheet has 500 or more rows.
Now I run a SELECT query against the Excel file and fetch the data with a data reader, like this:
private void ReadWriteData()
{
    OleDbCommand cmd = new OleDbCommand(query);
    OleDbDataReader dr = cmd.ExecuteReader();
    while (dr.Read())
    {
        // process each row here
    }
    dr.Close();
}

ReadWriteData() is called over and over by a BackgroundWorker (as soon as it finishes, it is called again). The faster it can be called, the more data changes (ticks) I can capture.

The output in the text file looks like this:
1-50-100
2-30-200
3-70-50
.
.
.
1-30-100
2-25-300
3-71-49
.
.

Here 1, 2, 3 are symbol codes, and values like 30 and 100 are the data, which fluctuate.
After reading and writing rows 1 to 500, the program starts reading again from row 1.

1) I can do it like this:
C#
<pre lang="C#">
while (dr.Read())
{
    using (FileStream fs = new FileStream(fileName, FileMode.Append, FileAccess.Write))
    using (StreamWriter sw = new StreamWriter(fs))
    {
        sw.WriteLine(dr[0].ToString() + "-" + dr[1].ToString());
    }
}
</pre>


2) Or like this:

C#
<pre lang="C#">
string s = "";
while (dr.Read())
{
    s += dr[0].ToString() + "-" + dr[1].ToString() + Environment.NewLine;
}
dr.Close();
File.WriteAllText(path, s);
</pre>
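(A third variant occurred to me as a middle ground between the two options above: open the StreamWriter once per pass instead of once per row, so each pass avoids both the per-row file-open cost of option 1 and holding the whole pass in a string as in option 2. This is only a sketch; the connection, query, and path names are placeholders standing in for the ones in my real code.)

```csharp
using System.Data.OleDb;
using System.IO;

class TickLogger
{
    // Sketch only: one file open per pass, one line written per row.
    // "conn", "query", and "path" are placeholder names.
    public static void ReadWriteData(OleDbConnection conn, string query, string path)
    {
        using (OleDbCommand cmd = new OleDbCommand(query, conn))
        using (OleDbDataReader dr = cmd.ExecuteReader())
        using (StreamWriter sw = new StreamWriter(path, append: true))
        {
            while (dr.Read())
            {
                // One "symbol-value" line per row, as in the sample output.
                sw.WriteLine(dr[0].ToString() + "-" + dr[1].ToString());
            }
        }
    }
}
```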



Please help me decide which is best in my scenario, or suggest something new entirely. Keep in mind that my goal is saving broadcast data from a large Excel file to a text file, so the more ticks (data moving up and down) I can capture, the better.

1 solution

Try just:
C#
File.AppendAllText(filename, string.Format("{0}-{1}\n", dr[0], dr[1]));

But... I don't know that it will work as you have it, since the Read loop will end when it runs out of rows to feed you - even if more will be arriving at the source soon. Additionally, I'm not sure that the use of a reader won't "lock" the Excel file, preventing the updates from being written to it - which could be why you lose data.
I'd probably use a DataAdapter and a DataTable to read the whole thing as fast as possible in a single operation, then process that into your text file before repeating the operation.
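That might look something like this - a sketch only, since I haven't seen your query; the connection string and sheet name ([Sheet1$]) are placeholders you'd replace with your own:

```csharp
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Text;

class SnapshotWriter
{
    // Sketch: snapshot the whole sheet into a DataTable in one Fill()
    // call, then format the rows into the text file in one append.
    public static void CaptureOnce(string connectionString, string path)
    {
        DataTable table = new DataTable();
        using (OleDbDataAdapter da = new OleDbDataAdapter(
            "SELECT * FROM [Sheet1$]", connectionString))
        {
            da.Fill(table);   // reads the entire sheet in a single operation
        }

        StringBuilder sb = new StringBuilder();
        foreach (DataRow row in table.Rows)
        {
            // Same "symbol-value" format as the sample output.
            sb.Append(row[0]).Append('-').Append(row[1]).AppendLine();
        }
        File.AppendAllText(path, sb.ToString());
    }
}
```

The idea is that the reader touches the Excel file for as short a time as possible; the formatting and disk writes then happen against your own in-memory copy.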
 
Comments
souvikcode 9-Dec-15 4:14am
1) Yes, you are right about what happens if more rows arrive at the source, but the row count is fixed. If I set it to 1000, it will stay 1000.
2) The Excel file itself is created by ODIN and receives the NSE broadcast. A macro updates the data in Excel; I am not updating it manually. And I have tested that 500 rows can be read in a second without locking the Excel file.

I want suggestions that may differ from my code. If you say a DataTable is fast, I will try that too. You mean fetch all the data into a DataTable with an adapter and then loop through it, right? I expect a DataReader to be faster than that operation, but maybe I am wrong; I will try your suggestion as well.

Thanks.
souvikcode 9-Dec-15 4:17am
Which do you recommend: calling AppendAllText() on each iteration of the data reader loop, or what I have done (building a string first, then writing it all at once)?

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
