Hello,

In my application, when I try to fill a DataSet's table with a huge number of records (approx. 2 lakh, i.e. 200,000), it throws an OutOfMemoryException.

I think it is due to growing virtual memory usage (the commit size keeps increasing in Task Manager).

So how can I prevent this memory leak?

Thanks in advance.

Regards,
Yogesh

[edit]Lakh - Wikipedia, the free encyclopedia: A lakh or lac is a unit in the Indian numbering system equal to one hundred thousand -OriginalGriff[/edit]
Comments
Patrick Kalkman 5-May-11 10:09am    
Why would you want to load such a large number of records into a data table? If you are going to filter the data, you should use the database query engine. If you intend to show it to a user, implement paging on the SQL Server side.

Do you really need all those records at once?
Would it not be better to page them, or filter them at source rather than trying to transfer that many records in one go?

Think about it: if you are filling up your PC memory trying to stuff that much data into a DataTable, think how much bandwidth you will be using on the network link to the server. Now imagine there are 10 of you, trying to do the same thing. At the same time. Or imagine when your boss says: "Right, move that database onto the Web..."
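To make the paging suggestion concrete, here is a minimal sketch (assuming C# with System.Data.SqlClient, SQL Server 2005 or later, and a hypothetical Orders table keyed by OrderId - adapt the names to your schema). Each call fetches a single page, so the client never holds more than pageSize rows:

// Server-side paging sketch: ROW_NUMBER() numbers the rows on the server,
// and only the requested page ever crosses the network.
using System.Data;
using System.Data.SqlClient;

static DataTable FetchPage(string connectionString, int pageIndex, int pageSize)
{
    const string sql = @"
        SELECT OrderId, CustomerName, Total
        FROM (SELECT OrderId, CustomerName, Total,
                     ROW_NUMBER() OVER (ORDER BY OrderId) AS RowNum
              FROM Orders) AS Numbered
        WHERE RowNum BETWEEN @First AND @Last;";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    using (var adapter = new SqlDataAdapter(command))
    {
        command.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
        command.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);

        var page = new DataTable("OrdersPage");
        adapter.Fill(page);   // Fill opens and closes the connection itself
        return page;
    }
}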
 
Comments
NuttingCDEF 5-May-11 11:48am    
Agree entirely - my 5.

It's like spreadsheets, which don't scale well because you load the entire thing into memory at once - whereas databases are adept at loading only what you need at that instant and processing large volumes of data one chunk at a time rather than in one huge block.
try.test.abc 6-May-11 5:49am    
Hello,
Thanks for the answers. I have already optimized the query to select only the required fields with the required WHERE condition, but it still needs 50,000 to 2 lakh (200,000) records in the dataset at a time.

I am working on a Windows application and use a dataset to feed a Crystal Report; it is while filling that dataset with the report data that I get the memory error.

What can I do?
Apart from the above problem, because my software's client database is so large, I have to fetch at least 25,000 to 1 lakh records everywhere, out of 6-7 lakh records in total.

So is there any way to reduce the virtual memory usage?

Regards,
Yogesh
S/W Developer
NuttingCDEF 6-May-11 6:08am    
If you need to process 200,000 records at a time - and the size of an individual record is such that that many records take more memory (physical or virtual) than you have available - you only have two options:

1. Add more memory to your hardware / increase the size of the page file (though things may still run slowly with a lot of disk activity as you may end up thrashing the page file), or

2. Redesign your code so that you do not have to hold as many records in memory simultaneously.

Of these, 2. is likely to be preferable. With 1., you will almost certainly hit problems at some point if the database grows / the number of records increases - performance will suffer as you make more use of the page file, and eventually you hit a point where you are back to the out-of-memory problems.

To clarify one thing - I thought 1 lakh was 100,000 - your comments suggest you are using 1 lakh as 25,000. Can you confirm what you mean by 1 lakh?

You have already stated that you are fetching 1 lakh records from a database of 6-7 lakh. So you are already only fetching a proportion. Can you not subdivide this further? If so, that would almost certainly help.

You have also (I think - but you might want to clarify this) said that due to heavy use of the database you already have to fetch at least 1 lakh records. This feels wrong - if a database is large / heavily used, surely you want to structure your code to perform more modest-sized queries rather than a few huge ones - the server itself may struggle more if it has to handle big queries that tie up a lot of server resources while they run.

In that context you might also consider whether you have appropriate indices / relationships defined in the database, so that WHERE / ORDER BY clauses in your queries are more efficient, which in turn facilitates optimising modest-sized queries to improve speed / reduce server load, as in the sketch below.
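To illustrate the indexing point (a hypothetical example - the right table and columns depend entirely on your schema and on the report query), an index whose columns match the WHERE / ORDER BY lets SQL Server seek rather than scan:

// Hypothetical one-off helper: adds an index covering the report query's
// WHERE / ORDER BY columns (assumed names) so the server can seek, not scan.
using System.Data.SqlClient;

static void CreateReportIndex(string connectionString)
{
    const string sql =
        "CREATE INDEX IX_Orders_CustomerId_OrderDate " +
        "ON Orders (CustomerId, OrderDate)";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        command.ExecuteNonQuery();
    }
}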

I'm not an expert on them, but would be surprised if using Crystal Reports changed any of this very much.

If you need further help, you might usefully post the relevant code so that someone can suggest ways to improve / restructure it.

Hope this helps.
OriginalGriff 6-May-11 6:04am    
Jeremy has given you some suggestions in his answer, which may help a little bit, but the only real solution that is guaranteed to work in the field when you don't know how many records will actually be involved is to reduce the number of records you retrieve at a time.
There are a few methods which allow you to page the number of records you retrieve (depending on your data source) - look into them; one is sketched below. It is more coding for you, but it will give a smoother result for the user and, better still, it means your application should actually work!

Sorry, but sometimes you have to submit to the real world conditions and do the job the hard way.
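For instance, ADO.NET's DbDataAdapter exposes a Fill overload that takes a start record and a maximum record count - a quick way to page on the client (a hedged sketch with assumed table/column names; note the caveat in the comments):

// Client-side paging via Fill(dataSet, startRecord, maxRecords, srcTable).
// Caveat: the skipped rows are still streamed from the server and discarded,
// so this caps memory use but not bandwidth - server-side paging is better
// for large tables.
using System.Data;
using System.Data.SqlClient;

static DataSet FillPage(string connectionString, int pageIndex, int pageSize)
{
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter(
        "SELECT OrderId, CustomerName, Total FROM Orders ORDER BY OrderId",
        connection))
    {
        var dataSet = new DataSet();
        adapter.Fill(dataSet, pageIndex * pageSize, pageSize, "Orders");
        return dataSet;
    }
}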
I'd suggest not loading this much data in the first place if it can be avoided. Is there any reason you can't page or break up the data somehow?

The amount of memory available to a process is complicated, but it may be as low as 1.4 GB depending on your setup. Your records would only need to be about 8 KB each to hit this limit (assuming I've done my calcs correctly), so you may need to consider better memory management.
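As a rough sanity check on those numbers (assuming 1.4 GB and 200,000 rows): 1.4 GB is roughly 1.5 billion bytes, and 1,500,000,000 / 200,000 = 7,500 bytes, i.e. about 7-8 KB per row - and that is before counting the DataTable's own per-row overhead.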
 
First, you're pulling a pile of data back and storing it in memory, so this isn't really a memory leak; you're just using all the available memory.

Here are some suggestions for working around the error:
1. Reduce the number of rows you are bringing back. Chances are you only need a small subset of those 200k records.

2. Reduce the number of columns you are bringing back. Are you doing a SELECT * FROM table1, or are you pulling back only the columns you'll need?

3. Use a DataReader to populate custom data objects instead of using a DataTable. DataTables come with a lot of overhead.
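A minimal sketch of that third suggestion, with a hypothetical Order type and column names (the pattern is the point, not the exact code):

// Suggestion 3 sketched: stream rows with a SqlDataReader into lightweight
// objects instead of a DataTable. Each row still ends up in the list, but a
// plain object is far cheaper than a DataRow with its change-tracking overhead.
using System.Collections.Generic;
using System.Data.SqlClient;

class Order
{
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

static List<Order> LoadOrders(string connectionString)
{
    var orders = new List<Order>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT OrderId, CustomerName, Total FROM Orders", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())   // only one row is buffered at a time
            {
                orders.Add(new Order
                {
                    OrderId = reader.GetInt32(0),
                    CustomerName = reader.GetString(1),
                    Total = reader.GetDecimal(2)
                });
            }
        }
    }
    return orders;
}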
 
