My program is written in regular C in Visual Studio 2017 and runs on Windows 10.

It reads binary files into memory and extracts information from them. If I process a large number (over 100) of really large files (50-75 MB each), HeapAlloc eventually fails. For each file I do a HeapAlloc, process the file, and then do a HeapFree. For some reason the heap keeps growing until it hits the address-space limit for 32-bit applications.

I was under the impression that after a HeapFree, the next HeapAlloc would reuse the memory the heap had already committed. It appears that isn't the case.

What should I be doing differently?

What I have tried:

C
#include <windows.h>   // implied; the snippet uses Win32 APIs

// declarations implied by the snippet (not shown in the original post)
HANDLE  hHeap, hFile;
DWORD   dwFileSize, dwSizeHigh, dwBytesRead;
LPVOID  lpHeapData;
LPCTSTR lpszLibraryName;   // file name, set elsewhere

// do this once at the start
hHeap = GetProcessHeap();

// the following once per file

// open the file for reading
hFile = CreateFile(lpszLibraryName, GENERIC_READ, FILE_SHARE_READ, NULL,
                   OPEN_EXISTING, 0, NULL);

// low 32 bits of the size; dwSizeHigh receives the high 32 bits
dwFileSize = GetFileSize(hFile, &dwSizeHigh);

// zero-initialized buffer, one byte larger than the file
lpHeapData = HeapAlloc(hHeap, HEAP_ZERO_MEMORY, dwFileSize + 1);

// read the whole file into the buffer
ReadFile(hFile, lpHeapData, dwFileSize, &dwBytesRead, NULL);

... process the file data ...

// free the buffer, then close the file
HeapFree(hHeap, 0, lpHeapData);

CloseHandle(hFile);
Updated 26-Feb-19 11:15am
Comments
Richard MacCutchan 27-Feb-19 4:16am    
Is there any possibility that you are corrupting the heap somehow when you process the file data? BTW, why are you asking HeapAlloc to zero the memory, when you immediately overwrite it?

You should check the result from HeapFree() and GetLastError() if you want to get a better idea of what's going on.
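For example, a minimal sketch of that check, assuming the same hHeap and lpHeapData variables as in the question (FreeChecked is a hypothetical helper name):

C
#include <windows.h>
#include <stdio.h>

/* Hypothetical helper: free a heap block and report why it failed.
   A failing HeapFree is a strong hint the block was corrupted. */
static void FreeChecked(HANDLE hHeap, LPVOID lpMem)
{
    if (!HeapFree(hHeap, 0, lpMem))
    {
        /* GetLastError() returns the reason, e.g. ERROR_INVALID_PARAMETER */
        printf("HeapFree failed, error %lu\n", GetLastError());
    }
}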

You're also "closing the file" (an operation that itself accesses the heap) AFTER freeing the heap buffer, when you shouldn't be "touching" the heap anymore.
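That is, the suggested per-file cleanup order would be (reusing the variables from the question's snippet):

C
CloseHandle(hFile);              // close the file first ...
HeapFree(hHeap, 0, lpHeapData);  // ... then free the buffer last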

HeapFree function | Microsoft Docs
Memory allocated by HeapAlloc is not movable, so repeated allocation and freeing can fragment the heap.
Determine the largest file size among the 100 files, allocate a buffer that size only once, reuse it for all 100 files, and free it once at the end (see the sketch below).
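A minimal sketch of that pattern, with hypothetical helpers GetMaxFileSize() and ProcessOneFile() standing in for the poster's actual scanning and parsing code:

C
#include <windows.h>

DWORD GetMaxFileSize(void);                        /* hypothetical: largest file size */
void  ProcessOneFile(int i, LPVOID buf, DWORD cb); /* hypothetical: read + parse      */

/* Allocate one buffer sized for the largest file, reuse it for
   every file, and free it exactly once at the end. */
void ProcessAllFiles(int nFiles)
{
    HANDLE hHeap     = GetProcessHeap();
    DWORD  dwMaxSize = GetMaxFileSize();
    LPVOID lpBuffer  = HeapAlloc(hHeap, 0, dwMaxSize + 1);

    if (lpBuffer == NULL)
        return;                                 /* allocation failed */

    for (int i = 0; i < nFiles; i++)
        ProcessOneFile(i, lpBuffer, dwMaxSize); /* buffer is reused each time */

    HeapFree(hHeap, 0, lpBuffer);               /* single free, no churn */
}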
 
Comments
Rick York 26-Feb-19 18:00pm    
Good idea. Another possibility would be to keep a running maximum and only free/allocate when that maximum is exceeded. The maximum could be initialized with a very big value to minimize the reallocations.
Richard MacCutchan 27-Feb-19 3:48am    
But that is what the HeapReAlloc function is for.
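A grow-only buffer along those lines might look like this; a minimal sketch combining the running-maximum idea with HeapReAlloc (EnsureCapacity is a hypothetical name, not from the thread):

C
#include <windows.h>

/* Hypothetical grow-only buffer: reallocate only when a file is
   bigger than anything seen so far. HeapReAlloc may move the
   block, so the caller must adopt the returned pointer. */
static LPVOID EnsureCapacity(HANDLE hHeap, LPVOID lpBuf,
                             DWORD *pdwCapacity, DWORD dwNeeded)
{
    if (lpBuf == NULL)
    {
        *pdwCapacity = dwNeeded;
        return HeapAlloc(hHeap, 0, dwNeeded);
    }
    if (dwNeeded > *pdwCapacity)
    {
        LPVOID lpNew = HeapReAlloc(hHeap, 0, lpBuf, dwNeeded);
        if (lpNew == NULL)
            return NULL;        /* old block is still valid on failure */
        *pdwCapacity = dwNeeded;
        return lpNew;
    }
    return lpBuf;               /* existing buffer is already big enough */
}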
steveb 27-Feb-19 9:01am    
I think what he was doing is calling HeapAlloc and HeapFree 100 times, which will fragment the heap. What I suggested is that he does not need to allocate and free memory 100 times. Instead, allocate memory only once, large enough to fit the largest file, and then use that same memory to process all 100 files. HeapAlloc is notorious for fragmentation.
Richard MacCutchan 27-Feb-19 9:38am    
I have just run some tests allocating and freeing random buffers (between 5 MB and 500 MB) and it works fine.
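Such a test might look something like this (a sketch of the kind of loop described, not the actual test code):

C
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

/* Repeatedly allocate and free buffers of random size between
   5 MB and 500 MB and check whether any allocation fails. */
int main(void)
{
    HANDLE hHeap = GetProcessHeap();

    for (int i = 0; i < 1000; i++)
    {
        SIZE_T cb = (SIZE_T)(5 + rand() % 496) * 1024 * 1024;
        LPVOID p  = HeapAlloc(hHeap, 0, cb);

        if (p == NULL)
        {
            printf("HeapAlloc of %zu bytes failed at iteration %d\n", cb, i);
            return 1;
        }
        HeapFree(hHeap, 0, p);
    }
    printf("all allocations succeeded\n");
    return 0;
}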
Roland M Smith 27-Feb-19 10:33am    
I am allocating enough space to hold the largest file, processing all the files, and then freeing it once at the end. This works great!

I was processing about 160 files with a size range of 50 KB up to 77 MB.
