C++
#include "stdafx.h"
#include <iostream>
#include <vector>
#include "conio.h"
int main ()
{
	typedef struct
	{
		int iArr[10];
	}BIGSTRUCT;

	BIGSTRUCT st;
  std::vector<BIGSTRUCT> myvector;
  int ii = sizeof(BIGSTRUCT);
  std::cout << "max_size: " << (int) myvector.max_size() << '\n';
  _getch();
  return 0;
}


I am using the code above to find the maximum vector size I can allocate. If the size of iArr is 1, max_size() reports 1,073,741,823; but if I increase the size to 10, it reports 107,374,182.
http://stackoverflow.com/questions/4321040/c-vectors-of-objects-and-pointers?rq=1
http://stackoverflow.com/questions/3813124/c-vector-max-size
I am confused about a few things here:
1) Where does this number come from? How is it calculated?

Now, if I am on a 32-bit machine with 2 GB of RAM and I have a vector of, say, 170,000,000 members, what is the maximum size each member can have before I run out of memory?

Thanks

Quote:
Now, if I am on a 32-bit machine with 2 GB of RAM and I have a vector of, say, 170,000,000 members, what is the maximum size each member can have before I run out of memory?

First, the physical amount of RAM has nothing to do with it: a modern OS will page your memory contents in and out as needed. E.g., on Windows, that's what the pagefile is for!

That said, the maximum amount does depend on both your application and your OS: in theory, you can address a total of 4 GB on a 32-bit machine. However, that address space covers every object of your entire application! So if you've got a lot of other spacious objects lying around, this limits the amount of space available for your array accordingly. Second, your OS reserves part of your address space: by default, a 32-bit Windows process can use only the lower 2 GB for itself, with the upper 2 GB reserved for the kernel (3 GB of user space if the /3GB boot option is set and the executable is marked large-address-aware). Therefore your usable address space may be considerably smaller than you think.
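To put numbers on the scenario above, here is a minimal sketch of the address-space arithmetic, assuming the 2 GB default user address space and the 170,000,000 elements from the question; it ignores the vector's growth strategy and every other allocation in the process:

C++
#include <iostream>

int main()
{
    // Assumed budget: the 2 GB user address space of a default 32-bit
    // Windows process. Real headroom is smaller, because code, stack,
    // and other heap objects live in the same space.
    const unsigned long long userSpace = 2ULL * 1024 * 1024 * 1024;
    const unsigned long long elements  = 170000000ULL; // from the question

    // Upper bound on element size: roughly 12 bytes per element.
    std::cout << "max bytes per element: " << userSpace / elements << '\n';
    return 0;
}

And remember that push_back can temporarily need the old and the new buffer at the same time while the vector grows, so reserve() the final size up front if you really must get close to the limit.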

A word of advice: you should never strive to use up the maximum amount of memory available! If nothing else, you force the OS to constantly page memory in and out, which severely hampers every other application running on the same machine.

If you need to store that much data in memory, then you should really try to redesign your algorithm to use less memory! The usual way of working with big data is a database: it lets you load only the data needed to work on one record at a time. Or you can simply read the data record by record from a file, as sketched below.
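Here is a minimal sketch of the record-by-record approach; the file name records.bin and the fixed-size binary layout are assumptions made up for illustration:

C++
#include <fstream>
#include <iostream>

struct BIGSTRUCT
{
    int iArr[10];
};

int main()
{
    // Hypothetical data file holding BIGSTRUCT records back to back.
    std::ifstream in("records.bin", std::ios::binary);

    BIGSTRUCT rec;
    long long count = 0;
    while (in.read(reinterpret_cast<char*>(&rec), sizeof rec))
    {
        // Work on one record at a time; only sizeof(BIGSTRUCT) bytes
        // need to be resident instead of the whole data set.
        ++count;
    }
    std::cout << "processed " << count << " records\n";
    return 0;
}

Only one record is in memory at any time, no matter how large the file grows.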
 
Comments
nv3 22-Apr-13 17:10pm    
Very good answer, Stefan. My 5. Except for the statement that each thread gets its own address space. Each thread gets its own stack (which is part of the 2GB or 3GB total address space). Other than that, all threads of a process share the same memory. That's in fact the whole point about threads.
Stefan_Lang 23-Apr-13 2:41am    
Thanks. I removed that advice.
The calculation is based on the formula (for your case):
C++
max_size = (size_t)(-1) / sizeof(BIGSTRUCT);

On a 32-bit build, (size_t)(-1) is 4,294,967,295, the largest value a size_t can hold.
For a struct with 1 int (4 bytes) this yields 1,073,741,823.
For a struct with 10 ints (40 bytes) this yields 107,374,182.

Unfortunately it is not based on your actual machine configuration.
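A quick sketch to check that formula against the library's own answer; on the asker's build the two values coincide (that is where the numbers above come from), though other standard library implementations are free to compute max_size() differently:

C++
#include <cstddef>
#include <iostream>
#include <vector>

struct BIGSTRUCT
{
    int iArr[10];
};

int main()
{
    std::vector<BIGSTRUCT> v;
    const std::size_t byFormula = static_cast<std::size_t>(-1) / sizeof(BIGSTRUCT);

    std::cout << "formula:    " << byFormula << '\n';
    std::cout << "max_size(): " << v.max_size() << '\n';
    // Either way, max_size() is only a theoretical ceiling; it says
    // nothing about the memory actually available on this machine.
    return 0;
}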
 
When using the STL, you can never pre-calculate the exact maximum size of a collection of items; the best you can do is approximate. What follows is an estimate I am making here and now, with a few unstated assumptions. If you ask two other people, you might get five or more different answers; if you ask me again tomorrow, I might have a different estimate.

Note that std::vector<BIGSTRUCT> stores its elements by value, contiguously, in one heap block, so its per-element cost is essentially sizeof(BIGSTRUCT). The accounting below assumes you store pointers instead, i.e. std::vector<BIGSTRUCT*>, with each struct allocated separately on the heap: then each element costs a pointer, plus an instance of the struct, plus memory-management overhead.

On recent Windows machines:
For debug builds, this will be a minimum of sizeof(void*) + (sizeof(BIGSTRUCT) + 16) where the last item is rounded up to whatever the memory management "chunk size" is.
For release builds, this will be a minimum of sizeof(void*) + (sizeof(BIGSTRUCT) + 8) where the last item is rounded up to whatever the memory management "chunk size" is.

Both of the rounded values might be different (haven't checked lately).

The std::vector's internal array of pointers will be contiguous (the standard guarantees that a vector's storage is contiguous). The structs on the heap will most likely not be contiguous (it depends on how you create them).
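To make the two layouts concrete, here is a minimal sketch contrasting storage by value with storage by pointer; the exact allocator overhead discussed above is implementation-specific, so the sketch only prints the portably knowable parts:

C++
#include <iostream>
#include <vector>

struct BIGSTRUCT
{
    int iArr[10];
};

int main()
{
    // Layout 1: elements live contiguously inside the vector's buffer.
    std::vector<BIGSTRUCT> byValue(3);

    // Layout 2: the vector holds pointers; every struct is a separate
    // heap allocation, each carrying its own bookkeeping overhead.
    std::vector<BIGSTRUCT*> byPointer;
    for (int i = 0; i < 3; ++i)
        byPointer.push_back(new BIGSTRUCT());

    std::cout << "by value:   " << sizeof(BIGSTRUCT) << " bytes per element\n";
    std::cout << "by pointer: " << sizeof(BIGSTRUCT*) << " bytes per element"
              << " plus a heap block of at least " << sizeof(BIGSTRUCT)
              << " bytes plus allocator overhead\n";

    for (BIGSTRUCT* p : byPointer)
        delete p;
    return 0;
}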
 