The maximum array size depends on the size of the data you store and on the integer type available to index the elements.
So on a 32bit system, you can index at most 2^32 elements if you're lucky, which is a bit above 10^9. On a 64bit system, you can index 2^64 elements, which is a bit above 10^19. That is essentially the maximum array size. Being able to index that many elements does not mean you can actually get that much memory from the operating system, as the actual virtual address space may be much smaller. On 64bit Linux, a virtual address space of approx. 64 Terabytes is available per process, which is 2^46 bytes.
However, if you actually try to allocate this, you need that many bytes! So if you try to allocate an array of 2^30 long int elements, each probably 64 bits (8 bytes) in size, you need 8 Gigabytes of memory.
On a 32bit system, this is impossible. On a 64bit system, you need that much RAM plus swap space for it to work.
If you're on a 32bit system, or on a 64bit system without enough memory, you'll get an out-of-memory error, which is probably the reason for the behaviour you see.
If you instead try to create the array statically, in the .data section of your executable, the executable may end up being 8 GBytes large, where you could run into filesystem limits (FAT32, anyone?). The compiler will also probably choke on that amount of data (on 32bit, it'll probably crash).
If you're allocating on the stack (that is, as a fixed-size local variable array), you'll also run into the stack size limits that certain operating systems impose.
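A quick sketch of the usual workarounds (sizes and names here are illustrative): a large fixed-size array declared as a local variable lands on the stack and can blow a typical default limit of a few megabytes (`ulimit -s` on Linux), but the same array in static storage or on the heap is not subject to that limit.

```c
#include <stdlib.h>

enum { N = 1 << 21 };        /* 2M ints = 8 MB, around a typical stack limit */

/* `int big[N];` as a local variable would put 8 MB on the stack and
   likely crash with a stack overflow.  Two safer alternatives: */

static int big_static[N];    /* static storage: lives in .bss, not the stack */

int *make_big_heap(void) {
    return malloc(N * sizeof(int));  /* heap: limited by RAM/swap, not stack */
}
```

The static version is zero-initialized and costs nothing in the executable (.bss is not stored in the file); the heap version must be checked for `NULL` and freed by the caller.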