The new operator (or, for PODs, malloc/calloc) supports a simple and efficient way of failing when allocating large chunks of memory.
Say we have this:
const size_t sz = GetPotentiallyLargeBufferSize(); // 1M - 1000M
T* p = new (std::nothrow) T[sz];
if (!p) {
    return sorry_not_enough_mem_would_you_like_to_try_again;
}
...
Is there any such construct for the standard containers, or will I always have to handle an (expected!!) exception with std::vector and friends?
Would there maybe be a way to write a custom allocator that preallocates the memory and then pass this custom allocator to the vector, so that as long as the vector does not ask for more memory than you put into the allocator beforehand, it will not fail?
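Something along these lines is what I have in mind (just a rough sketch; the names Arena and ArenaAllocator are made up, and alignment/overflow handling is left out). The memory is grabbed once with new (std::nothrow), and the allocator only hands out pieces of that arena, so the vector cannot fail as long as it stays within that budget:

#include <cstddef>   // std::size_t
#include <new>       // std::nothrow, std::bad_alloc
#include <vector>

// Sketch only: Arena / ArenaAllocator are made-up names; alignment and
// overflow handling are omitted.
struct Arena {
    char* base = nullptr;
    std::size_t capacity = 0;
    std::size_t used = 0;

    bool init(std::size_t bytes) {                 // nothrow up-front acquisition
        base = static_cast<char*>(::operator new(bytes, std::nothrow));
        capacity = base ? bytes : 0;
        return base != nullptr;
    }
    ~Arena() { ::operator delete(base); }
};

template <typename T>
struct ArenaAllocator {
    using value_type = T;
    Arena* arena;                                  // non-owning, so copies are harmless

    explicit ArenaAllocator(Arena* a) : arena(a) {}
    template <typename U>
    ArenaAllocator(const ArenaAllocator<U>& other) : arena(other.arena) {}

    T* allocate(std::size_t n) {
        const std::size_t bytes = n * sizeof(T);
        if (bytes > arena->capacity - arena->used)
            throw std::bad_alloc();                // only if the arena itself is exhausted
        T* p = reinterpret_cast<T*>(arena->base + arena->used);
        arena->used += bytes;
        return p;
    }
    void deallocate(T*, std::size_t) noexcept {}   // bump arena: no per-block free
};

template <typename T, typename U>
bool operator==(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return a.arena == b.arena; }
template <typename T, typename U>
bool operator!=(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return !(a == b); }

Used like this, the failure check moves back to a simple test up front:

Arena arena;
if (!arena.init(sz * sizeof(T)))
    return sorry_not_enough_mem_would_you_like_to_try_again;
ArenaAllocator<T> alloc(&arena);
std::vector<T, ArenaAllocator<T>> v(alloc);
v.reserve(sz);   // served from the arena, so no fresh OS allocation that could fail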
Afterthought: What would really be needed is a member function bool std::vector::reserve(std::nothrow) {...}
in addition to the normal reserve function. But since that would only make sense if allocators were extended too, to allow for nothrow allocation, it just won't happen. Seems (nothrow) new is good for something after all :-)
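Until something like that exists, the closest I can get is a small wrapper of my own (a hypothetical try_reserve, not anything from the standard library) that translates the exception into a bool. Of course this only hides the exception rather than avoiding it, so it doesn't help with the debugger scenario described in the Edit below:

#include <cstddef>    // std::size_t
#include <new>        // std::bad_alloc
#include <stdexcept>  // std::length_error
#include <vector>

// Hypothetical helper, not part of the standard library.
template <typename T, typename Alloc>
bool try_reserve(std::vector<T, Alloc>& v, std::size_t n) {
    try {
        v.reserve(n);
        return true;                      // capacity() >= n now
    } catch (const std::bad_alloc&) {
        return false;                     // not enough memory
    } catch (const std::length_error&) {
        return false;                     // n > v.max_size()
    }
}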
Edit: As to why I'm even asking this:
I thought of this question while debugging (1st-chance / 2nd-chance exception handling in the debugger): if I've set my debugger to catch any bad_alloc first-chance because I'm testing for low-memory conditions, it would be annoying if it also caught those bad_alloc exceptions that are already well expected and handled in the code. It wasn't/isn't a really big problem, but it just occurred to me that the sermon goes that exceptions are for exceptional circumstances, and something I already expect to happen every odd call in the code is not exceptional.
If new (nothrow) has its legitimate uses, then a vector-nothrow-reserve would have them too.