No good reason as far as I know.
Someone just proposed a change to this a month ago. I encourage you to support it.
std::string is not the best example of well-done standardization. The version initially standardized was impossible to implement; the requirements placed on it were not consistent with each other. At some point that inconsistency was fixed.
In C++11 the rules were changed to rule out COW (copy-on-write) implementations, which broke the ABI of existing, reasonably compliant std::strings. That change may have been the point where the inconsistency was fixed; I do not recall.
Its API is different from the rest of std's containers because it didn't come from the same pre-std STL.
Treating this legacy behavior of std::string as some kind of reasoned decision that takes performance costs into account is not realistic. If any such testing was done, it was 20+ years ago on a non-standard-compliant std::string (because none could exist; the standard was inconsistent).
Passing (char const*)0 or nullptr continues to be UB due to inertia, and it will stay that way until someone makes a proposal and demonstrates that the cost is tiny while the benefit is not.
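As a minimal sketch of what that means in practice today (the helper name here is my own invention), callers have to guard against null themselves before handing the pointer to the constructor:

```cpp
#include <string>

// A caller-side guard: passing a null pointer straight to the
// char const* constructor is undefined behavior, so check first.
std::string to_string_or_empty(char const* p) {
    return p ? std::string(p) : std::string();
}

int main() {
    char const* maybe_null = nullptr;
    std::string ok = to_string_or_empty(maybe_null); // fine: empty string
    // std::string bad(maybe_null);                  // UB: constructor requires a valid
    //                                               // null-terminated string
}
```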
Constructing a std::string from a literal char const[N] is already a low-performance solution: you have the size of the string at compile time, you drop it on the ground, and then at runtime you walk the buffer to find the '\0' character (unless the compiler optimizes that away; and if it can, the null check is equally optimizable). The high-performance solution involves knowing the length and telling std::string about it instead of copying from a '\0'-terminated buffer.
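A short sketch of the difference, assuming a C++14 compiler for the ""s literal:

```cpp
#include <string>

int main() {
    char const lit[] = "hello world";

    // Low-performance path: the array length is known at compile time,
    // but this constructor discards it and re-scans for '\0' at runtime.
    std::string a(lit);

    // Passing the known length up front skips the scan.
    std::string b(lit, sizeof(lit) - 1);

    // Since C++14, the ""s literal does the same: the length comes from
    // the literal itself, with no runtime walk of the buffer.
    using namespace std::string_literals;
    auto c = "hello world"s;
}
```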
Fight with dragons too long and you become a dragon yourself; gaze too long into the abyss and the abyss gazes back into you…