I've seen the other similar questions and read the defect report about it, but I still don't get it. Why is i = ++i + 1 well-defined in C++11 when i = i++ + 1 is not? How does the standard make the first one well-defined?
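For concreteness, this is the kind of code I mean (a minimal example of my own; the comments just restate the premise of the question as I understand it):

    #include <iostream>

    int main() {
        int i = 0;
        i = ++i + 1;      // supposedly well-defined in C++11: i ends up as 2
        // i = i++ + 1;   // supposedly undefined behaviour in C++11, so left commented out
        std::cout << i << '\n';   // prints 2
    }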
By my working out, I have the following sequenced-before graph (where an arrow represents the sequenced-before relationship and every node is a value computation unless otherwise specified):
i = ++i + 1
     ^
     |
assignment (side effect on i)
  ^           ^
  |           |
 ☆i         ++i + 1
             ||   ^
            i+=1  |
             ^    1
             |
 ★assignment (side effect on i)
    ^       ^
    |       |
    i       1
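One note on the graph before going on: I've expanded the value computation of ++i into i += 1 because, as far as I can tell, the standard defines ++i to be equivalent to i += 1 for a non-bool operand, and both yield an lvalue referring to i. A tiny sketch of the equivalence I'm assuming:

    int main() {
        int i = 0;
        int& r1 = ++i;        // ++i yields an lvalue referring to i (i is now 1)
        int& r2 = (i += 1);   // i += 1 also yields an lvalue referring to i (i is now 2)
        // r1 and r2 both alias i, which is why the graph treats ++i as i += 1
    }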
I've marked the side effect on i with a black star (★) and the value computation of i with a white star (☆). These appear to be unsequenced with respect to each other (according to my logic). And the standard says:
If a side effect on a scalar object is unsequenced relative to either another side effect on the same scalar object or a value computation using the value of the same scalar object, the behavior is undefined.
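To check that I'm reading that rule correctly, this is the sort of code I understand it to forbid (my own example, not one taken from the standard):

    int main() {
        int i = 0;
        int j = i + i++;   // undefined behaviour as I read the rule: the value computation
                           // of the left-hand i is unsequenced relative to the side effect
                           // of i++ on the same scalar object
        (void)j;
    }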
The explanation in the defect report didn't help me understand. What does the lvalue-to-rvalue conversion have to do with anything? What have I gotten wrong?