As the author of the slides, I'll try to clarify.

If you write code that explicitly allocates a `Derived` instance with `new` and destroys it through a base class pointer with `delete`, then you need to define a virtual destructor; otherwise you end up destroying the `Derived` instance only incompletely. However, I recommend abstaining from `new` and `delete` completely and using `shared_ptr` exclusively to refer to heap-allocated polymorphic objects, like

`shared_ptr<Base> pb = make_shared<Derived>();`
This way, the shared pointer keeps track of the original destructor to be used, even if a `shared_ptr<Base>` is used to represent the object. Once the last referring `shared_ptr` goes out of scope or is reset, `~Derived()` will be called and the memory released. Therefore, you don't need to make `~Base()` virtual.
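A minimal sketch demonstrating this (the class names and the printed output are just for illustration):

```cpp
#include <iostream>
#include <memory>

struct Base {
    ~Base() { std::cout << "~Base\n"; }        // deliberately NOT virtual
};

struct Derived : Base {
    ~Derived() { std::cout << "~Derived\n"; }
};

int main() {
    std::shared_ptr<Base> pb = std::make_shared<Derived>();
    // When the last shared_ptr goes away, the deleter stored by
    // make_shared<Derived>() destroys the object as a Derived:
    // the output is "~Derived" followed by "~Base".
}
```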
`unique_ptr<Base>` and `make_unique<Derived>` do not provide this feature, because they lack the deleter mechanics of `shared_ptr`: a unique pointer is much simpler, aims for the lowest overhead, and therefore does not store the extra function pointer needed for the deleter. With `unique_ptr` the deleter is part of the type, so a `unique_ptr` with a deleter referring to `~Derived` would not be compatible with a `unique_ptr<Base>` using the default deleter; that default deleter would be wrong for a derived instance anyway, if `~Base` wasn't virtual.
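The following sketch shows what that means in practice; the hand-written deleter is just one way to illustrate how the deleter becomes part of the `unique_ptr` type:

```cpp
#include <memory>

struct Base { /* no virtual destructor */ };
struct Derived : Base {};

int main() {
    // The default deleter of unique_ptr<Base> would call ~Base() through a
    // Base*, which is undefined behavior for a Derived object when ~Base()
    // is not virtual:
    // std::unique_ptr<Base> bad = std::make_unique<Derived>();  // don't do this

    // A deleter that destroys the object as a Derived becomes part of the type:
    auto derived_deleter = [](Base* b) { delete static_cast<Derived*>(b); };
    std::unique_ptr<Base, decltype(derived_deleter)> pb{new Derived{}, derived_deleter};

    // pb has a different type than std::unique_ptr<Base>, so unlike
    // shared_ptr<Base> there is no single unique_ptr<Base> type that
    // remembers the correct destructor for you.
}
```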
The individual suggestions I make are meant to be easy to follow and to be followed all together. They try to produce simpler code by letting all resource management be done by library components and compiler-generated code.
Defining a (virtual) destructor in a class will prohibit the compiler-provided move constructor/assignment operator and might also prohibit the compiler-provided copy constructor/assignment operator in future versions of C++. Resurrecting them has become easy with `=default`, but it still looks like a lot of boilerplate code. And the best code is the code you don't have to write, because it cannot be wrong (I know there are still exceptions to that rule).
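For illustration, here is a sketch of the boilerplate a user-declared destructor drags in, next to a Rule-of-Zero class (the `Widget` names are hypothetical):

```cpp
#include <string>

// A user-declared (virtual) destructor suppresses the implicit move
// operations, so they (and, to be future-proof, the copy operations)
// have to be resurrected by hand:
struct Widget {
    virtual ~Widget() = default;
    Widget() = default;
    Widget(Widget const&) = default;
    Widget& operator=(Widget const&) = default;
    Widget(Widget&&) = default;
    Widget& operator=(Widget&&) = default;
    std::string name;
};

// Rule of Zero: declare none of the special member functions and the
// compiler provides all of them.
struct SimpleWidget {
    std::string name;
};
```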
To summarize "Don't define a (virtual) destructor" as a corollary to my "Rule of Zero": whenever you design a polymorphic (OO) class hierarchy in modern C++ and want or need to allocate its instances on the heap and access them through a base class pointer, use `make_shared<Derived>()` to instantiate them and `shared_ptr<Base>` to keep them around. This allows you to keep the "Rule of Zero".
This doesn't mean you must allocate all polymorphic objects on the heap. For example, a function taking a `Base&` parameter can be called with a local `Derived` variable without problems, and it will behave polymorphically with respect to the virtual member functions of `Base`.
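A minimal sketch of that case (the `greet` function and the virtual member function are just illustrative):

```cpp
#include <iostream>

struct Base {
    virtual void hello() const { std::cout << "Base\n"; }
    // no destructor declared: Rule of Zero
};

struct Derived : Base {
    void hello() const override { std::cout << "Derived\n"; }
};

void greet(Base const& b) {
    b.hello();   // virtual dispatch: prints "Derived" for a Derived argument
}

int main() {
    Derived d;   // local variable, no heap allocation
    greet(d);    // behaves polymorphically; no virtual destructor needed,
                 // because d is destroyed as a Derived at the end of main()
}
```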
In my opinion, dynamic OO polymorphism is heavily overused in many systems. We shouldn't program like Java when we use C++, unless we have a problem where dynamic polymorphism with heap-allocated objects is the right solution.