The paper's next move is to link to three proposals -- from 2022, 2021, and 2015 -- about how to use C++ for type and resource safety, and then to the C++ Core Guidelines themselves.
Speed-summary.
2022
The 2022 paper lays out a bunch of alternatives and then goes all-in on the Core Guidelines as THE solution. The whole thing is built around one idea: "just develop the checkers more, slap them on top of existing code, and keep using the compatibility story to push C++ into more places".
It mentions other ways of maintaining type safety, but it boils down to "use the RAII we have and combine it with lots of Core Guidelines rules to get what we want". I mean, RAII is wonderful, and that's fair, but you know. Again.
The existence proof of Google's work makes it a little harder to believe this'll do the trick completely.
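For the unfamiliar, here's roughly what that pitch cashes out to in code -- a sketch of mine, not something from the paper. The clang-tidy check name is the Core Guidelines one I remember for owning raw pointers; treat the exact diagnostic as an assumption:

```cpp
// A sketch of the "RAII plus Core Guidelines checkers" pitch, boiled down.
#include <memory>
#include <string>
#include <utility>

struct Widget {
    explicit Widget(std::string n) : name(std::move(n)) {}
    std::string name;
};

// The "before" code the checkers are meant to catch: a raw owning pointer.
// clang-tidy's cppcoreguidelines-owning-memory check flags this kind of new/delete.
Widget* make_widget_old(const std::string& name) {
    return new Widget{name};  // every caller has to remember to delete, forever
}

// The "after" code the Guidelines want instead: ownership lives in the type.
std::unique_ptr<Widget> make_widget_new(const std::string& name) {
    return std::make_unique<Widget>(name);  // cleanup happens in the destructor
}

int main() {
    Widget* w = make_widget_old("old");
    delete w;  // one forgotten delete away from a leak, one extra away from a double-free

    auto w2 = make_widget_new("new");  // freed automatically at end of scope
    (void)w2;
}
```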
2021
This one very explicitly tries to lay out some rules for protecting programmers. But a lot of the advice is straight-up unactionable for the way C++ is written today, which suggests it's really geared towards people writing new code today rather than anyone trying to retrofit existing code without a mountain of refactoring. For example, Rule #2 from this paper is:
Every object is accessed according to the type with which it was defined. The language guarantees this except for preventable pointer misuses (see §4, §5), explicit casts, and type punning using unions. The CG have specific rules to enforce these language rules. Static analysis can prevent unsafe casting and unsafe uses of unions. Type-safe alternatives to unions, such as std::variant, are available. Casting is essential only for converting untyped data (bytes) into typed objects.
This is a pretty sensible rule until you start doing things in more complicated code bases or interfacing with C, where the name of the game is committing crimes with pointers -- void* or otherwise -- to get things done. Type punning is common, and std::variant only showed up in C++17 (much like std::string_view, which also showed up 10 years late by landing in C++17, alongside a hamstrung std::optional type -- see here for my usual rant on that).
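To make the union-versus-std::variant part concrete, here's a small sketch of mine (not from the paper) of the two worlds the rule is straddling:

```cpp
#include <cstdio>
#include <variant>

// How a lot of existing code actually does it: a tag plus a union, kept in
// sync by hand and by hope.
struct LegacyValue {
    enum class Kind { Int, Double } kind;
    union {
        int i;
        double d;
    };
};

double legacy_as_double(const LegacyValue& v) {
    // Read the wrong member and you're type punning; nothing stops you.
    return v.kind == LegacyValue::Kind::Int ? v.i : v.d;
}

// The post-C++17 answer the rule assumes you're free to use.
using ModernValue = std::variant<int, double>;

double modern_as_double(const ModernValue& v) {
    // The visitor has to handle every alternative, checked at compile time.
    return std::visit([](auto x) { return static_cast<double>(x); }, v);
}

int main() {
    LegacyValue lv;
    lv.kind = LegacyValue::Kind::Double;
    lv.d = 2.5;
    std::printf("%f\n", legacy_as_double(lv));

    ModernValue mv = 2.5;
    std::printf("%f\n", modern_as_double(mv));
}
```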
It also flies in the face of the whole "we take advantage of code that exists" story, because taking advantage of that code means interfacing with C++03, C++11, and C89-through-C17 code. All of that is raw pointers and crimes, especially since there were no move semantics at all before C++11. So if the whole point -- as these papers and the overarching paper keep saying -- is to keep compatibility alive and reuse old code, then you're going to keep working with a grafted-up monster.
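This is roughly what the glue work looks like on the C side of that compatibility story: a real C API (fopen/fclose) wrapped in RAII with a custom deleter. The wrapper and the file name are my sketch, not anything from the papers:

```cpp
#include <cstdio>
#include <memory>

// Custom deleter so unique_ptr knows how to hand the resource back to C.
struct FileCloser {
    void operator()(std::FILE* f) const noexcept {
        if (f) std::fclose(f);
    }
};

using FilePtr = std::unique_ptr<std::FILE, FileCloser>;

FilePtr open_file(const char* path, const char* mode) {
    return FilePtr{std::fopen(path, mode)};  // the raw pointer crosses the boundary exactly once
}

int main() {
    if (FilePtr f = open_file("data.txt", "r")) {  // hypothetical file name
        char buf[256];
        while (std::fgets(buf, sizeof buf, f.get())) {  // .get() leaks the raw pointer back out for every C call
            std::fputs(buf, stdout);
        }
    }  // fclose runs here no matter how we leave the block
}
```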
2015
This is actually the best paper of the three because it uses lots of examples to explain some serious problems with C and C++ code. It walks people through introducing RAII types to prevent use-after-free, double-free, and other large classes of problems. It still shills for the Core Guidelines eventually, but at least it's trying to help. But, again, it offers all of this in the context of migrating old code to new styles and idioms, and at the same time the tension of "you can keep using your old code" keeps rising up. So either we're all using old code or we're rewriting it to use the new stuff, which means that for all the compatibility we have, we're still rewriting large chunks of code to defend against bugs instead of just writing the code we want to write to make progress.
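The shape of the before-and-after it walks through is roughly this -- my recreation, not the paper's actual example:

```cpp
#include <cstddef>
#include <memory>
#include <utility>

// Before: manual ownership. The compiler-generated copy operations do a shallow
// copy of `data`, so two objects end up deleting the same buffer.
struct BufferOld {
    explicit BufferOld(std::size_t n) : data(new int[n]), size(n) {}
    ~BufferOld() { delete[] data; }
    int* data;
    std::size_t size;
};

// After: ownership moves into a type that already has the right rules, so the
// dangerous copies simply stop compiling.
struct BufferNew {
    explicit BufferNew(std::size_t n) : data(std::make_unique<int[]>(n)), size(n) {}
    std::unique_ptr<int[]> data;  // move-only
    std::size_t size;
};

int main() {
    BufferOld a(16);
    // BufferOld b = a;         // compiles fine; both destructors would then free
    //                          // the same buffer -- the double-free in question

    BufferNew c(16);
    // BufferNew d = c;         // doesn't compile: unique_ptr can't be copied
    BufferNew e = std::move(c);  // handing off ownership has to be explicit
    (void)a; (void)e;
}
```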
This is a consistent theme for this paper and many C++ papers: "compatibility is important", they keep saying, while deeply implying that old code needs to be rewritten in the new tools and idioms and that transition paths have to exist. At some point, I need to stop fucking with Old and Busted But Occasionally Reliable and start working on the New Hotness, so we end up with a Lovecraftian monstrosity as the matter-of-fact state of most long-lived C++ codebases.
Damned if you do, damned if you don't, I guess?

… Let’s talk about something that has haunted me for over a year now.
The Pasture