"Obvious but wrong. As Hungarian-born mathematician Abraham Wald explained at the time, if a plane makes it back safely even though it has, say, a bunch of bullet holes in its wings, it means that bullet holes in the wings aren't very dangerous. What you really want to do is armor up the areas that, on average, don't have any bullet holes. Why? Because planes with bullet holes in those places never made it back. That's why you don't see any bullet holes there on the ones that do return." (From http://www.motherjones.com/kevin-drum/2010/09/counterintuitive-world)
When considering software practice and process, I wonder if we make the same mistake, looking to armor the areas with the most bullet holes. Every time you have to maintain some buggy, annoying, difficult piece of software, you are patching up one of the planes that got home, that made it through. No one maintains the software that never made it through battle — the projects that flew on until they ran out of fuel, or that, due to navigator error, never turned around to head home to base. There are lots of failure modes, and most of them aren't contingent on the quality of the software engineering.