One of the biggest problems in programming, related mostly (but not exclusively) to OOP, is generalization. There are numerous running jokes about it on Twitter, like this `foo` function that is 100% future-proof:
```cpp
template <typename T, typename T1> virtual T foo(T1 t, ...) = 0;
```
There are many failed designs out there that suffer greatly from generalization. A fine example is a 3D renderer I once saw. It was written by a pro-OOP guy, and it was heavily templated, STLed, and smart-pointered. There were many [C]Material classes and various [C]Meshes, and everything was tightly design-patterned. So I asked the author to add simple vertex extrusion along the normal. He instantly refused, pointing out that he would have to add many additional classes for extended materials, shaders, and parameters. Object properties and internal tools would also require changes. He estimated it at 10 to 20 man-hours. I said: "Wait, it's one line in the shader plus one additional parameter. Why would it take so long?" And he replied: "My engine is designed to be easily extended, so such modifications require a lot of work."
This can be described as the vicious circle of generalization:
- We want the code to be as generalized as possible.
- We design everything future-proof and extendible.
- When a feature request arrives, we're doomed: we need to change a lot of code.
- Why?
- Because everything was designed to be as generalized as possible.
- goto 1;
And a visual version:
So, at least in my opinion: reject toxic design patterns and KISS instead. Small functional changes should not require big code changes. And every time you over-engineer something, another startup dies (and there aren't many left after all the singleton experiments).
An IGK presentation [PL only, unfortunately] about efficient C++ and DOD by Adam Sawicki and me.