Causes of security problems are legion. One highly pertinent problem in software development is called “over-engineering” – the creation of an over-complicated design or over-complicated code not justified by the complexity of the task at hand.
Often it comes as a result of the designer’s desire to show off, to demonstrate knowledge of all possible methods, patterns and tricks. Most of the time it impresses only people who are not deeply involved with the subject. Usually it annoys people who know software design and the subject matter. Slightly more often than always it results in overly complicated code full of security holes.
Of course, over-engineering is nothing new; it was popular even in the old times. The problem is, where the old mechanical contraptions were ridiculous and did not really work in real life, contemporary computer software, although built in about the same way, somehow manages to struggle through its tasks. The electronics and software design industries are the most heavily impacted by this illness, which has become an epidemic. Interestingly, this is one aspect where open source software is no different from commercial software – open source designers also try to impress their peers with design and code. That is how things are.
Over-engineering in software is not just big, it is omnipresent and omnipotent. The over-engineering disease has captured the software industry even more completely than advertising has captured broadcasting. The results are predictably sad and can be seen with an untrained eye. Morbidly obese code is generated by morbidly obese development tools, leading to a simply astonishing myriad of bugs, in software that fails to perform its most apparent task and is written with so much effort that writing the original in assembler would have been faster.
The implications are, of course, clear: complexity is bad for security, and bugs are bad for security. Mind you, the two interplay in ways that exacerbate the problem. To top it off, the software becomes unmaintainable in a number of ways, including the total impossibility of predicting the impact of even small changes on the overall system.
The correctness of a software implementation is hard to judge for non-trivial amounts of code. Software is one of the most complicated things ever invented by mankind, so when software gets more complex, we lose oversight and understanding of its inner workings. On large systems, it is nowadays usual that no one has a complete overview of how the system works. Over-engineered software has much more functionality than required, which complicates analysis and increases the attack surface. It tends to have functions that allow complex manipulation of parameters and tons of side effects. Compare building a reaping hook with building a combine harvester from the point of view of not harming any gophers.
Bugs proliferate in over-engineered software because of both the complexity and the sheer size of the code, which go hand in hand. We know by now that there are more bugs in higher-complexity software: there is a direct correlation between the size of the code, the complexity of the code, and the number of bugs potentially present there. Some of those bugs are going to be relevant for security. The bad news is that quite often what an attacker cannot achieve through a single bug can be achieved through a combination of bugs. A bug in one place could bring a system into an inconsistent state that allows an attack somewhere else. And the more complicated the software becomes, the more “interesting” combinations of bugs there are. Especially when the software is over-engineered, it allows a much wider range of functionality to be reached through security bugs.
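A minimal sketch of such a two-bug chain, in Java. The scenario and all names here are hypothetical, invented for illustration: neither bug alone looks exploitable, but a missing range check in one place creates an inconsistent state that a trusting calculation elsewhere turns into an attack.

```java
// Hypothetical two-bug chain (names and scenario are illustrative).
class Order {
    private final int price;
    private int discount;

    Order(int price) {
        this.price = price;
    }

    // Bug 1: no range check, so a negative or oversized discount can be
    // stored -- an inconsistent state that is harmless on its own.
    void setDiscount(int discount) {
        this.discount = discount;
    }

    // Bug 2: trusts that discount lies within [0, price], so the
    // inconsistent state from bug 1 produces a negative charge.
    int charge() {
        return price - discount;
    }
}
```

With `new Order(100)` and `setDiscount(150)`, `charge()` returns -50: the shop now owes the attacker money, even though each bug in isolation would likely pass a casual review.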
As for maintainability, an over-engineered system becomes harder and harder to maintain because it is not a straightforward implementation of a single concept. Anyone trying to update such software would have a hard time understanding the design and all the implications of the necessary changes. I once came across a serious problem where a developer had to subclass a class (in Java) to add additional security checks to one of its methods. By doing so, he actually ruined the complete access control system. That was unintended, of course, but the system was so complicated that it was not apparent until penetration testing revealed that one could now send any administrative command to the server without authenticating, and the server would happily comply. When the problem was discovered, it became apparent in hindsight, of course. However, the system was so complex that it required non-trivial amounts of effort to analyze the impact of changes. Any small change could easily turn into a disaster again.
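A sketch of how such an override can silently remove authentication. This is not the original system; the class and method names are hypothetical, chosen only to reproduce the failure mode: the subclass author adds the check he wanted but, by overriding the entry point without calling the base implementation, drops the authentication step entirely.

```java
// Hypothetical reconstruction of the failure mode described above.
class CommandHandler {
    protected boolean authenticated = false;

    // The base class enforces authentication before running a command.
    public String handle(String command) {
        if (!authenticated) {
            throw new SecurityException("not authenticated");
        }
        return run(command);
    }

    protected String run(String command) {
        return "executed: " + command;
    }
}

// The developer wanted an *additional* check, but by overriding handle()
// without calling super.handle(), the authentication step silently
// disappears: any caller can now run administrative commands.
class AuditedCommandHandler extends CommandHandler {
    @Override
    public String handle(String command) {
        audit(command);       // the intended extra security check
        return run(command);  // bug: the authentication step is skipped
    }

    private void audit(String command) {
        // extra logging the developer wanted to add
    }
}
```

In a simple system this mistake would jump out in review; in an over-engineered one, the distance between the override and the check it bypasses is exactly what hides it until penetration testing.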
Complex, non-maintainable code riddled with bugs becomes a security nightmare. The analysis of over-engineered software is not only expensive; sometimes it becomes simply impossible. Who would want to spend the next hundred years analyzing and fixing a piece of software? Yet that is what I had to estimate for some of the systems I came across. For such a system, there is no cure.
So, the next time you develop a system, follow the good old KISS principle: Keep It Simple, Stupid. If it worked for Kelly Johnson, it will work for you, too. When you maintain the code and make changes, try to bring the code size down and reduce functionality. Code size and complexity are directly correlated, so a decreasing KLOC count is a good indicator.