Sometimes the obvious needs to be stated so that it becomes obvious and allows us to think more clearly about the consequences of the obvious.
Try this post at AddRef, which makes the very clear case that classes are typically only available for re-use when compiled into a DLL or LIB, and that most classes get “lost” to future use by being compiled into executables. The author is stating an obvious truth about how we actually build and ship classes as developers. The consequence of that truth is that we have to recognise that, in practice, we “re-use” classes by copying and pasting them from one source code location to another – thus forking them rather than re-using them in the true sense. The author of the post is right in thinking that re-usable code should be drawn off into a library, even if that means a good many executables contain little more than the UI functionality – but that’s not what happens.
So next time you hear someone arguing the benefits of OOP (and maybe design patterns), probably citing code re-use as a prime example, tackle them on the architectural implications and how crucial those are to delivering the benefits. As far as I can see, OOP has delivered a lot less than was expected, and that may be partly a function of a larger design pattern than those typically encapsulated by a class.
Continuing the theme, you will probably enjoy a paper written by Brian Foote and Joseph Yoder on the most common design pattern of all – the one the paper’s authors titled “The Big Ball of Mud”. The authors make the case that the exclusion, deliberate or accidental, of a planned (and implemented) architecture from most systems is itself a pattern that reflects the reality of the software development environment. The paper is well worth reading – as are its conclusions. The chief conclusion is that developers build “Big Balls of Mud” because they work. I saw something else in the paper – to me the most interesting idea was that design/architecture is a project cost overhead that rarely makes the budget. This is compounded (remember Brooks of The Mythical Man-Month) by the fact that a good design often (maybe always) requires you to throw away your first effort and start again. Good design is expensive in time and money.
I was analysing a fairly typical Big Ball of Mud yesterday, with the purpose of documenting the key functionality in order to help a customer decide if they should replace the existing system – and if so – what manner of replacement might best be implemented. I have a funny feeling, though, that the best option might be to slap a little lipstick on the pig and keep this fifteen-year-old system running a while longer. You see, it’s not really broken at all and maintenance costs are tiny – usually associated with “cock-ups” in other systems, or with occasional small tweaks following the acquisition of other businesses and the contractual agreements inherited with them.
Corporate software is surprisingly long lived. It often outlives the original hardware and (as in this case in point) runs smoothly on as more recent generations of development tools and design methodologies have come and (in some cases) gone. I wish I could surmise that it is only systems with at least some past quality at their core that endure as (or to become) those “Big Muddy Balls”, but I can cite too many extant examples that should have been strangled at birth yet have lived on for decades – their replacement costs always exceeding the perceived costs of their inefficiencies.
I am not quite sure what all this is telling me. Is “good design” worth the effort, or is it just a way of engaging developers in their task? Do we overrate design as an attribute? Can I come up with a new development methodology that sets out to produce cheap and cheerful “Balls of Mud”, and thus make a killing in the big-name consultancy market? Maybe it’s just that Friday afternoon feeling again.