I stumbled upon the blog post "Is it important to write good code?" the other day, and grew more and more ill at ease as I realized that I preferred the original code, which the author was trying to ridicule, over his new "improved" object-oriented version. At first I guessed this was another manifestation of the "Worse is better" scenario - enhancements are often not worth the added complexity - but then I realized a more profound factor might be at work:
The original code is very good because it ... is small! It fits on a terminal screen, so a human being can read it at a glance, has fewer items to hold in short-term memory, and understands it easily because it follows a natural way of thinking: sentences using IF. This becomes obvious when you read the body of the blog post surrounding the code samples, where the author uses phrases such as "if I need this I do that". In plain English, the if statement is the best way to make people understand what you mean, and code that people understand is the best way to get debuggable and maintainable code.
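To make this concrete, here is a minimal sketch of the kind of code I mean - this is a hypothetical example of my own, not the code from the post, and the function name and policy are invented for illustration. The whole rule set reads like the "if I need this I do that" sentences it came from:

```python
def shipping_cost(order_total, is_member):
    """Hypothetical shipping policy, written as plain if statements.

    Each line maps directly to a sentence a human would say:
    "if they are a member, shipping is free", and so on.
    The entire policy fits on one screen.
    """
    if is_member:
        return 0.0
    if order_total >= 100:
        return 0.0
    if order_total >= 50:
        return 4.99
    return 9.99
```

No class hierarchy, no strategy objects: a reader can hold the whole thing in short-term memory and check it against the spoken rules in seconds.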
At that moment I noticed the quotation in the blog's header: "Good programmers write code that humans can understand".
Indeed.
PS: I know I am exaggerating the issues a bit, and that I am unfairly nitpicking Fredrik Normé, but from my personal experience, the two changes I notice most in my coding efficiency as I grow older are a decrease in my short-term memory capacity, and more and more typos where I mix up words with totally different meanings that sound the same - for instance writing "never" instead of "nether" - making me suspect that our natural way of thinking may be much more language-based than I imagined...