The Act of Programming Has Not Changed In Decades

November 26, 2011

I have basically programmed the same way since my first Basic program written on a teletype in high school back in 1974.

Sure, it sounds like I am a dinosaur out of touch with the modern programmer. To be clearer: the approach I take to programming is the same; what has changed are the tools, the languages, the platforms, the targets, basically everything but the way I work on code and look at the problem to be solved.

At my first professional job in 1981 I wrote in Fortran on a terminal. There were no windows, no mice, the Web was 10 years away, no email (a few executives had email but usually had their secretaries type it in), no OO programming, no bitmapped displays, no one had a personal computer on their desk, and there were few languages; even Computer Science was a strange concept to most people. I sometimes had to write code on paper since there were seven of us sharing the one terminal when I started. The "old guys" worked on batch applications in Fortran and JCL; they worked on many applications at once, submitted all their jobs late in the day, and got huge paper printouts back the next morning.

Yet despite all that antiqueness, the act of working out how to solve the problem or deliver the end result with acceptable quality and in a reasonable time is exactly the same as today. The tools, languages, frameworks, targets, etc. are all radically different and in fact have changed continuously since then. Even the knowledge is radically different. Yet I still write code, evaluate it, modify it, debug anything that goes wrong, try to plan for future changes as best I can, and try to get it finished in a decent time.

My coding philosophy when I started in high school was "Code a little, test a little, code a little more, test a little more," and that is 100% of how I do it today. 37 years later it still allows me to write quality code, solve complex problems and get things done. Additionally, I constantly look for new things to add to or take away from what I do, but the skeleton remains the same.
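To make the rhythm concrete, here is a minimal sketch of what that loop looks like in C; the little clamp function and its throwaway check are purely illustrative, not from any real project. Write a small piece, run a quick test against it, then move on to the next piece.

    #include <assert.h>
    #include <stdio.h>

    /* Code a little: a small, self-contained piece. */
    static int clamp(int value, int low, int high)
    {
        if (value < low)  return low;
        if (value > high) return high;
        return value;
    }

    /* Test a little: a quick throwaway check run right away,
       before writing the next piece. */
    int main(void)
    {
        assert(clamp(5, 0, 10) == 5);   /* inside the range      */
        assert(clamp(-3, 0, 10) == 0);  /* below the lower bound */
        assert(clamp(42, 0, 10) == 10); /* above the upper bound */
        printf("clamp looks right, on to the next piece\n");
        return 0;
    }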

Over the years I went from Basic to APL (a little) to Fortran to Jovial J73 to Assembly to Pascal to C to C-with-objects to C++ to Objective-C to Java/Javascript to PHP/Javascript to Ruby (a little) back to C/C++ then to C/C++/Objective-C again.

I went from a Teletype/Timeshare to Apple ][ to CP/M to IBM Mainframe to Harris Supermini to Apple III to Lisa to Mac for a long time to NeXTStep to a PC with various Windows to OS X to both to iOS.

Despite all those changes I still code like I did when I started on that Teletype, all in uppercase. I still refactor a lot (though for 20 years I had never heard of that word). I write dynamic tests along with the code (but not static ones unless it's a framework). I rarely design a lot before I start; over the years I learned that premature design is one root of all evil. Rarely do you know enough at the start to plan out the entire application unless it's in a domain you've done before. Even when a corporate waterfall process demands you get sign-off from a customer, the end result never looks like what was signed off unless you cram it down their throats. Another thing I learned early on that is still true today: customers don't know what they want, much less what they need, until they see it.

It does bother me that people insist that only the latest and greatest "one true way" of programming can work and everything else is doomed to failure, something I've heard repeatedly since I first started. I have been around long enough to know that quality over the past 30 years hasn't changed at all. Bugs, broken features, security failures, bad design, scalability issues, and maintainability problems still plague the world at about the same rate. Of course today's software is likely more complex and larger, more interactive both with people and with other systems, and has greater security concerns than 30 years ago or even 10 years ago. So there is a bigger playing field to worry about, which of course does require new ways of thinking. Over the years, however, I believe (though I don't know how to measure it) that the size of the application code people write has not grown that much, thanks to the availability of so many frameworks and systems we build upon. Applications may be bigger, with many more features, but that is balanced by better tools, languages and frameworks doing more for us.

I would say that although source code is more complex today, it isn't that much bigger on average. Of course we have new types of applications today besides native ones, such as web applications. A web application wasn't in anyone's mind when I started. How do you compare the complexity of something like Reddit to a Cobol insurance system from the 1960s? Or Google's search engine to the SABRE Reservation System from the same era?

The funny thing is that the most productive I have ever been as a programmer was using Borland's Turbo Pascal back in 1984 on an early IBM PC/XT. Today the computers are many orders of magnitude faster, but writing code isn't really any faster and the tools get in the way more. Of course I need to know orders of magnitude more information to write anything these days, so even with electronic documentation and Google (the single greatest programming implement ever devised) I can't write code as fast as I did in those early days.

The act of programming is not really going to change much until we write software that can do the programming better than people can. But that's another article.

OK, time to code a little and test a little...