The Evolution Of A Programmer

January 25, 2007

Over the years I have adopted the philosophy that I should stay ahead of, or at least at the edge of, the programming mainstream to ensure I never become obsolete. Over my career I have watched numerous people be forced to seek alternative employment as the technologies they singularly specialized in fell into obscurity.

The nature of programming is that the only thing that doesn't change is change itself. New languages, frameworks, methodologies and ideas appear constantly, more so than in almost any other profession. Although you can't learn even a fraction of them, you must keep up continuously or risk becoming useless. Yet going too far forward, or delving too much into alternatives, can also poison your resume and make finding a new position difficult (which, sadly, I am in the midst of).

Two friends who graduated around the same time as I did earned computer science degrees (I didn't); one specialized in Cobol and the other in mainframe systems programming. Over the years they worked continuously in these areas until both discovered their skills were no longer employable. During this time neither had learned anything new, and each had to switch to a new career path (one management, the other networking). Of course there is nothing wrong with finding a new career, but both were shocked when they realized their skill-set was obsolete. Learning new things is a must for any programmer these days.

I remember buying my first C compiler when everyone at the company I worked for (General Dynamics) was using Fortran and Jovial, and my own team used Pascal and assembly. Even at that time I read every tech magazine I could get my hands on (oh how much easier it is today). I learned of object-oriented programming from Byte (the famous issue on Smalltalk) and some promotional materials on Objective-C (or some precursor), and wrote my own object extensions to C. Once C++ was available I delved into that. When Java was released I picked it up too.
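For readers who never saw the pre-C++ era, "object extensions to C" usually amounted to structs carrying data plus function pointers standing in for methods. The snippet below is a minimal sketch in that spirit (not my original code, and the shape example is purely hypothetical):

```c
#include <stdio.h>

/* A struct acting as an "object": fields for data, a function
   pointer acting as a virtual method. */
typedef struct Shape {
    double width, height;
    double (*area)(const struct Shape *self);
} Shape;

static double rect_area(const Shape *self) {
    return self->width * self->height;
}

static double tri_area(const Shape *self) {
    return 0.5 * self->width * self->height;
}

int main(void) {
    Shape shapes[] = {
        { 3.0, 4.0, rect_area },
        { 3.0, 4.0, tri_area  },
    };
    /* Dispatch through the function pointer: C's poor man's polymorphism. */
    for (int i = 0; i < 2; i++)
        printf("area = %.1f\n", shapes[i].area(&shapes[i]));
    return 0;
}
```

It wasn't pretty, but it gave you late binding and a taste of objects years before C++ compilers were common.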

Of course, over the last 10 years or so you can't try everything anymore; you have to be more selective, as the pace of evolution has become incredible. No one can possibly learn, evaluate, and master even just the new languages like Ruby, Erlang, Haskell, Python, etc. (in many cases "new" really means newly popular in the last ten years; many languages are much older than you think). Even in the Java world the number of technologies appears almost infinite, and lately many technologies no longer rest in a single universe; witness things like JRuby and the many Ajax frameworks that work with multiple languages and server environments.

How the hell do you keep up today?

  • You can specialize in a few technologies (old or new) and hope for the best; I know lots of people who do this.
  • You can pick a broad set of technologies and hope you can find cutting-edge employers.
  • You can give up and do something else.

Sadly there is no perfect solution. If you choose a set of technologies (like standard J2EE) you are assured of a job for a period of time, but someday you will be "cobolized". If you pick cutting-edge stuff like Ruby, the pool of employers will be slim unless and until the technology becomes broadly accepted. Of course it may be a good choice regionally; you are more likely to find cutting-edge employment in Silicon Valley than in, say, the Dallas-Fort Worth area (sadly, where I live). You might learn a broad set of technologies, but then your resume looks like you don't really know anything (again, I resemble that a little too much), even though it really means you are able to learn anything your employer throws at you.

The real lesson here is that being a professional programmer as a career means you must learn something new almost every day. This is true in many professions, but in this one there is no alternative. If you don't keep up, the programming steamroller will eventually flatten you.

The trick is to stay at the cutting edge but still have a job. Ouch.