I Am Happy Not To Be A Web Developer Anymore
I wrote my first single-page web app in Javascript in 2005, right after learning about XMLHttpRequest and before any serious frameworks existed. I left professional web development behind around 2009 (I started in 1997 with WebObjects) and spent the final decade of my career doing mobile.
I look at the world of web development today, and it's mind-boggling how complicated it has become. There are so many web frameworks, with new ones appearing every day. Building a web application (as opposed to a website, like my Art website, which is statically generated and has only a little Javascript) often requires a myriad of tools and technologies, which change at a high frequency and pull in endless layers of other technologies you usually don't even know are there (ooh look, the package directory has 2,000 items). Javascript is a terrible language that was never designed for any of this, yet oddly, it became popular simply because it was always there. It's amazing how much innovation has gone into building out today's web development universe despite the shaky foundation it is built upon.
In my first mobile job at an online travel company (now sadly just a brand), the web team evaluated web frameworks for a new mobile web app (hotel/flight/car booking). In the end, they picked what they thought would work best. Yet a year later, the open-source framework was mostly abandoned; thankfully, it didn't matter, since our brand had been sold and all the technology, and the entire company, were gone. Today, I can't even imagine how many choices there are to pick from, with new ones always appearing. Some of them are entire ecosystems, and once you choose one, changing becomes nearly impossible: the company's investment is too significant. As a programmer, you become attached to one as well, and switching can be difficult, as employers often want people with experience in their chosen ecosystem. Yet specializing in one could put your career in a difficult spot if that ecosystem becomes obsolete or even abandoned.
I remember the first two single-page apps I built; it was fun, and I needed very little to build them beyond a text editor and a browser (getting data did involve a little marshaling framework and some Java code). The architecture board I was part of complained that I had brought in some unapproved technology, until I spent a lot of time showing them it was just the browser and Javascript. My customers (it was an internal-only app) liked that they could search for information quickly, as if it were a desktop app, instead of constantly reloading the page. These were not huge applications like Gmail, of course.
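For anyone who never saw that era, the whole trick fit in about a dozen lines of plain Javascript. Here is a minimal sketch of the kind of framework-free, in-place search those apps did; the /search endpoint and the "results" element are hypothetical, but everything used here shipped with the browser.

```javascript
// A minimal sketch of a 2005-era single-page search, assuming a
// hypothetical /search endpoint that returns an HTML fragment.
// No build step, no packages: just the browser and Javascript.
function search(query) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/search?q=" + encodeURIComponent(query), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Swap the results into the page instead of reloading it.
      document.getElementById("results").innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
}
```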
Today, unless you are a tiny team building small apps, you probably invest in picking an ecosystem like React, Angular, or Vue, or in combining smaller frameworks with other tools to roll your own environment. You must worry about CSS frameworks, asset bundlers, and many other open-source frameworks and utilities, which are themselves built on layers of yet more open-source packages. Then you must keep everything updated while avoiding incompatibilities and security holes.
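Even the "simple" starting point shows the shape of the problem. A hypothetical, minimal package.json for a small React app might look like the sketch below (the names and versions are illustrative only, not a recommendation); each of these few direct dependencies pulls in hundreds of transitive packages, and every one of them can drift out of date independently.

```json
{
  "name": "small-react-app",
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0"
  },
  "devDependencies": {
    "eslint": "^8.56.0",
    "typescript": "^5.3.0",
    "vite": "^5.0.0"
  }
}
```

Run an install against that and look inside node_modules; the 2,000-item package directory stops being a joke.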
What a pain in the ass all of this has to be. Of course, it's job security!
Programming when I started in the early 80s was much more straightforward: you needed to know the programming language, the operating system, and what you were asked to build; everything else you had to invent for yourself! I remember what a big deal it was for Trapeze (a spreadsheet-like application we shipped in January 1987) when a friend with access to Unix got us a copy of Lex and Yacc to build our formula parser with. Today, almost everything you need is available somewhere in open-source form. Yet any piece of it might become unsupported, suffer from bugs you don't know about, or turn incompatible with another open-source item you need.
The final large project I shipped at my last job had no open-source elements; it was all pure Swift. Our legal department was hard to please, and it was easier to build everything ourselves than deal with them.
It's not that you can't build complex things and maintain them with today's web framework ecosystems. Still, it can't be much fun, and I wonder how those applications will evolve, given that their ecosystems are constantly changing. People still use and support COBOL programs almost as old as I am, so anything is possible. I wonder if AI will eventually be able to deal with the constant churn in the infinite layers of a web environment, or will give up and do humanity in.
Programming has always changed, and change is hard to adapt to. I have always specialized in building new things instead of maintaining old ones, often in companies or industries where new markets or technologies required new applications. Still, I cannot remember ever dealing with so many constantly evolving technologies, continuously having to imagine where they will be next year, much less next decade. Security is a huge issue today, which was not the case in the first half of my career. Someone has to worry about all of this and make choices that will not cause problems, perhaps years later. How long will the application you write today have to live on the same source code, using the framework you picked at the start?
The application I started in 1988, Deltagraph, survived on the same source code for thirty years (my team only worked on it for five of those) before the pandemic killed the last company that sold it. Ultimately, it no longer ran on macOS, as the quarter-century-old C source code could not be updated to 64-bit. I had no idea in 1988 that the decisions I made back then would affect someone three decades later. The main web application at my last employer was an amalgamation of far too many technologies that made little sense when I first saw it. I wonder if many of today's web apps will become piles of primordial ooze over time. People hate rewriting large things, but the cost of supporting the unsupportable eventually becomes too high (another blog post coming soon). With web applications built on such massive piles of constantly changing technologies, I wonder if the pile will eventually collapse.
Maybe AI will eventually take this burden away and let programmers focus only on the essential tasks. I am still skeptical; it may happen eventually, but I don't see an AI being able to build a complex application by itself anytime soon.
Programming to make art is almost like returning to the 1980s for me. I don't use many open-source packages beyond Swift math libraries, and most of the code I write is not something you would find anywhere else. Besides the occasional Xcode and Swift version releases, I am mostly insulated from massive change. The most irritating part is adding features to my site generator and website, where I get a small taste of the complexity of keeping things updated, but it's not my everyday job. I don't envy today's web developers the task of keeping up with constant change and evolution, security, and the occasional left-pad bomb. Even dealing with GitHub is becoming irritating.
How many web-related frameworks are there today? You can't answer, as the count just changed!