<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[The Codist]]></title><description><![CDATA[Four Decades Of Learning, Experiences, And Adventures]]></description><link>https://thecodist.com/</link><image><url>https://thecodist.com/favicon.png</url><title>The Codist</title><link>https://thecodist.com/</link></image><generator>Ghost 5.59</generator><lastBuildDate>Fri, 23 Feb 2024 08:03:10 GMT</lastBuildDate><atom:link href="https://thecodist.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[If You Don't Give A Crap, This Is the Shit That You Get]]></title><description><![CDATA[<p>Being retired after four decades as a programmer, there is nothing more irritating than seeing broken or poor functionality in web and mobile apps. I always cared about what we were putting out, even if it was sometimes unimportant to my employer. When I see things that are easy to</p>]]></description><link>https://thecodist.com/if-you-dont-give-a-crap-this-is-the-shit-that-you-get/</link><guid isPermaLink="false">65a93bf04068b1047e7504b7</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Thu, 18 Jan 2024 15:50:41 GMT</pubDate><media:content url="https://thecodist.com/content/images/2024/01/outhouse.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2024/01/outhouse.jpg" alt="If You Don&apos;t Give A Crap, This Is the Shit That You Get"><p>Being retired after four decades as a programmer, there is nothing more irritating than seeing broken or poor functionality in web and mobile apps. I always cared about what we were putting out, even if it was sometimes unimportant to my employer. 
When I see things that are easy to do correctly, and I can&apos;t do anything to fix them, it&apos;s really frustrating.</p><p>Why does bad UI/UX persist? I&apos;m not talking about things that could simply be matters of opinion, but about bad coding, bad testing, bad concepts, bad management, or even laziness.</p><p>Last year, the major bank where I have a credit card decided to redo the UI on their website. Previously, I could log in, yet with the new UI, hitting the log-in button resulted in an error message insisting that my browser was too out of date&#x2014;the link provided as an explanation led to a template page with no content besides the template. I was using the latest Safari on the latest MacOS, on Apple&apos;s most powerful Mac.</p><p>Naturally, the bank provided no email address for reporting errors, so I emailed the CEO (most CEOs have a staff to deal with these emails). They told me my browser was too old, which was incorrect, so I replied with a detailed list of everything I had, with version numbers. I also dug into their horrific Javascript (a 100,000+ line file), trying to figure it out. Eventually, they reported that it was a bug, and two months later, it was fixed. I would have thought that bugs on a bank log-in page would be a little more critical to test for.</p><p>I was going to book on an airline I had never used. Filling out the booking information was an exercise in not grinding my teeth down completely, as there were no autocomplete types on any fields, and it requested the same information (such as the address) multiple times, each time requiring me to type it manually. When I finally made it through, the booking failed with a meaningless error message containing a hexadecimal code. On the booking page of an airline, where everyone gives them money! It happened multiple times, so I tried another credit card, manually entering the same address again, with the same result. 
I then tried the mobile app, where instead of an error message, it loaded the credit card&apos;s bank app to verify the charge, as if the airline were a deadbeat. </p><p>None of the cruises I have been on have had a decent app. During my summer cruise to Alaska, the cruise line app was clearly a hybrid app (i.e., a web app) and was horrific. On a bus during an excursion, I asked people what they thought of the app, and it was universally panned. A pre-cruise requirement form was so badly done on mobile that form fields had incorrect data types, bringing up the wrong keyboards. In one case, entering a year brought up a keyboard with no delete key, and I could not exit the field because I had picked a value the field did not accept; I had to kill the app to start over. During the cruise, dinner reservations often failed with a &quot;server undergoing maintenance&quot; message at dinner time, so I had to guess which dining room to go to. When I asked the dining room staff about it, they rolled their eyes and indicated this was a common problem.</p><p>I have seen many autocomplete failures; today, I was on a form (another cruise line) where every single field&apos;s autocomplete attribute was set to &quot;cc-number,&quot; which is clearly a copy-paste bug. This is surprisingly common, as if no one ever tested the form at all.</p><p>UX problems are the worst, however. UI is the fields on a page; UX is why you have those fields. UX is also essential in a set of forms where an app collects information over a series of pages. Someone who understands how to design such a data collection flow will try to minimize duplication and make the data as easy as possible for customers to enter. People who don&apos;t care, have no understanding, or aren&apos;t even there (sometimes forms are designed by programmers or others without much input from anyone knowledgeable) may create forms that are difficult to use. </p><p>I wonder how these types of things happen. 
Is it bad programming, bad management, bad or no QA, bad project management, bad process, or something else? I have seen all those during my career; if I could influence anything, I always tried. Sometimes, it&apos;s an unwillingness to spend money or unreasonable deadlines&#x2014;but many of these issues are not all that difficult to do correctly or test. You have to care to produce a quality result. I spent most of my programming career building new things from the beginning; I refused to ship anything that didn&apos;t meet a high level of quality. The idea of shipping a barely usable form and not caring was not something I could tolerate.</p><p>One time, while looking for a mortgage, I started a form that collected all the data necessary to determine if I qualified. On the second page, it asked for something I did not have available immediately, so I decided to return later. However, it would not let me exit the field, save the form, go forward and skip the field, or go back to the previous page without starting the whole form over. I gave up and called the office, and they admitted their parent company had saddled them with this form despite it hardly working for anyone. Why even bother?</p><p>I could go on, but I won&apos;t. My biggest desire as a programmer was to fix broken things (both a good and bad trait), and I can&apos;t fix anything now. I can complain, but often you get thanked, and nothing changes because either they don&apos;t care, aren&apos;t empowered, or it&apos;s a deliberate decision they would rather not admit. </p><p>Please do give a crap as best you can, even if your employer doesn&apos;t.</p>]]></content:encoded></item><item><title><![CDATA[A Programming Career By The Numbers]]></title><description><![CDATA[<p>As I continue to recover from some health issues that kept me from writing, I thought it might be interesting to describe my long career with numbers. 
If you wind up working for four decades, your experience may vary.</p><p><strong>Years Active: 1981-2021</strong>, totaling 39.5 years. Irrespective of my title</p>]]></description><link>https://thecodist.com/a-programming-career-by-the-numbers/</link><guid isPermaLink="false">659d7d1d4068b1047e750208</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Thu, 11 Jan 2024 23:39:10 GMT</pubDate><media:content url="https://thecodist.com/content/images/2024/01/numbers.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2024/01/numbers.jpg" alt="A Programming Career By The Numbers"><p>As I continue to recover from some health issues that kept me from writing, I thought it might be interesting to describe my long career with numbers. If you wind up working for four decades, your experience may vary.</p><p><strong>Years Active: 1981-2021</strong>, totaling 39.5 years. Irrespective of my title or duties, I was always writing code.</p><p><strong>Number Of Interviews For My First Job: 1. </strong>I had no programming work experience nor any college-level programming classes, yet I had one interview with the hiring manager, who offered me a job on the spot. In 1981, knowing how to program at all was enough to be hired; this was long before specialization was a thing. I wouldn&apos;t even get a phone screen today, yet this one interview catapulted me for four decades! My first project was writing a source code formatter for a programming language I didn&apos;t know (Jovial), in a programming language I didn&apos;t know (Fortran 77), writing a parser, a thing I had never heard of. Yet I got it done, and it worked. &#xA0;</p><p><strong>Number Of Distinct Jobs: 15</strong>. That includes the two companies I started and ran for nine years (1985-1994). </p><p><strong>Shortest Job Duration: 2 weeks</strong>. Short-term contract to help a company finish testing and bug fixing in the mid-90s. 
Despite the shortness of the contract, the entire team took me out to lunch on my last day, as I had found and fixed a major bug that had plagued them for several releases.</p><p><strong>Longest Job Duration: either 5.5 years or 6</strong>. I am not sure which because I don&apos;t recall the duration of my second little company. The 5.5 years was my final job before retiring.</p><p><strong>Largest Employer: my last position</strong>. This company has more employees than all my other employers combined. I got this position because my manager at a previous job begged me to come to work for him again, and the interviews were more begging than interviewing. I worked more hours here than anywhere besides my two little companies. The politics was painful, but the work was rewarding, and I made a real difference. I only retired because, at 63, the hours were just too much; I probably could have worked there until I died.</p><p><strong>Smallest Employer: my second company, The SU5 Group</strong>. We had five people, including me, but we worked on Persuasion 1.0 for the primary author (the only real competitor PowerPoint ever had) and Deltagraph (5 years) for the publisher, which led the market for charts/graphs for a long time. Deltagraph continued (same source code) until the pandemic appeared to kill the last company that owned it.</p><p><strong>Number Of Programming Languages: 11</strong>. Professionally, I worked in the following languages, working on at least one project: Fortran, Z8000 Assembly, Mil. Std. 1750A Assembly, 6502 Assembly, Pascal, C, C++, Objective-C, Java, JavaScript, Swift. Some I returned to after a gap, like Objective-C with WebObjects and later in iOS. In addition, I used XSLT for a year, but it&apos;s not a language; it&apos;s a travesty. You could add SQL and XML variants, but are those languages? 
I also wrote one interpreted programming language around 2000, which was used to code phone polls (the runtime engine was also written in it).</p><p><strong>Number Of Times Laid Off: 3</strong>. In the first one, the US offices of an international consulting firm were closed, and everyone was canned. The second was a civil war between two offices, and most technical people in the losing office were eliminated. The third was the online travel agency I worked for, where our parent company sold the brand to our biggest competitor; all 1000 employees were canned, and I was retained until the end to ensure the shutdown (for extra money).</p><p><strong>Number Of Times Employer Went Out Of Business: 1</strong>. The consulting company went out of business at 4:30 PM on a Thursday, a victim of the Dot-Com explosion.</p><p><strong>Number Of Hostile Workplaces: 2.</strong> Two jobs in a row in the mid-2000s both turned hostile, the first because I pointed out that the only other programmer had never added anything to the source repository or shown any work despite working there for ten months (I figured he was doing contracts for other companies at work), while I had not only code but even demo sites running. No one knew enough to criticize since I was the only other programmer. The manager made a classic statement: &quot;Oh, he never checks anything in until it&apos;s perfect.&quot; Somehow, they believed him and not me, and after that, I was persona non grata. After I quit in disgust, another year passed before they fired the programmer and the manager. In the end, they lost a huge business opportunity.</p><p>The second was the intra-office war, where people in the other office attempted to get me to quit, and when I didn&apos;t, they forced the CTO (in my office; he left soon after, I think) to lay me off. I wasn&apos;t the only person in the office this was done to. 
After these two jobs, I was disgusted with working, leading to the following.</p><p><strong>Number Of Jobs I Took Just For Fun: 1. </strong>I went to work for a poor game company that could only pay me 1/3 of what I had made before. I played the game (a WW2 MMO) and already knew the people, and they me. The game engine was archaic and painful to work with; we had few people but a loyal (small) customer base. It was some of the most challenging programming I ever did, <a href="https://thecodist.com/fixing_a_nasty_physically_modeled_engine_bug_in_an_fps_game/">but I got to fix many broken things</a> (and got much thanks from the players, who all knew me).</p><p>The best part of the job was fighting the company that sold a cheat for the game. I made the programmer&apos;s life there a living hell, and eventually it was not worth the effort to get around all my blocks; plus I was able to identify everyone who used it including the programmer. It turned out that despite the playerbase&apos;s fears that everyone was using it, it was only a tiny number. I paid to be a customer so I could read their forum, laugh at all their smug talk about how stupid we were, and then later have them wonder why they were all getting banned. You rarely get to do such a fun, competitive project! I only quit when I couldn&apos;t afford to make so little money.</p><p><strong>Number Of Mobile Jobs: 3. </strong>My first mobile job (iOS 5 timeframe) was at an online travel company (brand still exists). Their iOS app was one of the first travel apps in the App Store (written by two managers with no idea how to write in Objective-C) and had 9,000 one-star reviews in the App Store when I started. The team was small; we were well-liked by the CEO since we made money and shipped at a blistering pace. The rest of the company barely released things three times a year. 
It was a fun group of people to work with, and I could make a difference, but it ended when our parent company sold us to our biggest competitor, and everyone was laid off. The manager went to what would be my final employer and asked me to join as soon as he could arrange it (it took 18 months).</p><p>The second was a large construction company that had only internal apps. More on them below. It was to keep busy while waiting on that final employer&apos;s bureaucracy. </p><p>My final employer was a large company with many divisions, and we were an essential part of two of the more significant segments. I was responsible for about 20% of the two apps (just iOS, as Android was a separate team). It was a ton of work since my team was small, and I was always both a full-time programmer and a full-time lead. </p><p>It was rewarding because what we did was highly strategic, and everything we did was well-known even at the top of the executive tree. Everything we shipped had high quality, and we had a good reputation despite people wondering how we did so much work with so few people. Still, the long hours (being a full-time programmer, attending all meetings, and overseeing so many moving parts) got old after a few years.</p><p>Everything I or my team wrote was in Swift. We replaced the Objective-C codebase I inherited as part of a larger project.</p><p><strong>Number Of The-Goggles-They-Do-Nothing Awards: 2. </strong>I saw some poor code in my four decades, but two get an award for over-the-top crappiness. The first was a help desk system Mac client, written as a single .c file, 29,000 lines long with a 14,000-line event loop indented 2.5 displays wide. The file was so long that the editor (CodeWarrior) could barely handle typing single characters at the end. It took three of us three months to untangle it into a functional app with workable source.</p><p>The second was three apps I was handed at the construction company (my next-to-last employer). 
They had hired a US company (not known for mobile), which hired an Indian company, which, I presume, hired random people who started the first app; then two more teams were hired to continue with that code to build all three (which diverged). The company accepted the code but never looked at it. People with no experience with Objective-C or iOS wrote it. The company paid something like $450K for these things; I could have written them myself in 3 months; they were nothing special, just a horror show. I fixed the main one and added a significant expansion once it was workable.</p><p>Four decades is a long time to do anything; you see all sorts of projects, companies, and industries and meet all kinds of people, good, bad, and mediocre. I can&apos;t say what will happen in the next four decades; perhaps programming will vanish entirely, though not anytime soon. Despite the hype about AI replacing everyone, today&apos;s AI isn&apos;t all that great at anything and is not &quot;Intelligent&quot; enough to completely replace programmers. Employers want it to, but that&apos;s a pipe dream for now. Initially, I intended to pursue a Ph.D. in Chemistry (I was accepted) and work in programming for only a couple of years, but I stuck around.</p><p>Dealing with change is the biggest challenge, and that will never change. It accelerates as you move forward, and specialization is the typical result. When I started, change was manageable; the main difficulty was knowing what was changing since that was before the internet and all the information you have today. Every programmer I knew back then wound up either losing their career, giving up on programming, or being stuck on legacy projects. I managed somehow to stay relevant for the entire four decades. </p><p>Four decades is a long time, but it was a fun ride. Today, I make <a href="https://digcon.art/?ref=thecodist.com">digital art using Swift</a>. 
&#xA0;</p>]]></content:encoded></item><item><title><![CDATA[My Wikipedia Entry For Trapeze]]></title><description><![CDATA[<p>Maury Markowitz wrote up the story of Trapeze (<a href="https://thecodist.com/the-story-of-my-first-startup-30-years-ago/">covered earlier in this blog</a>) on <a href="https://en.wikipedia.org/wiki/Trapeze_(spreadsheet_program)?ref=thecodist.com">Wikipedia</a>.</p><p>It sure seems like a long time ago. </p>]]></description><link>https://thecodist.com/my-wikipedia-entry-for-trapeze/</link><guid isPermaLink="false">6537114a6b79e907fc90d737</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Tue, 24 Oct 2023 00:48:44 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/10/trapeze.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/10/trapeze.jpg" alt="My Wikipedia Entry For Trapeze"><p>Maury Markowitz wrote up the story of Trapeze (<a href="https://thecodist.com/the-story-of-my-first-startup-30-years-ago/">covered earlier in this blog</a>) on <a href="https://en.wikipedia.org/wiki/Trapeze_(spreadsheet_program)?ref=thecodist.com">Wikipedia</a>.</p><p>It sure seems like a long time ago. </p>]]></content:encoded></item><item><title><![CDATA[I Am Happy Not To Be A Web Developer Anymore]]></title><description><![CDATA[<p>I wrote my first single-page web app in Javascript in 2005, right after learning about XMLHttpRequest and before any serious frameworks existed. 
I left professional web development behind around 2009 (I started in 1997 with WebObjects) and spent the final decade of my career doing mobile.</p><p>I look at the</p>]]></description><link>https://thecodist.com/i-am-happy-not-to-be-a-web-developer-anymore/</link><guid isPermaLink="false">650444976b79e907fc90d4c3</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Fri, 15 Sep 2023 13:56:47 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/09/head2.png" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/09/head2.png" alt="I Am Happy Not To Be A Web Developer Anymore"><p>I wrote my first single-page web app in Javascript in 2005, right after learning about XMLHttpRequest and before any serious frameworks existed. I left professional web development behind around 2009 (I started in 1997 with WebObjects) and spent the final decade of my career doing mobile.</p><p>I look at the world of web development today, and it&apos;s mind-boggling how insane it is. There are so many web frameworks, with new ones appearing every day. Building a web application (as opposed to a website, like <a href="https://digcon.art/?ref=thecodist.com">my Art website</a>, which is statically generated and has only a little Javascript) often requires myriads of tools and technologies, which often change at a high frequency and feature endless amounts of other technologies you usually don&apos;t even know are there (ooh look, the package directory has 2,000 items). Javascript is a terrible language that was never designed for any of this, yet oddly, it became popular because it was always there. 
It&apos;s amazing how much innovation has gone into building out today&apos;s web development universe despite the somewhat shaky foundation it is built upon.</p><p>In my first mobile job at an online travel company (now sadly just a brand), the web team evaluated web frameworks for a new mobile web app (hotel/flight/car booking). In the end, they picked what they thought would work best. Yet a year later, the open source framework was mostly abandoned&#x2014;thankfully, it didn&apos;t matter since our brand had been sold, and all technology and the entire company were gone. Today, I can&apos;t even imagine how many choices there are to pick from, with new ones always appearing. Some of them are entire ecosystems, and often, you choose one, and then changing becomes impossible&#x2014;the company investment is too significant. As a programmer, you become attached to one, and switching can become difficult as employers often want people with experience in their chosen ecosystem. Yet, specializing in one could put you in a difficult spot in your career if it becomes obsolete or even abandoned.</p><p>I remember the first two single-page apps I built; it was fun, and I needed very little to build them beyond a text editor and a browser (getting data did involve a little marshaling framework and some Java code). The architecture board I was part of complained that I had bought some unapproved technology until I spent a lot of time showing them that it was just the browser and Javascript. My customers (internal only app) liked that they could search for information quickly as if it was a desktop app instead of constantly reloading the page. These were not huge applications like Gmail, of course.</p><p>Today, unless you are a tiny team building small apps, you probably invest in picking an ecosystem like React, Angular, or Vue or combining smaller frameworks with other tools to roll your own environment. 
You must worry about CSS frameworks, asset packers/assemblers, and many other open-source frameworks and utilities, which are built on layers of yet more open-source items. Now, you must keep everything updated and avoid incompatibilities and security holes. </p><p>What a pain in the ass all of this has to be. Of course, it&apos;s job security!</p><p>Programming when I started in the early 80s was much more straightforward, as you needed to know the programming language, the operating system, and what you were asked to build&#x2014;everything else you had to invent for yourself! I remember what a big deal it was for Trapeze (a spreadsheet-like application we shipped in January 1987) when we got a copy of Lex and Yacc from a friend with access to Unix to make our formula parser with. Today, almost everything you need is available somewhere in open-source form. Yet, it might become unsupported, suffer from bugs you might not know about, and become incompatible with another open-source item you need.</p><p>The final large project I shipped at my last job had no open-source elements; it was all pure Swift. Our legal department was hard to please, and it was easier to build everything ourselves than deal with them. </p><p>It&apos;s not that you can&apos;t build complex things and maintain them with today&apos;s web framework ecosystems. Still, it can&apos;t be much fun, and I wonder how the applications will evolve, given their ecosystems are constantly changing. People still use and support Cobol programs almost as old as I am, so anything is possible. I wonder if AI will eventually be able to deal with the constant churn in infinite layers of your web environment or will give up and do humanity in. </p><p>Programming has always changed, and change is hard to adapt to. I have always specialized in building new things instead of maintaining old ones, often in companies or industries where new markets or technologies require new applications. 
I cannot remember ever dealing with so many new technologies that are constantly evolving, or continuously having to imagine where they will be next year, much less the next decade. Security is a huge issue today, which was not the case in the first half of my career. Someone has to worry about all of this and make choices that will not cause issues, perhaps years later. How long will the application you write today have to live with the same source code using the framework you picked initially? </p><p>The application I started in 1988, Deltagraph, survived on the same source code for thirty years (my team only worked on it for five years) before the pandemic killed the last company that sold it. Ultimately, it no longer ran on MacOS, as the quarter-century-old C source code could not be updated to 64-bit. I had no idea in 1988 that the decisions I made back then would impact someone three decades later. The main web application from my last employer was an amalgamation of way too many technologies that made little sense when I first saw it. I wonder if many web apps these days will become piles of primordial ooze over time. People hate rewriting large things, but the cost of supporting the unsupportable eventually becomes too high (another blog post coming soon). With web applications built on such massive piles of constantly changing technologies, I wonder if the pile will eventually collapse.</p><p>Maybe AI will eventually take this burden away and give programmers an easier time, letting them focus only on essential tasks. I am still skeptical; it may happen eventually, but I don&apos;t see an AI being able to build a complex application by itself anytime soon.</p><p>Programming to make art is almost like returning to the 1980s for me. I don&apos;t use many open-source packages beyond Swift math libraries, and most of the code I write is not something you would find anywhere. 
Besides the occasional Xcode and Swift version releases, I am mostly insulated from massive changes. The most irritating thing is adding features to my site generator and website, where I get a small taste of the complexity of keeping things updated, but it&apos;s not my everyday job. I don&apos;t envy today&apos;s web developers the task of keeping up with constant change and evolution, security, and the occasional left-pad bomb. Even dealing with GitHub is becoming irritating.</p><p>How many web-related frameworks are there today? You can&apos;t answer, as the count just changed!</p>]]></content:encoded></item><item><title><![CDATA[Good Programmers Can Be Anyone, But Not Everyone]]></title><description><![CDATA[<p>In my four decades as a programmer, I&apos;ve worked with hundreds of programmers, and I can say that no single type of person is good at programming.</p><p>I&apos;ve seen young people who could do amazing work and those without a clue. I&apos;ve seen programmers</p>]]></description><link>https://thecodist.com/good-programmers-can-be-anyone/</link><guid isPermaLink="false">64edecea6b79e907fc90d0da</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Tue, 29 Aug 2023 15:26:11 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/08/anyone.png" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/08/anyone.png" alt="Good Programmers Can Be Anyone, But Not Everyone"><p>In my four decades as a programmer, I&apos;ve worked with hundreds of programmers, and I can say that no single type of person is good at programming.</p><p>I&apos;ve seen young people who could do amazing work and those without a clue. I&apos;ve seen programmers older than me who could accomplish anything, and those with as much experience as I had be unable to do even minimally acceptable work. I&apos;ve seen someone with a Ph.D. 
in Computer Science who built the single worst piece of code I&apos;ve ever seen, and people with no degree at all build complex applications. No race, ethnic group, background, experience, or other category ever made the work quality predictable, good or bad.</p><p>Programming ability appears to be related to nothing obvious except something in people&apos;s brains that connects with programming. It&apos;s not intelligence (a poorly defined concept with almost as many definitions as people making them) but some combination of qualities that make writing good code easier. Of course, even defining &quot;good code&quot; has almost as many definers as intelligence.</p><p>Writing computer code that works correctly in all circumstances, succeeds in implementing whatever is required, interacts reliably with all other code and systems it touches, and can be supported and modified over time can be one of the world&apos;s most difficult tasks. Combining these basics with understanding the necessary security and interacting with team members, other teams, executives, product and UI designers, architects, and even the outside world can make the required effort almost impossible. Yet some people do, and others do not.</p><p>Of course, many programming tasks are rote and require little skill, assuming adequate supervision and testing are available. Not everyone needs to be able to work on complex systems and walk a narrow tightrope between success and disaster. A single programmer working on a team of hundreds will not likely affect the project. Still, a single programmer working on a project alone may determine success or failure.</p><p>When I started as a programmer in 1981, the world had few programmers compared to today; I expect that many whole countries did not have a single one. Today, I have seen estimates of around 25-28 million worldwide. Are all of them good? 
Given the wide range of quality I have seen over my life, I am sure most of them are not.</p><p>Yet, given my experience working with or interacting with programmers over this long period, no single mold produced excellent or bad programmers. I remember when Mark Zuckerberg claimed that only people under thirty were any good, saying, &quot;Young people are just smarter.&quot; That&apos;s pretty ignorant; I have seen people of every age write amazing code and yet also seen people of every age write absolute garbage. </p><p>Someone with a Ph.D. in Computer Science wrote the worst single piece of code I ever saw. They wrote a Mac application as a single .c source file, 29,000 lines long (the IDE at the time could not handle typing at the end) with a main event loop (original MacOS, in 1995), 14,000 lines long, indented two-and-a-half monitor widths deep. It took three of us several months to untangle it. </p><p>I once worked with someone who passed every Java certification and had a degree in Computer Science, yet they could not build anything that worked when put on a simple application. Having a degree or not even going to college did not seem to matter; I&apos;ve seen both succeed and fail. Age (and experience) also led to no predictable outcome. One person, hired at the same time as I was and with the same number of years of experience I had, wrote code that always had to be checked, as they did not seem to understand what they were doing (such as excessive copy-paste with extra lines).</p><p>I started my career as a programmer in 1981 with no work experience, and the only computer education I had was in 1973 in high school. Noodling around on my Apple II+ that I bought in 1979 was all I could offer, yet I passed a single interview with the hiring manager, who offered me a job. Everyone else hired at the same time had a B.S. in Computer Science. Today, someone like this would not even get a phone screen, much less a job. 
No one (including me) could have predicted that not only would I continue to be a programmer for the next four decades but that I would remain relevant during that entire time. Why? I have no clue. I have the qualities that make a good programmer, but I don&apos;t know what they are or where they come from. Others may be smarter, more capable, and accomplish more than I can, but from my experience, many more cannot do anything beyond simple tasks or wind up with broken code. </p><p>Programming well requires discipline, imagination, creativity, the ability to comprehend what is being asked for, a focus on details, an understanding of existing code and its side effects, the ability to plan for the future without compromising the present, a willingness to learn things that may or may not be relevant, and the patience to spend the time necessary to understand everything, even if it&apos;s tedious. It&apos;s a long list of abilities, but you can&apos;t predict who will be able to do well and who will not.</p><p>When I started my first job in 1981, I did not know if I could do anything as a professional programmer. I did not even know what a programming job entailed beyond the obvious writing of code (the joys of no internet back then to learn what the job was like). Somehow, I found I could learn on the fly and improve my ability as fast as necessary. I was handed complex tasks where I was the only programmer and could get them done. I still find it hard to believe I spent four decades writing code (and still write code today for my generative art), given I had no real education in it. I didn&apos;t even expect to work more than a couple of years as a programmer before continuing on to a Ph.D. in Chemistry.
To me, the most significant opportunity is to expose students to programming at the high school level so that people who do have the ability can discover for themselves a desire to program. I expect most people will find it tedious, boring, complex, or otherwise unappealing, and that&apos;s fine. My exposure to programming 50 years ago in a public high school, which had to be one of the very few in the country offering it, eventually led me to lifelong enjoyment and success as a programmer. Without that class, it is doubtful that I would have ever started.</p><p>Good programmers can come from anywhere, even countries that seem unlikely to produce them, because the features in the human brain that are necessary for success can be anywhere. It would be best to allow these people to be discovered and nurtured. Missing people who could be good programmers is a loss for everyone.</p><p>Mediocre programmers can also come from anywhere. I do not believe, from my lifelong experience, that everyone will do well if you just train them enough and invent some environment or process that will keep them from failure. It may be OK to have minimally capable programmers who work with lots of supervision, but expecting all programmers to rise above that is unreasonable. I always preferred having a small team of people I could expect to succeed without excessive oversight rather than a large team that could not be trusted to. You don&apos;t always have that choice, however.</p><p>&quot;Anyone Can Cook&quot; is a nice slogan for a Pixar movie (Ratatouille), but it isn&apos;t reality any more than &quot;Anyone Can Program.&quot; Still, believing that a rat can be a cook is a decent metaphor for thinking that good programmers can come from anywhere if you allow them to discover it for themselves and don&apos;t have a preconceived notion of what they look like. 
My core team at my last job (both of whom I hired) consisted of a former Army tank driver who taught themselves to program and someone whose LinkedIn photo featured a pink mohawk. I&apos;d build anything with them.</p>]]></content:encoded></item><item><title><![CDATA[Looks Good To Me: When Code Reviews Go Awry]]></title><description><![CDATA[<p>Code reviews can effectively improve code quality in large or mixed teams with experience differences. They can also be useless if not done correctly or if management does not support the time to do them.</p><p>A code review is a modern invention, as the technology to do them easily did</p>]]></description><link>https://thecodist.com/looks-good-to-me-when-code-reviews-go-awry/</link><guid isPermaLink="false">64e4ecad6b79e907fc90ce03</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Fri, 25 Aug 2023 14:34:43 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/08/looks.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/08/looks.jpg" alt="Looks Good To Me: When Code Reviews Go Awry"><p>Code reviews can effectively improve code quality in large or mixed teams with experience differences. They can also be useless if not done correctly or if management does not support the time to do them.</p><p>A code review is a modern invention, as the technology to do them easily did not exist for a large part of my career. I did not even have access to a code repository for the first 14 years (the first one I saw was in 1995); although I knew about them, we had no access to one when we built Mac applications in my two startups from 1985 to 1994. Even after this point, I did not see much use of this technique until my last job (2015-2021). 
Of course, code reviewing as a part of a process likely would have been used elsewhere, just not where I worked.</p><p>To do a code review properly, you have to look at three things: (1) what does the affected code currently do, (2) is the change appropriate to what was asked for, and (3) what effect will the change have? Doing this may take a large percentage of the time the author of the PR spent making the change, depending on how familiar the reviewer is with the code and how complex the change is. Trivial changes may take little time, and complex implementations such as new features, API additions, or large refactorings may be time-consuming.</p><p>Note that people, including me, shipped large complex applications for decades before anyone could continuously review changes. My team built Trapeze (a spreadsheet-like program released in 1987), assisted on Persuasion (the only real competitor PowerPoint ever had, released in 1988; we contributed only to version 1.0), and built Deltagraph (a charting and graphing application released in 1990; we worked on it for the first five years only, and it existed until the pandemic finally killed the last owner in 2019 or so). All three applications were large and complex and functioned well.</p><p>In contrast, teams are far larger and much more diverse today, with people spread out over geographic areas, operating in a far more complex programming world, interacting with multiple teams, and having many different levels of experience. You can still be successful without code reviews, but in far more limited circumstances. Teams in the 80s and 90s tended to be very small due to the inability to manage codebases with more than a handful of people without robust code repositories. My teams for that decade never had more than 4-5 programmers, and we had to be very disciplined to manage our code in a shared folder! 
We also had to share the code with the primary author of Persuasion, and in the last version of Deltagraph, we shipped the code on hard drives via FedEx. We didn&apos;t even have company email until 1991, so only phones could be used to discuss anything remotely. Yet it worked fine because the teams were small and careful.</p><p>The first large team I saw was at Apple when they were trying to build a new OS. Around 500 people were contributing to Copland, and it was a complete disaster. People checked in code without any oversight or control, and the project was a nightmare. The tooling to organize such a massive team did not exist, nor did any idea of how to coordinate them all. It&apos;s a good thing it was canceled (after I had left Developer Support and given up on Apple); the cancellation led to bringing Steve back, and everything changed.</p><p>Code reviews, if done correctly, can make teams better. Done poorly or not done at all, they can lead teams into disaster, as everyone assumes the code has been reviewed and lessons learned, yet nothing of the sort happened.</p><p>My last employer was an extremely large non-tech company with many divisions; I worked for a high-profile group in one of the largest. We owned the largest vertical piece in our two apps, but many teams contributed to them, including the largest team, which managed the mobile app container. Management, particularly project management, insisted that programmers do 40 hours a week of sprint-related work (which is pretty silly). Yet, some teams insisted that everyone also do code reviews on top of that. To do this, you either had to work overtime (which contractors were forbidden to do without permission) or make the problem disappear.</p><p>Thus enters &quot;Looks Good To Me&quot; or LGTM.</p><p>I did not make my team do any required code reviews; however, I also picked everyone on my (small, 3-5 person) team, so I had no inexperienced programmers. 
Despite being the lead, I was also a full-time programmer on my team, and we were all highly conscious of how we did everything, including asking for reviews if there was any doubt, harkening back to my earlier experience. What we built had extremely high visibility in the company and with our customers, yet we were constantly affected by budgetary constraints; it was either picking a really good small team or getting stuck with an army of cheap contractors. For my team, what we did worked well, and we could do more with fewer people with high-quality results, but this is not appropriate for most situations.</p><p>Other teams were much larger: a mix of programmers with varying experience, plus many random contractors who came and went (one codebase I inherited averaged eight programmers at a time over nine months, yet sixty programmers in total contributed and then left). Due to the large amount of work required and the insistence on spending each programmer&apos;s time writing code, code reviewing was reduced to trivialities such as the ubiquitous LGTM. I was mainly asked to review changes that affected my team&apos;s responsibilities. The core team required that changes we made in their codebase be reviewed, and the reviews would then criticize pointless details (like a missing space before a semicolon) instead of anything important. Sometimes, we wondered if it was political (company politics can be so much fun).</p><p>I occasionally looked at the code reviews behind other teams&apos; changes that eventually became problems&#x2014;often crashes that riled up customers and management&#x2014;to see, once they were fixed, what had happened. </p><p>The most egregious example was support that had been added to the iOS app for a set of schedules pushed to the app several times a day. It was &quot;reviewed&quot; by that team and had a positive unit test for code that converted a JSON service response into model objects. 
The service changed, but no one paid attention; it was never tested in staging at all, and the day it was released to the app store, the crash rate of the app spiked to 50% several times a day! Alarm bells went off everywhere, and all team leads were required to figure out what was wrong; this was made more difficult because the symbol file was never uploaded to the crash reporting service, so the crash reports were unreadable. Once we figured out what the issue was, looking at the PR said it all: LGTM strikes again! The problem was that the programmer (a contractor) had written code that, if the service response did not match expectations, called fatalError(), the one call on iOS that is guaranteed to kill your application. This appeared five times in the PR, and no one had looked at it. The unit test was only positive; a negative test (one where the service response was wrong) might have caught the problem. When confronted with the fatalError() debacle, the programmer said, &quot;Well, it would have crashed somewhere else.&quot; Sigh.</p><p>If a company expects programmers to do code reviews, then it needs to hire more programmers. If each person spends ten hours properly reviewing the code of others, you need a third more people to keep the same amount of development time. This investment is not an issue for some companies, but not everyone is willing to spend the money. Another consideration is who reviews the code (experience matters); the quality of whoever writes the code also makes a significant difference, as weaker authors increase the likelihood of poorly written code. Other factors include the size of PRs (I&apos;ve been asked to review changes made by other teams that touched 500 source files&#x2014;no thank you!), the complexity of changes (I&apos;ve seen PRs with only a few lines changed that nonetheless would take hours to understand), and unfamiliarity with the codebase, business logic, and algorithms. 
Many programmers also dislike reviewing code and prefer to work on their own code, and management in some companies may think reviewing code is a waste of time and money. Ever-larger teams make reviewing necessary, yet reviewing may also become ponderous and time-consuming. My small handpicked team was an uncommon option.</p><p>Today, code reviewing is well supported by technology (maybe with AI, it can be made easier, but I am not sure I would trust it at this point), and given the complex nature of programming today, it is more often than not a benefit, but only if it is done correctly and with sufficient time and investment to accomplish that. Doing it poorly or suffering from &quot;Looks Good To Me&quot; syndrome may be worse than doing no reviews, as people will assume everything is fine when it is not.</p>]]></content:encoded></item><item><title><![CDATA[Learn Something New Every Day]]></title><description><![CDATA[<p>You can&apos;t stay relevant for over 40 years without learning new things.</p><p>In my first job in the early 80s, learning new things was a fundamental requirement of being a programmer&#x2014;almost everything you did was new, both to you and often to everyone else. I started</p>]]></description><link>https://thecodist.com/learn-something-new-every-day/</link><guid isPermaLink="false">64cbb7e93fe2ca1932d7c4fe</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Thu, 03 Aug 2023 16:22:56 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/08/read.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/08/read.jpg" alt="Learn Something New Every Day"><p>You can&apos;t stay relevant for over 40 years without learning new things.</p><p>In my first job in the early 80s, learning new things was a fundamental requirement of being a programmer&#x2014;almost everything you did was new, both to you and often to everyone else. 
I started a habit of reading about new things every morning before I started doing whatever my job was, and I still do it today despite being retired from programming for a living (I make art with it now).</p><p>Knowing what is happening in the world of programming, what is new, what is fading, what is speculative&#x2014;all these things give you an advantage over people who simply grind away in their job every day. Programming for a living is hard, as everything changes constantly, and what was hot last year is often dead today. It&apos;s easy to get left behind, and while that isn&apos;t always a bad thing (someone has to maintain old code that is still necessary), it can leave you potentially obsolete.</p><p>Of all the people I still know who started programming in the 1980s, I am the only one still doing leading-edge programming at the end of my career (in my case, iOS development in Swift). Many people who learned nothing new discovered they could no longer work as programmers; they entered management or even left the industry completely. New is so common in this industry. Ignoring it is not an option.</p><p>In the 80s, there weren&apos;t many easy ways to learn what was happening since this was before the Internet, blogs, open source, computer books, etc. There were a few magazines and scientific journals you could read, but that was about it. Often you didn&apos;t even know anyone else doing what you were doing. Most of my friends did not own a computer and had little idea what one could do. So I had to be creative to learn something new.</p><p>I read everything I could: manuals, the PC catalog from IBM (helpful since I knew it better than the local salespeople, who called me for advice), basically anything interesting. I bought a C compiler at work despite working in Pascal and assembly languages because it seemed like the future. 
In the 90s, with the rise of the Internet and blogs, and the rush of computer books going into print, I started a regular reading session every morning. No day would start without some discovery, even if only a small one.</p><p>Over time, I began to see changes happening in the programming world when they were first visible and would experiment with anything that could be important for the future. In 1985 I read Byte magazine, which featured object-oriented programming (primarily Smalltalk). In 1988 I was able to add object extensions to C that I used to solve tricky problems in building Deltagraph. In 1999 I tried to convince my employer to switch from WebObjects to Java/J2EE by presenting to the entire company (I called it Alien Technology for some reason). Then I worked on our first project in Java for a customer. This happened because I read and played with Java on my own time. I was the first person I knew who tried unit testing and built several apps as Javascript single-page applications (and got yelled at by others who thought it was something I bought without authorization).</p><p>None of these things makes me a genius; I&apos;m just someone who reads every day and tries out whatever looks useful. Sure, you could wait until your employer decides what is important and just do your job until then. I&apos;ve seen way too many people fall into that trap and discover their employer no longer needed them, nor did any other. It&apos;s part of what your career as a programmer is about: always moving along the crest of the wave. If you make it a daily event to learn something new, you see what is happening over time, and spotting something worth studying further becomes easier. Just looking every once in a while, or waiting for someone to tell you to do something, means you are unlikely to be able to distinguish real change from noise. 
</p><p>Previously in this blog, I always used the metaphor of a steamroller (the big machine with a wheel that flattens pavement as it is laid) driving directly behind your head&#x2014;if you stop and decide you know everything you need to know, it runs you over. Don&apos;t get flattened; stay awake and keep your eyes on the future.</p>]]></content:encoded></item><item><title><![CDATA[Why Are People Still Using C?]]></title><description><![CDATA[<p>C is second in the latest TIOBE list of the most popular languages. I find it hard to understand why; unless there is a lot of existing code to support, I can&apos;t fathom why anyone would start something new in 2023 in C.</p><p>I first learned C in</p>]]></description><link>https://thecodist.com/why-are-people-still-using-c/</link><guid isPermaLink="false">64c99eb83fe2ca1932d7c30e</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Wed, 02 Aug 2023 01:16:13 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/08/c.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/08/c.jpg" alt="Why Are People Still Using C?"><p>C is second in the latest TIOBE list of the most popular languages. I find it hard to understand why; unless there is a lot of existing code to support, I can&apos;t fathom why anyone would start something new in 2023 in C.</p><p>I first learned C in 1984, when I bought a C compiler at my first job because I thought it was about to become important to know. At the time, I was working in Pascal and various assembly languages.</p><p>The two little companies I started, in 1985 and late 1987, both built everything in C. We built a spreadsheet-like program (Trapeze), worked on part of a PowerPoint competitor (Persuasion), and then built a charting application (Deltagraph), all in C. For Deltagraph, I added some object-oriented extensions to make life easier&#x2014;there was no C++ available to us yet. 
I didn&apos;t get to use C++ until 1994/95.</p><p>In 1996 I wrote a commercial memory allocator library (HeapManager) for the Mac, supporting Metrowerks CodeWarrior. It supported C and C++ applications and was written in C++ instead of C. The library was the fastest one you could get on a Mac and included a detailed debugging library. I still have the source code somewhere.</p><p>Starting in 1997, I worked for a few years at a consulting firm that used WebObjects, which meant working in Objective-C: C with funky object extensions (much nicer than what I had done earlier!). However, for anything unrelated to the OO features, it was still plain C. By 2000, I pushed us to switch to Java since it appeared important for the future, and WebObjects was declining in popularity. For the next decade, I just worked in Java.</p><p>Working for a game company at the end of the decade, I found myself working in C and C++ (and Lua and Javascript) again, as the game engine we had was old and had originally been built by people angry with each other&#x2014;the engine was a terrible mix of technologies. I remember finding three different implementations of linked lists, which, over time, had cross-pollinated!</p><p>In the next decade, I built iOS applications, which meant Objective-C again for a few years. Finally, in my last job, everything my team and I wrote was in Swift, though we did have a large codebase I inherited that was in Objective-C until it was completely rewritten in Swift.</p><p>So I have spent most of my programming career in C and its derivatives. Today I can&apos;t find any reason to care for C anymore. It was a fine language forty years ago; while Pascal seemed like a friendly language, it wasn&apos;t speedy, and given that computers of the day were pretty sluggish, C seemed like a better fit. Today, however, innumerable programming language choices are far better for all the many things people want to write. 
</p><p>Linux was always written in C, but Rust is beginning to replace parts of it even there. Go seems like an excellent language in the same vein as C but far more modern. Of course, Swift and its cousin Kotlin are ideal for everything I needed to do in mobile development. Being retired now, I make digital generative art in Swift. There are so many language choices today that no one can even remember what they are all called. When I started in 1981, I had a choice of one: Fortran.</p><p>C requires that you carefully manage your own memory and keep in mind the security implications of everything you do, lest you leave a hole someone can exploit. In the 1980s, it wasn&apos;t a big deal, but secure coding has been essential since the dawn of the Internet, and the C language doesn&apos;t help you much. Before ARC (Automatic Reference Counting), writing Objective-C was easy to screw up since you had to manually balance retain and release calls on top of regular malloc/free. After ARC (which still runs under Swift), it became much easier as the compiler took care of most of the effort, though you still had to manage regular C memory. </p><p>People deal with memory and secure coding issues by running various analyzers (HeapManager helped a lot, though it was single-threaded, since MacOS at the time did not support real multi-threading), yet every week, you read of newly discovered security vulnerabilities. </p><p>I prefer a language that makes errors harder to make. Swift works for me since I can build more reliable and less randomly wrong code using language features to enforce quality. Writing in C and Objective-C wasn&apos;t terribly difficult, but it required much effort to ensure we couldn&apos;t screw it up. When we started writing Trapeze, the first C compiler we used in 1985 did not even support function prototypes! That required incredible discipline to ensure all function calls had the correct parameters. 
Having a language that keeps you from messing up is a blessing.</p><p>With Go, Python, Swift, Kotlin, Java, C#, the Lisp multitudes, PHP (if you must), Javascript (sigh), C++, and uncountable others, many languages are less primitive than C. Sometimes you have no choice since you are supporting some existing codebase and no one wants to rewrite it (that&apos;s how you wind up with 60-year-old COBOL). Deltagraph was written in C, but my team and I worked on it only for the first five years; we wanted to rewrite it in C++, but the publisher wasn&apos;t interested and soon sold it so they could go into a different business. The final owners of Deltagraph continued to sell and support it until the pandemic apparently put them out of business. The codebase I started in August of 1988 lived for thirty years&#x2014;I can&apos;t imagine how terrible it looked after twenty-five years of maintenance after we stopped working on it! They never could make it run in 64-bit since, in 1998, that did not exist. Deltagraph remained a viable charting application after dominating for a while in the 1990s, but eventually, the inability to modify it made it unprofitable. Maybe in C++, it would have had an easier life.</p><p>I know some colleges still teach C to all Computer Science students since it is relatively easy to learn. I know a student who wants to learn web development, but their school is teaching everything in C, which is not likely to help much. Exposing students to multiple languages has to be the right approach. C by itself is neither functional nor object-oriented&#x2014;it might be OK as an introduction to the idea of programming, but not beyond that.</p><p>I doubt C will ever vanish, just like COBOL won&apos;t. I remember languages that did disappear (does anyone remember PL/1, Jovial, or SNOBOL?), but C won&apos;t ever die like they did. 
Still, if you have a choice, pick something that makes your life easier, is well-suited to the task, and hopefully has a nice future.</p><blockquote>free(article);</blockquote>]]></content:encoded></item><item><title><![CDATA[I Am Not Betty, And I Can't Do Anything About It]]></title><description><![CDATA[<p>At some point around 2016, a person named Betty, in the town I used to live in, gave my phone number to someone (either by accident or a random number), and it became associated with her name and address.</p><p>Her home sits atop a giant gas field, and she gets</p>]]></description><link>https://thecodist.com/i-am-not-betty-and-i-cant-do-anything-about-it/</link><guid isPermaLink="false">64c67cd83fe2ca1932d7c1f8</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Sun, 30 Jul 2023 16:26:42 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/07/Betty.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/07/Betty.jpg" alt="I Am Not Betty, And I Can&apos;t Do Anything About It"><p>At some point around 2016, a person named Betty, in the town I used to live in, gave my phone number to someone (either by accident or a random number), and it became associated with her name and address.</p><p>Her home sits atop a giant gas field, and she gets a nice royalty check. Why do I know this? Because since then, I have received over a thousand texts (I gave up counting) and innumerable phone calls like &quot;Betty, have you considered selling your house?&quot; I know where the house is, and it is nothing special; the only reason to buy it is for the checks.</p><p>Most of the calls were from different numbers, though sometimes the callers became desperate and called or texted repeatedly, often pleading for a response. Even on the rare occasion that I replied with &quot;wrong number,&quot; they would respond by asking if I had anything else to sell. 
No one would ever say where they had gotten the information. </p><p>A couple of years later, I began to get continuous calls and texts from politicians from both US parties. Most were from places far from where Betty lives; all were addressed to her, although a few were addressed to her spouse. Each one begged for a donation to keep some boogeyman from the opposite party away. Sometimes I would get multiple requests from people running against each other, and the most obnoxious ones included pictures to emphasize their point.</p><p>There is absolutely nothing I can do about this except to replace my phone number, which is painful as I can&apos;t even remember all the places that now have it (at least the ones I care about), and altering what amounts to part of your identity is difficult.</p><p>I believe the original record connecting my number to Betty&apos;s identity was available to others, maybe on some public or internet database or sold by a data broker as a &quot;good&quot; phone number. Over time, like a rampaging virus, it multiplied into a myriad of databases, all easily available to anyone who desired to talk to Betty. I know the address, but even if I wrote Betty a letter, I doubt she could correct the problem now.</p><p>Our modern, highly interconnected society has some issues when information that is not true becomes part of reality. In this case, it is mostly annoying since only my phone number has been hijacked and mixed up with Betty&apos;s address and name. This is not exactly identity theft but a blending of identities that provides zero benefits to anyone. I get irritating texts and calls, and the wielders of this information get nothing.</p><p>I imagine Betty sitting on her porch, collecting her gas royalties, enjoying a peaceful time without anyone contacting her. 
Meanwhile, an army of people is communicating with me, desperate to get a response and wondering what kind of person fails to respond to such energetic requests.</p><p>Sometimes I wonder if I could set up an AI bot to automatically reply to each request with something pithy and hope that is enough to stop them forever. Still, many of these folks have no connection to one another, so that is unlikely to help, even if it might provide some fun. In the end, there is no point in trying. I should give in, get a new number, and leave all those admirers of Betty to wallow in disappointment.</p><p>I am not Betty.</p>]]></content:encoded></item><item><title><![CDATA[My Compensation Over Four Decades As A Programmer]]></title><description><![CDATA[<p>It&apos;s a lie, as I don&apos;t remember exactly what I made in every job, and it changed during each job. Also, I never worked for a big tech company (other than Apple briefly, but that was when they were going out of business) and only a</p>]]></description><link>https://thecodist.com/my-compensation-over-four-decades-as-a-programmer/</link><guid isPermaLink="false">64a48e013fe2ca1932d7bfc7</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Wed, 05 Jul 2023 13:56:50 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/07/money3.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/07/money3.jpg" alt="My Compensation Over Four Decades As A Programmer"><p>It&apos;s a lie, as I don&apos;t remember exactly what I made in every job, and it changed during each job. Also, I never worked for a big tech company (other than Apple briefly, but that was when they were going out of business) and spent only a year in a tech hotspot (Apple), so I never made &quot;Google&quot; money. 
I can recall enough to give you a general idea.</p><p><a href="https://thecodist.com/what-programming-was-like-in-1981-at-my-first-job/">At my first professional job as a programmer in 1981</a>, for a defense contractor, I was hired along with several recent graduates who all had a B.S. in Computer Science. Each was paid $20,000 a year (about $64,000 in today&apos;s dollars); however, I got $2,000 more as I had graduate school experience, despite only having a B.S. in Chemistry. It seems ridiculously low compared to what interns make at Google today, but back then, there were far fewer jobs and programmers, and it&apos;s still not unusual pay in much of the world today.</p><p>After that job, I started and ran two startups; the <a href="https://thecodist.com/what_writing_and_selling_software_was_like_in_the_80_39_s/">first sold our product Trapeze</a> (sadly, competing with Excel, we gave up and sold it to someone else), and then a second in 1988 that built Mac apps for others under contract (hourly). Our longest contract was to <a href="https://thecodist.com/the_story_of_deltagraph/">build Deltagraph</a>, multiple versions over five years; it was very popular worldwide, so we did well. 1992 was my best year for income in today&apos;s dollars (the equivalent of more than $200,000). Eventually, the publisher decided to do something new, and given the Mac market was dead, my little company split up, as getting new customers was impossible.</p><p>I went to the Bay Area in 1995-1996 and worked three contracts, the longest being at Apple. 
I made $40-$60 an hour in the contracts but left the area because Apple seemed to be going out of business, and I didn&apos;t want to be there when it did; <a href="https://thecodist.com/my-biggest-regret-as-a-programmer/">a giant mistake</a>, as a year later, Steve came back, and everything changed.</p><p>In 1997, after some contract work, I got a job at a consulting company, where I worked for 4.5 years; we mostly did WebObjects/EnterpriseObjects and later Java. I don&apos;t remember what I made there; it changed significantly over time. I remember that we went out of business at 4:30 in the afternoon on a Thursday a few months after the Dot-Com collapse. My employer had been unable to pay rent for seven months and had allowed our biggest customer not to pay us for more than a year. After that, it was tough to find another job; the part of the country I lived in was not small, but it was not a great market for programmers.</p><p>I eventually found another consulting job, where I first worked for three months in Mexico; we were the U.S. branch of an international consulting company. I don&apos;t recall my pay; I do remember I had to pay for all my trips to Mexico and was only reimbursed a couple of months later, which sucked. What sucked even more was that our parent company eventually closed all the U.S. offices, and we all got laid off.</p><p>This time I found a job at a local Financial Services company as an architect. We had around 60,000 customers and a bank. I know I made less than $100,000, but I&apos;m unsure exactly what it was. It was a frustrating place to work, but I did get to make a difference because so many things were broken that I could fix or improve. This company did not value programmers much; all of our work was internal or to support our customers&apos; needs, but half the executives wanted to buy everything and do away with I.T. 
entirely.</p><p>After about 18 months, I found a local printing company, post-startup, with a nice market niche, and it was an all-Mac company. They finally paid me $100,000, and I was one of only two programmers. Sadly, it turned into a hostile workplace; they also offered no vacation or sick days until you had been there for six months. In retrospect, I ignored all the warning signs. I quit after five months as it was frustrating to go to work.</p><p>My next position was at a healthcare company, a recent merger of two companies from two states, as an architect again. Not sure what the pay was exactly, but as I worked there, it became obvious that one half of the merger was trying to kill the other half, and I worked on the losing side. Eventually, I was laid off.</p><p>After all these sucky results, I just wanted to work somewhere I would be appreciated, so I went to work at a local game company with an MMO war game. I had been a player since it started and knew the Mac programmer was leaving, so I took his place. They had little money; all they could pay was $36,000 a year, which was terrible. The work, however, was so rewarding, as I was appreciated by the company and especially the player base. Often in the game, I would get DMs thanking me for <a href="https://thecodist.com/fixing_a_nasty_physically_modeled_engine_bug_in_an_fps_game/">fixing</a> so many things. It was <a href="https://thecodist.com/fighting_the_good_war_against_mmo_cheaters/">fun</a>, but eventually, the pay was too little, and I had to leave. The game still exists and is in a much better position today.</p><p>My next job was at a local well-known Online Travel Agency, a brand that still exists today, working in their mobile group, building iOS apps. We had only 20 people out of nearly 600 in I.T. but generated most of the profit. 
It was a fun job until our parent company decided to sell the brand to our biggest competitor, which still operates it as if it were a separate company. Argh, another layoff, in this case, 100%. At least they paid me to stay for a year to assist in the transition. Despite having zero work or responsibilities for almost five months, I got paid over $100,000 plus the retainer to stay (another $30,000 or so). Nice gig, money for nothing!</p><p>I was offered a position at what would become my last job, but the budget was not forthcoming for almost 18 months, so I found a contract position in the meantime as the only iOS programmer, working with a customer I had done work for in that first consulting job. It was an hourly contract, but the pay was decent, although I can&apos;t remember it anymore. For the last few months, I only worked sporadically, as I had completed everything in the iOS app they wanted while the remaining team still had work to do. Eventually, my final job came through, and I moved halfway across the country.</p><p>That final job was made possible by my manager at the OTA, now working for this company, who wanted me specifically, as the company was finally moving mobile development to vertical organizations instead of having a central group do everything. This company was extremely large and well-known, and this division used and built a lot of technology to support its real-world business but was not considered a tech company. It was my largest employer; the pay was good for where we lived, but nothing compared to a Meta, Alphabet, or Apple. When I retired, I was earning about $180,000, including bonuses, although calculating the total compensation is hard due to 401k contributions, etc. I think interns at Google make more (although I am unsure if they still do internships in the work-from-home era).</p><p>Nonetheless, I was always fine with what I got paid, though, for the first time, I worked a lot more than 40 hours a week. 
I owned a nice house on a half-acre lot with a pool in a quiet neighborhood within the (large) city limits. Good luck finding that in Silicon Valley!</p><p>Could I have made more in my life? I could have been hired at Google, Facebook, Amazon, etc., and made up to 3X more per year. I never considered money to be my only interest; I always preferred working on small teams where I could make a difference. I would rather be on a team of 1-5 than be on a team of hundreds or even thousands, no matter my position. I always preferred to work locally rather than constantly move around; my year in Silicon Valley in the mid-90s was an outlier, and my final move to where I am now was the only real move. Your income depends greatly on where you live and the cost of living. If your living expenses are 3X and your salary is 3X, it&apos;s not all that different. I made enough, and I don&apos;t regret it.</p>]]></content:encoded></item><item><title><![CDATA[The Unreasonable Ineffectiveness of Estimates]]></title><description><![CDATA[<p>In my long career, I&apos;ve dealt with many different kinds of estimating, from the early days in the 1980s when there was no estimating because no one had any idea how to do it to my last job where estimation was always demanded but never actually relevant.</p><p>In</p>]]></description><link>https://thecodist.com/the-unreasonable-ineffectiveness-of-estimates/</link><guid isPermaLink="false">644fb0983fe2ca1932d7b51c</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Thu, 22 Jun 2023 16:35:35 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/06/est.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/06/est.jpg" alt="The Unreasonable Ineffectiveness of Estimates"><p>In my long career, I&apos;ve dealt with many different kinds of estimating, from the early days in the 1980s when there was no estimating because no one had any idea how to do it to my last job where 
estimation was always demanded but never actually relevant.</p><p>In the following posts on estimation, I will highlight two projects from my previous job (both with 8-figure budgets): one that eventually shipped and one that was canceled at the last minute. The details will remain anonymous given my former employer, although both projects have since been replaced by something else.</p><p>Estimation generally means being asked to provide a likely amount of time and resources needed to complete some small or large task. For example, a small job might be a minor change to existing software; a large task might encompass a project that consumes months or even years and utilizes hundreds of people. Generally, those who want the job completed, are providing funds for it, or have some business need to fulfill will expect an idea of when it might be completed or how much it will cost. The problem, of course, is in the details provided.</p><p>Software that needs to be built or modified comes in many different forms. For example, you might be building software under contract for a customer, building software products for your employer to sell, building software for your employer to support their real business (my stories cover this), building software for your employer that is the business (possibly supported by ads or similar), or creating software for internal use only. Each of these will require estimation for different reasons.</p><p>In my experience, the following conditions must exist for any estimation to be meaningful. The timeframe of the estimate is an additional consideration&#x2014;two-week sprints are much easier than a 16-month project.</p><p>While the following are essential, I should add that I have never seen all of these at once! </p><p>(1) The team has to know the subject area well. 
The more experience you have building software, knowing the business area, or understanding the technology involved, the easier it is to base an estimate on what happened before. For example, if you are building a General Ledger app for the 8th time, you probably know what to expect. In addition, experienced programming teams are more likely than recently hired contractors to realize what can go wrong and how to approach the project, and to understand what your company does. </p><p>(2) The requirements are clear and relatively complete, and whoever is presenting them is available and able to explain them. No matter how they are written, these form the basis of what you are expected to do and thus should let you see enough to build a decent estimate.</p><p>(3) Only minimal changes will be expected during the estimated project timeframe. This would seem to follow logically from (2), but anything you don&apos;t know, you can&apos;t estimate without fudge factors or guesswork. If you expect that there will be many changes, then you will have to add or multiply time, implying your estimate is not very meaningful.</p><p>(4) The project timeline should be based on estimates instead of requiring the estimates to fit a pre-determined timeline. It&apos;s incredible how often that happens. Of course, in Scrum, the sprint is a fixed timeline, but that&apos;s OK if you pick things to work on that can reasonably be completed in the sprint. I have generally seen this issue in project-long estimates. Still, sometimes management decides you need to move faster and forces you to cram more things in, making any estimation at the beginning difficult or pointless.</p><p>(5) Another complicating factor for estimates is accounting for dependencies: other teams building parts you need, services, or even hardware. How long it will take to build your piece may be difficult to estimate since it depends on teams that may have poorer estimates or depend on yet more teams. 
So the more your estimate depends only on what you control, the more meaningful it is likely to be.</p><p>All of this seems obvious, yet it rarely happens so easily. The stories I will tell in the following posts show what happens when all of them are missing.</p><p>The usual story regarding the need for estimates is that management needs to know if the project is worth doing, is affordable, or can be completed in an acceptable timeline. Unfortunately, in my experience, sometimes they want the project no matter what you say, even changing your estimate to make it more palatable. But, of course, you then have to try to do the impossible! I have also seen estimates used purposefully to kill projects, both by programming teams and because of office politics. </p><p>In my first job in the 80s at a Defense Company, few did estimates as projects took a long time, and few had any experience building software, so estimates were not critical. At my first little software startup in 1985, we were making a product to sell based on my idea (Trapeze), and none of us had ever built a Mac app before. The Mac industry was so new we could not know how long it would take.</p><p>For my second company starting in 1988, we worked on Persuasion for its author (later published by Aldus) and then built Deltagraph (for the publisher) for multiple versions over five years. The development cycle in those days was at least 6-8 months, starting with a desired set of features to be built and adjusting as time passed. Most estimates were comparative: given the time we had left, which of the remaining features might be easiest to complete? Everything else was pushed into the next release cycle. This type of estimation is somewhat more straightforward; we were fully aware of what we were building and what was on the list, and it was easy to adjust. 
No one demanded we estimate each feature individually.</p><p>Working for a consulting firm, I did a project for a travel company where the requirements were basically &quot;fix our refund process.&quot; I was expected to estimate how long it would take (which was a guess, as it was not simply implementing something already decided on but fixing a broken business process and providing a technical solution). I gave them an amount I thought reasonable. Midway through implementation, the business suddenly made a major request that would require me to change at least a week&apos;s worth of existing effort. I informed them of this, but they refused to pay for it, arguing about it for the next month, and finally, my employer told me to stop working when the original time ran out. Sadly, their existing programmers had to finish the work, and I was not allowed to speak to them at all, so I never found out if they finished it. Estimates based on vague and minimal ideas are not worth much; customers complaining about missed estimates in such cases are sadly not uncommon.</p><p>In my last job, services were often many layers deep, each owned by a different team, often far removed from what we were tasked with. Sometimes the teams we needed something from were third-party vendors who usually couldn&apos;t care less. In one case, such a vendor was tasked with adding something to their extensive system that was not their normal business (chosen for God knows what reason) and only communicated via a single WSDL controlling an enormous XML document, most of which was unnecessary for our use case. We had to build an intermediate server to convert this XML to JSON for our service layer to consume. Their management and development teams were on opposite sides of the Earth, making communication difficult. In addition, deployments of their system often left it not working for days. 
Trying to estimate anything here was pointless, as their contribution was basically random time.</p><p>During both projects, executives demanded constant reports on when work would be completed, often in the form of a burndown chart showing when the end of all sprints would be reached (yes, at the start of the project, we were required to enter how many story points each sprint would take). Continuous changes (some formal, some simply demands) made the charts look like sawtooth waves. Since the end date changed continuously, many execs complained we were not getting enough completed while simultaneously adding changes. One PM I knew said he had built several dozen custom reports using the data in our &quot;Agile&quot; tracking system despite knowing they were all rather pointless. Estimating an entire project upfront on a sprint-by-sprint basis, knowing constant (often daily) changes are coming, is about as pointless as it gets.</p><p>Eventually, one project shipped, and the other was completed but canceled, despite all of this. Were they late? Yes. No. How would you even determine it? My feeling over my entire career was that things take as long as needed, irrespective of what you initially thought.</p><p>After the two projects I will describe, our division tried something different to please executives who wanted to know when everything would be complete. They insisted all teams use a spreadsheet-style application to enter completion dates for major parts of their piece of the project. The problem was that there was no way to model dependencies&#x2014;each team was expected to add its own dates. Naturally, it made no sense; executives treated the maximum date as the project completion date, which rapidly moved further and further out. 
While some pieces could be done before dependencies were complete, no QA could be done until all dependent systems were sufficiently tested; thus, everything got later and later.</p><p>At the start of my career, no one knew enough to estimate anything, so we didn&apos;t, and things still were completed. At the end of my career, we were expected to make mostly pointless estimates that acted as placebos for anxious executives. Estimating things is not always necessary but sometimes valuable in reasonable circumstances. Perfect results are, however, highly unlikely.</p>]]></content:encoded></item><item><title><![CDATA[Puzzled Why Instagram Fails on Safari]]></title><description><![CDATA[<p>I wanted to look at Instagram to see if every art hashtag was still overwhelmed with terrible AI art, but today on Safari, all I get for every page is:</p><figure class="kg-card kg-image-card"><img src="https://thecodist.com/content/images/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png" class="kg-image" alt loading="lazy" width="1996" height="360" srcset="https://thecodist.com/content/images/size/w600/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 600w, https://thecodist.com/content/images/size/w1000/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1000w, https://thecodist.com/content/images/size/w1600/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1600w, https://thecodist.com/content/images/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1996w" sizes="(min-width: 720px) 720px"></figure><p>It works on every other browser I have. 
But why?</p><p>In the console are two errors, found and placed there</p>]]></description><link>https://thecodist.com/puzzled-why-instagram-fails-on-safari/</link><guid isPermaLink="false">648de9d53fe2ca1932d7bd35</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Sat, 17 Jun 2023 18:27:47 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/06/puzzled.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/06/puzzled.jpg" alt="Puzzled Why Instagram Fails on Safari"><p>I wanted to look at Instagram to see if every art hashtag was still overwhelmed with terrible AI art, but today on Safari, all I get for every page is:</p><figure class="kg-card kg-image-card"><img src="https://thecodist.com/content/images/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png" class="kg-image" alt="Puzzled Why Instagram Fails on Safari" loading="lazy" width="1996" height="360" srcset="https://thecodist.com/content/images/size/w600/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 600w, https://thecodist.com/content/images/size/w1000/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1000w, https://thecodist.com/content/images/size/w1600/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1600w, https://thecodist.com/content/images/2023/06/Screenshot-2023-06-17-at-1.15.00-PM.png 1996w" sizes="(min-width: 720px) 720px"></figure><p>It works on every other browser I have. But why?</p><p>In the console are two errors, found and placed there by Instagram&apos;s error handler, the first complaining that an Ajax call is returning HTML instead of JSON, in fact, an entire HTML page. That made me curious enough to look at the source to find out why.</p><p>I wish I hadn&apos;t; what a massive pile of bizarre javascript. It must have been run through some kind of obfuscator (for no reason I can imagine, since it must be hard to understand even unobfuscated). It must have included a lot of auto-generated code, and I assume it&apos;s React. 
The main JS file is around 150,000 lines, mostly building huge dictionaries of functions and other dictionaries and things whose purpose I can&apos;t even guess. </p><p>If this is supposed to be high-quality modern web code, I am glad I spent the last ten years building iOS apps. Perhaps the code makes more sense in its original source form; I can&apos;t fathom how anyone could otherwise have a clue what this app does. It epitomizes the meme &quot;The Goggles They Do Nothing.&quot;</p><p>Clearly, Instagram/Meta would prefer everyone only look at their content on a mobile phone; they had no iPad app the last time I looked, and this attempt at a website looks amateurish at best. Yet somehow, they have a 150,000-line Javascript file. I wonder why they pay such exorbitant salaries to wind up with such an ugly, massive codebase.</p><p>Looking at it on another browser (Brave, in this case, which looks like Chrome to the server), I still found most of the art hashtags poisoned by repetitive AI art (oh look, more fantasy women with significant assets). The website is also very slow to respond, which seems embarrassing for a company as large as Meta. The list of images is just a stack of square images with no filters or sorting options, and you can tap on one of them to see the image and its comments as a modal. Strangely, tapping on a hashtag in a comment mostly does nothing (though it occasionally works). It appears to have been built without any Product team or QA process I would have tolerated. I can only imagine it was never taken seriously as a project. </p><p>How is this acceptable to anyone? I wonder how many engineers it takes to build something so bad yet so complex.</p><p>You might wonder why I don&apos;t use their app on my iPhone. I like seeing images larger than 4 inches across. All of my art is bigger than even my huge display can show, but at least I can fill the screen. 
Phone screens are fine with crappy AI art (since it degrades on larger screens), but I would rather see all the details of something I find interesting.</p><p>I wonder what people who spend their whole day building modern web apps think of this codebase. Facebook is also built along the same lines, though it works properly and involves actual UI and product design. I also imagine the team behind it is enormous. Since all I can see is the end result, I can only wonder how it all works together; I assume the people who work on it understand it better from the source side.</p><p>In the past, I have heard that Facebook prefers fully automated QA, but I never cared for that type of solution; I always built apps (desktop, web, and mobile over my career) used by people, and I always made sure that people were involved in testing everything (assisted by automation, when it was possible, but never exclusively). People are far better at seeing strange things and finding inconsistencies; automation is better at highly repetitive testing. However, in the early days, I had QA people who could test highly repetitive areas daily, which would have driven me insane.</p><p>Ultimately, I have no idea why Instagram fails on Safari in such a strange way. Safari can be odd sometimes (the Citibank website refused to let me log in on Safari for months, returning an error of &quot;your browser is too old,&quot; but I was told it was a bug). It makes me glad I don&apos;t work on the Instagram website, but I am curious why it&apos;s so poorly designed and written.</p>]]></content:encoded></item><item><title><![CDATA[Has Anyone Noticed How Bloated The Internet Has Become?]]></title><description><![CDATA[<p>I was on a cruise recently, and trying to read anything online was painful since thousands shared my internet connection at sea. 
Reading a relatively lightweight site like Google News generally gave me time to get an ice cream cone before the page appeared.</p><p>Has everyone abandoned building minimal apps</p>]]></description><link>https://thecodist.com/anyone-noticed-how-bloated-the-internet-has-become/</link><guid isPermaLink="false">6488848b3fe2ca1932d7bbd2</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Tue, 13 Jun 2023 18:13:48 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/06/loader.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/06/loader.jpg" alt="Has Anyone Noticed How Bloated The Internet Has Become?"><p>I was on a cruise recently, and trying to read anything online was painful since thousands shared my internet connection at sea. Reading a relatively lightweight site like Google News generally gave me time to get an ice cream cone before the page appeared.</p><p>Has everyone abandoned building minimal apps or at least optimizing assets? People in the U.S. assume everyone has unlimited bandwidth, but internet speeds and bandwidth are still limited in many places, even in parts of this country. In my German cousin&apos;s town, the fastest internet is still ADSL. In much of Africa, if you have the internet at all, it&apos;s doubtful that many websites can even be accessed.</p><p>My first &quot;online&quot; experience was in 1973-1974 in a programming class in a public high school (itself unusual for the time) where we used a Teletype machine connected to a timeshare computer somewhere at 110 Baud. A decade later, my manager and I had accounts on a local newspaper bulletin board that could be accessed with a 2400 Baud modem. We also had email addresses there, but the only people we could email were each other, and we sat in the same office.</p><p>In 1998, I wrote the USPS&apos;s Postmaster app using WebObjects/EnterpriseObjects. 
It had to run in IE 2.0, as the Post Office had paid for a license! The app had a single image and no CSS; if it had any Javascript, it probably was very little, as IE 2.0 was extremely buggy. The slowest thing about it was accessing the Post Office&apos;s database, which was poorly optimized. </p><p>In 2005, I wrote my first single-page application in Javascript using XMLHttpRequest in an internal app; my customers were amazed at how fast it was to use since it could be as interactive as a desktop app. The Architecture team (of which I was a member) yelled at me for using an unapproved technology; they were only somewhat mollified when I said it was just Javascript running in IE 6. It was a fun app to write before any real Javascript frameworks were available.</p><p>Fast-forward to my cruise, and all I could see were loaders, spinners, or websites that I had given up on ever seeing. Even Google News, when I wanted to see a different section than the front page, was sluggish, waiting on the hamburger menu to appear so I could switch. Reading an article was mostly pointless as regular websites took too long to appear.</p><p>Recently, Citibank revamped the UI of their website, and trying to log in resulted in an error claiming my version of Safari was too old to be supported, despite my using the latest version on the latest macOS on Apple&apos;s fastest computer. After some back and forth with the CEO response unit (most public companies have them if you email the CEO), they finally admitted it was a bug, which was fixed two months later. While waiting, I looked at their Javascript to see if I could figure it out but gave up after a couple of hours, as the main Javascript file was 100,000 lines long in addition to a host of other files. I can&apos;t imagine why a bank website needs a 100,000-line Javascript file; the code looked like something that had grown randomly over two decades. 
I felt sorry for the poor folks who have worked on this beast.</p><p>Sometimes I look at websites in developer tools in Safari or Chrome to see how they are built. Often it&apos;s mind-boggling how many individual files are required to show a single page. MacRumors, for example, downloaded 431 files comprising 5.8MB of content. I&apos;ve seen some sites that exceeded 1000 files. You don&apos;t notice it as much on a fast internet connection, but that&apos;s nuts.</p><p>Hacker News was readable on the ship. CNN used to have a lite website, just a list of their articles; no ads, no images, nothing fancy. Naturally, they recently removed it. The idea of small and simple seems much rarer than when bandwidth was limited. But many parts of the world cannot connect to this bloated internet.</p><p>Is anything going to change here? Of course not; optimizing assets and download speed seems to be at the bottom of anyone&apos;s list of requirements. The overwhelming prevalence of ads and video means no one cares how long anything takes to appear. Lean websites have not entirely disappeared, but leanness is no longer important given the assumption of virtually infinite speed and bandwidth.</p><p>At one point, Google&apos;s website tools would tell you how slow your website was to download and seemed to imply that Search would penalize slow websites, but clearly, that is no longer important.</p><p>Some cruise ships and lines are trying to speed up onboard internet; the one I was on was not one of them. Maybe speeds will improve everywhere over time, but I expect websites will keep growing, so they may still overwhelm anyone without a fast connection. </p><p>Like many things, you only appreciate what you have when you don&apos;t have it for a while. 
Being on the cruise ship reminded me of what being online was like 50 years ago in high school!</p>]]></content:encoded></item><item><title><![CDATA[I Learned How To Program 50 Years Ago]]></title><description><![CDATA[<p>In the fall of the 1973-1974 school year, my public high school offered a class in computer programming. This class was rare for its time, as there were few computers in the world accessible for students, and most people had no idea what they could do other than seeing HAL</p>]]></description><link>https://thecodist.com/i-learned-how-to-program-50-years-ago/</link><guid isPermaLink="false">6467919f3fe2ca1932d7ba19</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Fri, 19 May 2023 16:10:12 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/05/hp.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/05/hp.jpg" alt="I Learned How To Program 50 Years Ago"><p>In the fall of the 1973-1974 school year, my public high school offered a class in computer programming. This class was rare for its time, as there were few computers in the world accessible for students, and most people had no idea what they could do other than seeing HAL in &quot;2001: A Space Odyssey&quot;. </p><p>I am forever grateful to whoever convinced the local school board to invest an entire year in what must have seemed a pointless elective. We learned to code using a 110-baud teletype connected to a time-sharing system via a telephone coupler modem. Only six students took the class. Latin was also an elective, and the course was full (note I was also in that class as the teacher was excellent).</p><p>Since then, I have been a programmer but never used Latin again. 
The first thing we learned was &quot;Roma est in Italia,&quot; and that&apos;s all I remember.</p><p>After that year (my family moved away, and I don&apos;t know if the class was ever offered again), I did not get to program on anything until I bought an Apple ][+ after graduating from college with a chemistry degree.</p><p>Since then, I have been writing code almost every day. Though I retired two years ago, the code I write daily, which supports my art practice, is often far more complex than anything I wrote during all those years of working. What I write today is much more specialized and focused, and there are no meetings or anything else to get in the way!</p><p>Actual programming languages only go back to the early-to-mid fifties (I was born in the late fifties, so they predate me by only a little). Programming as a career option likely started soon after that, though, with very few computers, there were also very few programmers. When I began my career in 1981, programming was still a rare opportunity; many countries likely had no programmers at all. Most people I knew had no idea how computers worked and never interacted with one. This began to change in the 1980s as more personal computers were sold.</p><p>Even today, most people know nothing about programming or how a computer functions internally. I am unsure how prevalent programming classes are in public high schools today. If I had been in some other school district that year, I likely would never have known I could spend my life writing code or realized how much I like it. I think teaching programming today is still essential in schools, if for no other reason than to let some students discover that they like it.</p><p>If you go back fifty years from when I took that class, there were no computers or programmers. The Turing machine was still more than a decade away. 
Sure, you could consider Babbage&apos;s engines computers (and Ada Lovelace, who wrote a program for the Analytical Engine, the first programmer), but the Difference Engine was more a calculator than a general-purpose computer. Programming is still in its infancy in many ways, and who knows where it will be in fifty years; it&apos;s possible machines will replace human programmers by then. However, I am not sure that will happen, given that most software is written for people, and people are confusing even to other people, much less to an AI. Perhaps it will find it easier to eliminate us instead.</p><p>I remember bringing home a 128K Macintosh from the office in 1985 (my startup had two Macs plus a Mac XL, i.e., a Lisa) and showing it to my friends. Of course, most had no idea what to think of a mouse, a bitmapped display, and things like cut-and-paste. A five-year-old child, however, figured out MacPaint quickly. Today I still meet people for whom even basic use of a PC is out of reach.</p><p>If I could go back and tell my 1973 self where programming would go in the following fifty years, I would have found the time machine more believable than the future history of computers. Programming for a phone, yeah, right!</p><p>The programming world has changed enormously over that half-century, and I with it. I suspect changes over the next half-century will be equally unpredictable and never-ending. It&apos;s possible that there will be no programmers anymore by then, and everyone will eventually forget that we once existed. </p><p>I&apos;ve never regretted being a programmer, and despite all the ups and downs, changes, layoffs, and occasional insanity, it was the best job I could have had.</p>
If I can&apos;t, it grates on me.</p><p>After I graduated from college, my parents, a friend, and his</p>]]></description><link>https://thecodist.com/i-have-to-fix-broken-things/</link><guid isPermaLink="false">6459237e3fe2ca1932d7b6cc</guid><dc:creator><![CDATA[Andrew Wulf]]></dc:creator><pubDate>Tue, 09 May 2023 15:10:51 GMT</pubDate><media:content url="https://thecodist.com/content/images/2023/05/broken.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://thecodist.com/content/images/2023/05/broken.jpg" alt="I Have To Fix Broken Things"><p>Call it a character flaw or a character benefit&#x2014;I hate being around broken code, processes, products, or UI. If it&apos;s broken, I want to fix it. If I can&apos;t, it grates on me.</p><p>After I graduated from college, my parents, a friend, his parents, and I went out to eat, and my dad sat in a broken chair. So he laid it on its side and attempted to fix it, which embarrassed me at the time. But today I understand why, and clearly, I inherited the urge (though for technology, not furniture).</p><p>Throughout my multiple decades of being a programmer, nothing irritated me more than seeing something that should work correctly and watching it fail. It&apos;s been both a positive and a negative in my career&#x2014;sometimes, people were thrilled that I made things better, and a few times, I lost my position because someone did not like me trying to fix something that they preferred to leave alone (often because they broke it and would look bad). Sometimes, I had to be a little sneaky or make things better a little piece at a time.</p><h3 id="a-failure-teaches-me-a-valuable-lesson">A Failure Teaches Me A Valuable Lesson</h3><p>I learned a valuable lesson in delivering a quality application in 1987 at the San Francisco Macworld. We were showing our app Trapeze, which was about to ship (we would assemble the packages after the show). 
I spent several days demoing our product; the key feature was a hierarchical popup menu, and ours was the first Mac app with one (Apple would add them to the Mac system software a few months later), so people were interested in seeing it work. However, the app crashed intermittently, which was embarrassing. During development and testing, everyone ran a debugger in case of a crash. In those days, the debugger shared the app&apos;s address space and loaded first, changing the values at low memory addresses. The code had an uninitialized pointer that happened to be harmless with the debugger present, but the rental Macs at the show had no debugger. The bug was easy to fix back in the office, but it made for an unpleasant show.</p><p>What changed in me was that I began spending more time at the start of a project considering how to build more reliable code, hiring permanent QA, and having them test the whole app daily. Remember, back then, no one wrote unit tests or used anything you could use today (we did not even have a code repository, as none existed). After this early failure, no product from a team I led or shipped with was ever released with severe errors, right up until I retired.</p><p>I did a two-week contract in the mid-90s to help complete a release of an email program. They had several mysterious crashes and needed another person to fix them. I spent a little time analyzing the source and discovered someone had commented out the low-memory handling code (vital in the fixed-memory classic Mac OS) a couple of releases prior, and the lines had never been uncommented. This was the key reason the app was so fragile. At the end of the short contract, the entire team took me out for lunch.</p><p>Afterward, I worked at Apple for half a year, but this was pre-Jobs, everything was headed toward going out of business, and I did not want to be there, so I left. 
That was too broken for me!</p><h3 id="i-fix-broken-things">I Fix Broken Things</h3><p>In the mid-2000s, I went to work as an architect at a financial services company, and the interview consisted of the two architects telling me horror stories to see if I would run away. The prospect of so much brokenness to fix excited me, so during employee orientation, when everyone was asked what their job was, I responded, &quot;I fix broken things.&quot;</p><p>Little did I know how much stuff was broken, and I spent 18 months trying to fix everything I could get to, but it was hopeless; there was too much. My most significant success required cleverness. The mainframe folks hated the Java programmers and blamed everything on them. The worst issue was that the app servers and web servers would frequently lock up, creating a lot of complaints from our 1000 field offices and our 60,000 customers. The databases backing the apps ran on an AS/400 under DB2, and I suspected this was an issue, but the challenge was proving it.</p><p>I wrote a simple test using the DB2 JDBC driver that fetched only the first row of each of the approximately 200 tables; then, our Oracle DBA told me he routinely downloaded the entire production database from DB2 to Oracle running on an old PC. So I got him to run the benchmark against the production DB2 database and the PC running Oracle with the same tables. The difference was astounding: the little PC appeared significantly faster than the expensive AS/400!</p><p>So I wrote it up with charts and sent it to various people, which caused quite a stir. The AS/400 DBA ran the same benchmarks while watching system metrics and discovered that someone had set the DB2 partition to only 100MB and that the AS/400 was swapping like mad between the benchmark and the regular production load. 
It was clear the 100MB setting had been a deliberate choice; once the CIO ordered them to increase the partition to 1GB (and promised to buy them more RAM), all the issues instantly disappeared.</p><p>I was a happy camper!</p><h3 id="not-so-happy-fixes">Not So Happy Fixes</h3><p>Not everything I did to fix things worked out as well. At my next job (I left the previous one because there were too many broken things), I thought I had found a great situation: all Macs, a startup with a great market niche that wanted to expand its product line, and the possibility of an IPO at some point. I was the second Java programmer, as the first complained he could not do the whole project alone. Little did I know...</p><p>After a couple of months of building the new online store&apos;s front end, I felt ready to investigate what the other programmer&apos;s back end (which he talked about endlessly) actually did. So I went looking in the repository and found absolutely nothing. All that was in any repository was some sample code. He had been working for ten months, and there was nothing.</p><p>I reported that to the manager and then the CEO, but what the manager said to the CEO floored me. He said, &quot;Oh, he never checks in any code until it&apos;s perfect.&quot; Yet there was no proof he had done anything. So after that, I became persona non grata with him and a few other technical people. He also built a cardboard wall so I could not see his display (an open office with desks), as I was the only other Java programmer and could tell what he was working on. I am sure he was working on contracts for other people while taking the salary.</p><p>I tried repeatedly to convince the two other partners that something was wrong, but the CEO did not listen to them either. I stayed a couple more months, writing some code to automate the back office (printers), but I had had enough and quit. A year passed, and I heard they finally fired the programmer and the manager. 
They never got their new store, stayed in their narrow market, and never expanded, but they remained in business. A new online store was all they needed; they already had a great back office and printing expertise.</p><p>The following job I took (as an Architect again) ended even worse. Our main application (a batch system written in Java, chopped up into 20 individual applications) leaked like a sieve; two people had to restart the applications multiple times a day and alternated restarting them during the night (they did not accept data from our customers on weekends). It frustrated me that something so important was so broken, so I looked. The system used a C++ framework called by the Java applications. I looked in the documentation, and it indicated that the Java code needed to tell the framework when memory was no longer needed so it could free it on the C++ side. But, looking at the Java code, no such calls existed anywhere. I tried to explain this to the person who had written it (now Chief Architect) and got no response. I asked the programmers who supported it, and they said they knew about it but were not permitted to fix it, as it would make the author look bad to the executive team. After that, my tenure at the company did not last long; after a period of highly hostile actions, I was laid off.</p><p>After I left, they tried to write a new application, but it failed to work; I heard a new CIO came in, canceled the project, and bought something commercial instead.</p><h3 id="back-to-fixing-again">Back To Fixing Again</h3><p>I then worked at a game company with an MMO/FPS game. The team was too small, we had little money, and the game engine was homegrown and had a lot of issues. Despite the low pay (as in 1/3 of what I could have made elsewhere), it was so much fun fixing everything! I even got to battle with a company selling a hack and win. Sadly, I stayed too long at that pay rate, so I had to leave. 
They are still in business and hoping to eventually move to Unreal 5.X.</p><p>Working at a well-known online travel agency, my first mobile job, allowed me to fix stuff again. This was around the iOS 5 timeframe. The main iOS app had been the first travel app in the App Store and had been written by two managers, as the company did not think mobile mattered, so they had no team. While it was making some sales, the app had 9000 one-star reviews (I asked about that during the interview) as it crashed continuously; I believe that count was a record at the time.</p><p>I wound up training all of our Java programmers on Objective-C and iOS, and we took three months to rebuild the app, removing 300 warnings and 500 static analyzer errors, plus improving much of the UI. The app was far more stable and better received when we released it. Eventually, we built an entirely new app just in time for iOS 7 launch day; ours and one other were the only travel-industry apps to appear that day with full iOS 7 features. The team was very flexible and productive; in many ways, this was the best place I ever worked, but our parent company sold our brand to our biggest competitor, and everyone was laid off. Bummer.</p><h3 id="fixing-in-the-big-leagues">Fixing In The Big Leagues</h3><p>My last job was at the largest company I ever worked at. It presented a lot of challenges, including many opportunities to fix things (not just code but also processes) and to build mobile code that worked correctly. Nevertheless, I think it was an excellent place to end my career.</p><p>I could list several things, but this post is too long, so I will only mention one.</p><p>When I started, I worked on a 16-month-long project and inherited a large, just-shipped codebase written by an army of contractors. I naturally wanted to see how often the code crashed, but no one had access to the crash reports or cared to look at them. 
So I requested access and started looking; from then on, I reported what I saw in the shipping Slack channel for every release. At first, people paid little attention, but two high-profile failures made them start treating crashes as important. In one case, as soon as it was released, the app showed a 50% crash rate several times a day, which alarmed everyone; it took all the leads working together about two weeks to solve the issues (made more difficult because no one had uploaded the symbol file!). The second issue followed two months of complaints in the App Store reviews (which I also read daily) that were initially dismissed as mere venting. However, an email to our CEO suddenly made the issue significant. The crash was never being reported because it occurred in the launch method: the app tried to sync a database, took far too long on slower iPhones, and the iOS watchdog simply killed it as unresponsive. Again, this failure of the app to work correctly started a trend of more and more people looking at crash reports regularly.</p><p>The app overall had a terrible crash rate of 1-2% during this time, but now more people were paying attention. During the pandemic, when our business was slow and many projects were on hold, people started fixing the common crashes, and by the time I retired in 2021, the crash rate was down to 0.25%, and I no longer had to say anything.</p><p>Armed with just data and simple repetition, I managed to slowly convince a large division to care about the crash rate of its two apps. Over time, I taught several people, including non-technical staff, how to read the reports properly. Finally, my need-to-fix obsession paid off!</p><h3 id="after-retirement">After Retirement</h3><p>I still care about broken stuff; websites and apps sometimes frustrate me. 
I went back and forth with Citibank because their website, upon login, would tell me that the version of Safari I was using was too old, despite my running the latest version on a Mac Studio (it also included a helpful link to a non-existent page). They finally admitted they knew about the issue, and after a few months, I saw it was fixed.</p><p>Then take Twitter. Please. My account, with 5,600 followers, has been bugged for a year: my content is not shown to my followers, only to people who don&apos;t follow me, reducing my views to 5% of what I was getting a year ago. Even looking at the source and trying what I could accomplished nothing. If I worked there, I could figure it out (and would have to, being obsessed with fixing things), but I wouldn&apos;t last 5 seconds with the owner!</p><p>So having to fix broken things has been a great benefit and sometimes a terrible curse, but that&apos;s who I am, and I have no intention of changing!</p>]]></content:encoded></item></channel></rss>