Why Are People Still Using C?
C is second in the latest TIOBE list of the most popular languages. I find that hard to understand: unless there is a lot of existing code to support, I can't fathom why anyone would start something new in 2023 in C.
I first learned C in 1984, when I bought a C compiler at my first job because I thought it was about to become important to know. At the time, I was working in Pascal and various assembly languages.
The two little companies I started, in 1985 and late 1987, both built everything in C. We built a spreadsheet-like program (Trapeze), worked on part of a PowerPoint competitor (Persuasion), and then built a charting application (Deltagraph), all in C. For Deltagraph, I added some object-oriented extensions to make life easier—there was no C++ yet. I didn't get to use C++ until 1994/95.
In 1996 I wrote a commercial memory allocator library (HeapManager) for the Mac, supporting Metrowerks CodeWarrior. It supported C and C++ applications and was written in C++ rather than C. It was the fastest allocator you could get on a Mac and included a detailed debugging library. I still have the source code somewhere.
Starting in 1997, I worked for a few years at a consulting firm that used WebObjects, which meant writing Objective-C: C with funky object extensions (much nicer than what I had done earlier!). For anything unrelated to the OO features, though, it was still plain C. By 2000, I pushed us to switch to Java, since it appeared important for the future and WebObjects was declining in popularity. For the next decade, I just worked in Java.
Working for a game company at the end of the decade, I found myself in C and C++ (and Lua and Javascript) again, as the game engine we had was old and had originally been built by people angry with each other—the engine was a terrible mix of technologies. I remember finding three different implementations of linked lists, which, over time, had cross-pollinated!
In the next decade, I built iOS applications, which meant Objective-C again for a few years. Finally, in my last job, everything my team and I wrote was in Swift, though we did have a large codebase I inherited that was in Objective-C until it was completely rewritten in Swift.
So I have spent most of my programming career in C and its derivatives. Today I can't find any reason to care for C anymore. It was a fine language forty years ago; while Pascal seemed like a friendly language, it wasn't speedy, and given that computers of the day were pretty sluggish, C seemed like a better fit. Today, however, there are innumerable language choices far better suited to the many things people want to write.
Linux was always written in C, but even there Rust is beginning to replace parts of it. Go seems like an excellent language in the same vein as C but far more modern. Of course, Swift (and its cousin Kotlin) is ideal for everything I needed to do in mobile development. Being retired now, I make digital generative art in Swift. There are so many language choices today that no one can even remember what they are all called. When I started in 1981, I had a choice of one: Fortran.
C requires that you carefully manage your own memory and keep in mind the security implications of everything you do, lest you leave a hole someone can exploit. In the 1980s, that wasn't a big deal, but secure coding has been essential since the dawn of the Internet, and the C language doesn't help you much. Before ARC (Automatic Reference Counting), Objective-C was easy to screw up, since you had to manually balance Retain and Release calls on top of regular malloc/free. After ARC (which still runs under Swift), it became much easier, as the compiler took care of most of the effort, though you still had to manage regular C memory yourself.
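As a minimal sketch (made up for illustration, not from any real codebase), here is the kind of bug a C compiler accepts without complaint:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *buf = malloc(32);
        if (buf == NULL)
            return 1;
        strcpy(buf, "hello");
        free(buf);
        printf("%s\n", buf);  /* use-after-free: compiles cleanly, undefined behavior */
        return 0;
    }

Nothing in the language stops this; the program may even appear to work until the day it doesn't.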
People deal with memory and secure coding issues by running various analyzers (HeapManager helped a lot, though it was single-threaded, since the Mac OS of the time did not support real multithreading), yet every week you read of newly discovered security vulnerabilities.
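Modern tooling can catch this sort of thing. Assuming the sketch above is saved as bug.c (a hypothetical filename), both Clang and GCC can instrument the build with AddressSanitizer:

    cc -fsanitize=address -g bug.c -o bug
    ./bug    # AddressSanitizer reports a heap-use-after-free and aborts

Useful, but it's a bolt-on runtime check, not something the language itself guarantees.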
I prefer a language that makes errors harder to make. Swift works for me since I can build more reliable, less randomly wrong code, using language features to enforce quality. Writing in C and Objective-C wasn't terribly difficult, but it required much effort to ensure we couldn't screw it up. When we started writing Trapeze, the first C compiler we used in 1985 did not even support function prototypes! That required incredible discipline to ensure all function calls had the correct parameters. Having a language that keeps you from messing up is a blessing.
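To illustrate (a hypothetical sketch of pre-prototype C, the old style that C23 finally removed), a K&R-style definition carries no prototype, so nothing checks the call site:

    #include <stdio.h>

    double average(a, b)    /* K&R-style definition: no parameter types in the list */
    double a, b;
    {
        return (a + b) / 2.0;
    }

    int main(void)
    {
        /* With no prototype in scope, the ints are passed as-is where
           doubles are expected: this compiles, and the result is garbage. */
        printf("%f\n", average(3, 4));
        return 0;
    }

With an ANSI prototype in scope, double average(double a, double b);, the same call converts the arguments correctly, or the compiler rejects it.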
With Go, Python, Swift, Kotlin, Java, C#, the Lisp multitudes, PHP (if you must), Javascript (sigh), C++, and uncountable others, many languages are less primitive than C. Sometimes you have no choice, since you are supporting some existing codebase and no one wants to rewrite it (that's how you wind up with 60-year-old COBOL). Deltagraph was written in C, but my team and I only worked on it for the first five years; after that we wanted to rewrite it in C++, but the publisher wasn't interested and soon sold it so they could go into a different business. The final owners of Deltagraph continued to sell and support it until the pandemic apparently put them out of business. The codebase I started in August of 1988 lived for thirty years—I can't imagine how terrible it looked after twenty-five years of maintenance once we stopped working on it! They never could make it run in 64-bit, since in 1998 that didn't yet exist. Deltagraph remained a viable charting application after dominating for a while in the 1990s, but eventually the inability to modify it made it unprofitable. Maybe in C++ it would have had an easier future.
I know some colleges still teach C to all computer science students since it is relatively easy to learn. I know a student who wants to learn web development, but their school teaches everything in C, which is not likely to help much. Exposing students to multiple languages has to be the right approach. C by itself is neither functional nor object-oriented—it might be OK as an introduction to the idea of programming, but not beyond that.
I doubt C will ever vanish, just as COBOL won't. I remember languages that did disappear (does anyone remember PL/I, JOVIAL, or SNOBOL?), but C won't ever die like they did. Still, if you have a choice, pick something that makes your life easier, is well-suited to the task, and hopefully has a nice future.
free(article);