I've been watching some programming talks and reading some old blogs, and have reached a conclusion:

I'm not sure if the level of understanding re: writing good code has gone up in the last ~40 years, but I'm *positive* that it's gone up in the last ~15-20.

I'm not sure what it was about the early 2000s, but somehow that period really seems to have been a low point for collective understanding of software craftsmanship.


> The aftermath of the dot-com bubble, maybe?

Yeah, that + the height of OO & XML + the first wave of self-taught programmers hitting before good resources were around to really teach yourself are my guesses

But it still seems like it added up to something more than the sum of its parts

@codesections I think I disagree almost 100%. Old code (1950s-1980s) tended to be tight, do one job perfectly, and GTFO, because we had such severe hardware constraints. You'll never again see a perfect thing like the original Macintosh, LISP Machines, the NeXT, early SUN, etc. And the software written for those was generally perfect.

The 1990s started getting bloated, with junk languages like TCL, but there was also a lot of awesome stuff written then by people who still cared.

Since 2000 it's trash.

@codesections Modern software is built on towers of Jell-O. There's an OS somewhere down there, like Intel puts Minix at the bottom of their CPUs. But then Windows is VMS++ with emulation layers for Win16, Win32, and Linux APIs.

NeXT was a nice little Mach kernel with a BSD userland; MacOS has emulation layers for several versions and parallel APIs now for C, Obj-C, & Swift (which is a C++ abomination).

Browsers sit on top of all that with DOM rendering, a JS interpreter, a JS compiler, & WASM.

@codesections We tried, briefly, to fix all that with XP (Extreme, not Windows). Shithead fucking managers who should've been drowned at birth turned it back into waterfall with "burn-down charts" and "backlogs".

Name any good software of the last 30 years. You can't. It's a pile of shit.

I like slightly-updated BBEdit from the dawn of the Macintosh, Vim copied from a 1976 editor, zsh copied from a 1970 shell, IRC from 1988. Almost everything else I hate.


> I think I disagree almost 100%. Old code (1950s-1980s) tended to be tight

We must be talking past one another, because you're making pretty much the same point that I was getting at, and I agree with basically everything you said.

My view is that the level of craft/skill now is (much?) lower than pre-80s and that, as you said, things really got bad around 2000. The surprise was that 2000 seems even worse than now — I expected a monotonic decline.

@codesections I think it's still terrible, all the NuWaterfall stuff is 21st C. The web stack is a recent horror and getting worse. I use it, because there's no alternative sometimes, but it's rotten to the core.

Apple dumped a nice, reasonably efficient, super expressive RAD language, Objective-C, and low-level C for performance, for Swift.

MS dumped C++, abomination tho it is, for C#, which is shittier single-platform Java for them to build enterprisey stacks on.


Yeah, I don't disagree. Maybe part of the change is just decreasing levels of optimism - watching (good) conference talks from the past few years, I get the sense that people are deeply aware of how flawed all that is and are having a sophisticated (ish) conversation about whether they can do anything about it.

Whereas (allegedly) good talks from ~2003 seem to feature much more simplistic optimism.

@codesections I dunno if there's any better mood/PR somewhere, I'm out of the conference loop. I just judge by what gets released, and then burrow back into my based-on-1980s hole.

@codesections @mdhughes In the late 1990s there was the introduction of Microsoft Visual Studio "Rapid Application Development" and a consequent proliferation of wizard-generated code. It was totally a plague. Software autogenerated by wizards tended to turn into spaghetti very rapidly, because such systems never quite fit the requirements and always needed extra manual modification.

Fortunately that phase only seemed to last for a few years and then things got more sensible again. Or it might just have been that I moved into Linux.

Things have improved substantially in the last 15 years with the more common use of unit tests and CI.
