Nothing New Under the Spec

I recently stumbled across Cameron Sjo's spec-compare project while working on a similar comparison of my own. The project is outstanding. Compiling, testing, and documenting this many tools and frameworks takes considerable effort, and the findings are immediately useful if you're evaluating spec-driven approaches for your team. I'm grateful someone else did this work and shared it openly. If you haven't explored the project yet, it's well worth your time.

Then I read the Critical Analysis.

The practical observations about AI adherence, review burden, and especially the MDD parallel are solid work. Where it lost me was the historical framing. The analysis builds its central tension around whether SDD is "waterfall dressed in AI clothing." That framing rests on several layers of popular misconception, and like a house of cards at a toddler's birthday party, it doesn't survive much scrutiny.

The Page 2 Problem

The analysis states: “Waterfall (1970s): Comprehensive upfront planning, failed due to inflexibility.” Wrong.

Winston Royce’s 1970 paper, “Managing the Development of Large Software Systems,” is eleven pages long. The famous sequential diagram appears on page 2. Royce presented it as the approach that does not work. He spent the remaining nine pages describing iteration, feedback loops, and the insight that you should plan to build the system twice — aligning directly with Fred Brooks’ “plan to throw one away” from The Mythical Man-Month.

Our industry read page 2 and spent fifty years arguing about what the other nine pages said. It’s the software engineering equivalent of reading the first chapter of a murder mystery, putting the book down, and writing a five-star review about how the butler did it. Royce wrote nine more pages. Nobody read them.

“Over” Is Not “Instead Of”

The Agile Manifesto says “working software over comprehensive documentation.” The word is over, not instead of. “I prefer vanilla over chocolate” does not mean I don’t like chocolate at all. I happen to like both! I just like vanilla a little more.

The nuance evaporated immediately. Training courses simplified preferences into prohibitions. “We’re Agile, so we don’t write documentation” became acceptable to say with a straight face. The dumber versions won because they were easier to teach, easier to sell, and easier to use as an excuse for not thinking through a problem before writing code.

I held these same simplified views early in my career. The original Agile principles were considerably more thoughtful than what most of us were taught. We just didn’t bother to read those carefully, either.

The Emperor’s New Sprint

Look at how most teams practice Agile today. Two-week sprints with fixed scope. Backlog refinement that functions as requirements gathering. A daily standup that is a status report wearing a casual Friday outfit. A retrospective nobody acts on.

Strip away the vocabulary and you have a two-week waterfall cycle. We took "responding to change over following a plan" and turned it into meticulously detailed two-week plans we follow to the letter.

So when the analysis asks whether specs can coexist with Agile — coexist with which Agile? The one in the meeting room already runs on specs. They’re called “acceptance criteria,” and everyone pretends that’s different.

Same Song, Different Verse

Qoheleth had something to say about new paradigms: “there is nothing new under the sun,” or in our case, “under the [spec].”

Writing specifications before building software is what we have always done. Requirements documents, functional specs, user stories — these are all specifications. The format changed. The core activity didn’t. Calling it a paradigm shift is like calling a new bread recipe a paradigm shift in wheat.

What changed is the executor. When a human reads a spec, they walk over to your desk and say “this makes no sense.” When an AI reads a spec, you get hallucinated APIs and cheerful disregard for your instructions. Those are problems with the AI, not with writing things down. Blaming the recipe because the oven is unreliable won’t help anyone bake better bread.

The more interesting axis is time scale. Classic Waterfall planned in months. Agile plans in weeks. SDD plans in hours. Same song, different verse. When your spec-to-implementation cycle is measured in minutes, the distinction between “upfront planning” and “iterative development” dissolves. The pendulum didn’t swing. We just zoomed in.

What the Analysis Gets Right

The MDD parallel is the strongest section. MDD failed because the translation from spec to code was too rigid and too opaque. LLMs offer flexibility but introduce non-determinism — like trading a car that only turns left for one that occasionally invents directions that don’t exist.

The AI adherence problems need engineering solutions, not methodology debates. Hand-wringing about whether we’ve “returned to Waterfall” distracts from fixing them.

Wrapping Up

I'm tired of these sloppy regurgitations of the popular versions of Waterfall and Agile history. They don't serve anyone but the marketing departments of large vendors looking to sell you the next transformation. Call me pedantic if you must, but we should continually strive to be accurate with our terms. When we build arguments on a version of Waterfall from page 2 of an eleven-page paper and a version of Agile from a certification course that turned preferences into prohibitions, we're just passing around the same wrong answers and wondering why we keep asking the same questions.

Spec-driven development is what we’ve always done, only this time with an AI reading the spec. The interesting questions are about time scale, AI reliability, and whether the tooling can mature fast enough to deliver on the promise.

Have you run into these same misconceptions on your teams? Let me know in the comments.