Reverse Gravity

Automation, But Make It Exciting

With the news about Block laying off half of its workforce “because of AI,” I was reminded of something from David Graeber’s book Bullshit Jobs.

In Bullshit Jobs, Graeber recounts stories from efficiency consultants who would go into companies and try to figure out what people were actually doing all day. They kept running into the same conclusion: a surprising amount of the work could be replaced with something about as complicated as a shell script.

Not a neural network.
Not a trillion-parameter model.
Just… a shell script.

And what’s interesting is that this almost never happened.

Companies weren’t racing to replace people with cheap, reliable automation. In many cases they did the opposite. Managers liked having large teams. Headcount was status. The number of people reporting to you was part of how importance was measured inside the organization.

So the barrier to automation wasn’t really technical.

It was organizational.


The Automation That Never Happened

Graeber’s book came out almost a decade ago, well before the current LLM hype cycle. Which makes the present moment a little strange.

For years, companies were perfectly happy not replacing work with simple, deterministic tools that were inexpensive, transparent, and easy to reason about.

But now a very expensive probabilistic system shows up, essentially a gilded Magic 8-Ball running on billions of dollars of infrastructure, and suddenly this is the thing that’s going to justify firing everyone.

There’s also a fairly obvious engineering sanity check here.

If you were actually trying to automate work inside a company, the first thing you would try wouldn’t be a trillion-parameter probabilistic model running in a datacenter somewhere.

You’d start with the simplest deterministic thing that could possibly work.

  • A script
  • A cron job
  • A small internal tool

Something cheap.
Something predictable.
Something you can debug.

That’s just basic engineering practice: solve the problem with the simplest system that actually works.

Companies have had the ability to do that for decades.

And in many cases, they didn’t.

Which is… interesting.


Talent Hoarding

There’s another angle here that people in tech tend to pretend is mysterious: overhiring.

In hot labor markets, some companies don’t just hire because they have work for people. They hire because they can. They hire because it denies competitors access to the same talent pool. They hire because having a “bench” feels like a strategic advantage, even if nobody can articulate what the bench is supposed to do on Monday morning.

Headcount becomes a defensive resource.

And when the cycle turns, those same companies suddenly discover “efficiency.”

Conveniently.


Deterministic vs Probabilistic

There’s also a technical wrinkle here.

A shell script is deterministic. If it works once, it works every time. If it breaks, you can read the script and see exactly why.

It’s understandable.
It’s debuggable.

The AI version is probabilistic.

Sometimes it works.
Sometimes it hallucinates APIs that don’t exist.
Sometimes it invents facts.
Occasionally it deletes the wrong table in a database.

Now to be clear, this isn’t an argument that these systems are useless. Obviously they can do useful things.

But usefulness isn’t the claim being made here.

The claim is that they suddenly justify replacing entire departments.


The Economics Are Also Strange

And the cost difference isn’t small.

A shell script runs on the same machine that’s already doing the work. It costs essentially nothing.

The AI version requires datacenters, specialized hardware, enormous energy consumption, and entire companies worth of infrastructure engineering just to keep the system online.

So in the one scenario where automation was cheap and obvious, nobody bothered.

But now that automation requires industrial-scale compute infrastructure, suddenly the economics are supposed to make sense.

Which raises a question.

If companies didn’t replace people with shell scripts when they easily could have…

Why exactly are we pretending the stochastic text generator is the thing that’s finally going to do it?


The Narrative

One possible answer is that the technology itself isn’t the interesting part.

The narrative is.

“We replaced people with shell scripts.”

That doesn’t sound visionary.

“We’re restructuring around AI.”

That does.

And that might explain why stories like the recent layoffs at Block get framed the way they do.

Because historically, companies didn’t avoid automation when it was hard.

They avoided it when it was boring.


The shell scripts were cheaper.
The shell scripts were simpler.
The shell scripts were more reliable.

They just didn’t have a hype cycle.

Building a Blog to Start a Blog

I needed to start a blog. So I wrote a static site generator yesterday afternoon.

The existing tools all have this fundamental problem where they're trying to be everything to everyone. Hugo has 100,000 lines of Go code. Jekyll needs an entire Ruby environment. Gatsby wants you to write REACT COMPONENTS just to generate static HTML.

It is absolute madness. We are bringing the weight of an entire distributed application ecosystem to bear on a problem that can be solved by a couple of for loops.

And don't even get me started on the alternative: WordPress.

The Caching Lie

If you aren't using a static site generator, you are probably using a CMS like WordPress. This stack is built on a pile of contradictions. You have a server running PHP (a dynamic interpreted language), spinning up processes, connecting to a SQL database, and executing thousands of lines of code on every single request just to serve a page that hasn't changed in three years.

Because this is famously slow, the "industry standard" solution is caching. You install a plugin that takes that dynamically generated HTML and saves it to a static file on the disk.

Think about that. You are running a massive, bloated dynamic engine to generate a static file, just so you can serve the static file. You are effectively running a slow, buggy compiler on every single web request.

Why are we doing this? If the end result is a static file, why don't we just generate the static file once, offline, and be done with it?

The Handmade Solution

Rather than submit to this madness, I spent a few hours writing a custom generator in C. It has zero external dependencies, it compiles instantly, and it builds the entire site in the time it takes you to blink.

Here is how it works, and why it's better than the "industry standard."

The Memory Strategy That Isn't

The first thing I did—and this set the tone for the whole project—was grab 100MB of memory at startup:

Arena arena = arena_create(100 * 1024 * 1024);

Every allocation in the program comes from this arena. No malloc, no free, no memory management anywhere. The program runs for 40ms and exits. The OS reclaims the memory. Done.

This decision made everything else trivial. Functions can return strings without worrying about ownership. The markdown parser can allocate temporary buffers without cleanup. The template engine can concatenate strings without thinking about it.
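In case it isn't obvious how little code an arena takes, here is a minimal sketch of the idea. `arena_create` matches the call above; `arena_alloc` and the struct layout are my assumptions, not the author's actual code:

```c
#include <stdlib.h>
#include <stddef.h>

// Hypothetical arena layout -- one big block, one bump pointer.
typedef struct {
    char  *base;
    size_t used;
    size_t cap;
} Arena;

Arena arena_create(size_t cap) {
    // One malloc for the lifetime of the program; the OS cleans up at exit.
    Arena a = { malloc(cap), 0, cap };
    return a;
}

void *arena_alloc(Arena *a, size_t n) {
    // Bump the pointer; no per-allocation bookkeeping, no free().
    if (a->used + n > a->cap) return NULL;  // out of space: fail loudly
    void *p = a->base + a->used;
    a->used += n;
    return p;
}
```

A real version would round `n` up for alignment, but the whole trick fits in a dozen lines either way.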

The Markdown "Parser" That Isn't

The markdown implementation is 250 lines of the simplest code you've ever seen. It's literally a single pass through the text looking for patterns.

See **? Toggle bold. See a backtick? Toggle code. See a line starting with #? Count the hashes, emit a header. No AST, no parse tree, no visitor pattern. Just a for loop and some if statements.

Will it handle every edge case in the CommonMark spec? No. Do I care? Also no. It handles my markdown, which is the only markdown it needs to handle.

Compare this to WordPress's Gutenberg editor. They store content as JSON-serialized blocks in the database. JSON! In the database! For blog posts! They took Markdown, a format explicitly designed to be human-readable, and replaced it with nested JSON structures. This is what happens when you have 500 developers trying to justify their salaries.

Templates Without the Programming Language

The template engine is 50 lines of code that replaces {{VARIABLE}} with values. That's it. No expressions, no filters, no inheritance, no blocks. Just string substitution.

If I need logic, I write it in C:

// Generate pagination - in actual code, not a template DSL
char output_file[64];
if (page == 1) {
    strcpy(output_file, "index.html");
} else {
    snprintf(output_file, sizeof output_file, "page_%d.html", page);
}

Every template language eventually grows into a bad programming language. So I skipped that evolution and just used C from the start.

Static Files: A Technology So Advanced We've Forgotten How It Works

Here's what happens when someone visits my blog:

  1. Nginx sends a file.

Here's what happens with WordPress:

  1. Apache receives request.
  2. Loads PHP interpreter.
  3. PHP loads thousands of WordPress files.
  4. WordPress connects to MySQL.
  5. Queries database.
  6. Loads theme.
  7. Loads plugins.
  8. Processes shortcodes.
  9. Renders templates.
  10. Returns the same HTML as last time.

My site could get posted on the front page of every news aggregator simultaneously and not break a sweat. You know why? Because CDNs have spent 20 years learning how to serve static files. It's a solved problem.

Meanwhile, WordPress sites need special "high-performance" hosting that's just regular hosting with more caching layers. You're literally paying extra for the privilege of working around the software's architecture.
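And "Nginx sends a file" really is the whole configuration story on the static side. A generic sketch — the domain and paths are placeholders, not my actual setup:

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/blog;        # the directory of generated HTML
    index index.html;

    location / {
        # Serve the file if it exists; nothing dynamic ever runs.
        try_files $uri $uri/ =404;
    }
}
```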

Why This Matters

40% of the web runs on WordPress. Think about the aggregate CPU cycles wasted querying databases for static content. Think about the millions of dollars spent on caching plugins and CDNs to work around the fundamental architecture. Think about the human hours lost to plugin updates, security patches, and debugging why the cache won't clear.

All to serve text on the internet. Text that doesn't change.

I wrote this generator in an afternoon. It's 1,000 lines of C. It builds my entire site in 40ms. The output is portable HTML that will work on any web server forever, requires zero maintenance, and has no attack surface.

This isn't minimalism for its own sake. It's recognition that most "features" are actually liabilities, most "abstractions" are actually complications, and most problems are actually simple if you stop trying to solve everybody else's problem at the same time.

The best code is the code you don't write. The best features are the ones you don't build. And sometimes, the best solution to starting a blog is to spend an afternoon writing exactly the tool you need, and nothing more.