Vibe Coding and the Unlocking of Software Engineers

tl;dr - I spent a week doing real work with an AI coding assistant. Software engineers aren't going anywhere, but the definition of "productive" is about to change dramatically.

The Experiment

I should admit up front that I use AI coding tools heavily at work, so I'm probably more adept than most at getting value out of them. I should also admit that the AI space right now is drowning in grift — but that's a topic for another post.

When I sat down with Claude Code — Anthropic's CLI tool — to tackle some long-overdue technical debt on a couple of side projects, I wasn't expecting to come away this impressed, let alone this inspired. I just wanted to migrate off Prismic CMS, modernize some styling, fix some mobile layout bugs, and clean up years of accumulated cruft.

What I didn't expect was how much I'd end up doing. Over the course of several sessions, the scope ballooned to include a serverless migration of a Rust backend, infrastructure fixes across multiple repos, and — for reasons I can no longer fully explain — asking an AI to check Amtrak train fares by navigating a website through a browser extension.

It kept going because the friction that normally kills side projects was largely absent.

What Actually Happened

Across two repos and about 15 pull requests, the work included:

  • A full content migration from Prismic JSON to MDX, including downloading CDN-hosted images, building custom components, and deleting thousands of lines of a custom rich text renderer. Why did I build a custom rich text renderer? Excellent question. I have no answer.
  • A theme system overhaul — 188 lines down to 51.
  • An HTTPS fix for the apex domain that had been bugging me for months — changes across two repos and three AWS services.
  • A serverless migration of my codenames game's Rust API from EC2 to Lambda Function URLs behind CloudFront. No more server to babysit.
  • CloudFront access logging added to both sites — because I was curious whether anyone actually visits them. (Someone played codenames on March 15th. Whoever you are: hello. Also, let me know how many bugs you found.)

Each of those bullets represents something I'd been meaning to do for months or years. Always worth doing, rarely started, and almost never finished. Especially when you're doing it alone, in your spare time, while also holding down a day job.

The "Unlocking" Thesis

There's a narrative in the industry right now that AI is going to replace software engineers. I'm obviously biased, but I think that narrative misses the point. What I experienced over these sessions wasn't replacement. It was unlocking.

The hard parts of software engineering have never been typing code. They're judgment calls. Should this be a Lambda or a container? Is this abstraction going to help us or haunt us in six months? Why is this CSS Grid layout overflowing on mobile when the inspector says everything fits? These are the questions that actually determine whether a project succeeds, and they require the kind of contextual understanding that comes from years of building things, breaking things, and maintaining the things that other people built and broke before you.

What AI removes is the tax between having an answer to those questions and seeing it realized.

I knew my apex domain needed a SAN on the certificate, a CloudFront alias, and a Route53 record. What I didn't want to do was spend an hour remembering the CDK API for ACM certificates, cross-referencing the SSM parameter store, and debugging the CloudFormation diff. Claude handled the mechanical translation from intent to implementation while I focused on whether the approach was right.
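For the curious, the shape of that fix in CDK looks roughly like the sketch below. This is my reconstruction with placeholder domains and construct names, not the actual stack (the real change spanned two repos), and it assumes aws-cdk-lib v2 deployed to us-east-1, since CloudFront requires its certificates there:

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as acm from 'aws-cdk-lib/aws-certificatemanager';
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';
import * as route53 from 'aws-cdk-lib/aws-route53';
import * as targets from 'aws-cdk-lib/aws-route53-targets';
import * as s3 from 'aws-cdk-lib/aws-s3';

// Hypothetical stack showing the three pieces of an apex HTTPS fix:
// a cert with the apex as a SAN, a CloudFront alias, and a Route53 record.
export class ApexHttpsStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const zone = route53.HostedZone.fromLookup(this, 'Zone', {
      domainName: 'example.com',
    });

    // Cert for www plus the apex as a subject alternative name,
    // validated via DNS records in the hosted zone.
    const cert = new acm.Certificate(this, 'Cert', {
      domainName: 'www.example.com',
      subjectAlternativeNames: ['example.com'],
      validation: acm.CertificateValidation.fromDns(zone),
    });

    const bucket = new s3.Bucket(this, 'SiteBucket');

    // The apex must appear in domainNames for CloudFront to serve it.
    const dist = new cloudfront.Distribution(this, 'Dist', {
      defaultBehavior: { origin: new origins.S3Origin(bucket) },
      domainNames: ['www.example.com', 'example.com'],
      certificate: cert,
    });

    // Alias the zone root (the apex) to the distribution.
    new route53.ARecord(this, 'ApexAlias', {
      zone,
      target: route53.RecordTarget.fromAlias(new targets.CloudFrontTarget(dist)),
    });
  }
}
```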

That's not replacement. That's leverage.

What Doesn't Change

I want to be clear about what AI didn't do well, because this is where I think a lot of the hype gets ahead of reality.

Debugging was iterative, not instant. A mobile layout overflow bug took multiple rounds to resolve. Claude did the actual investigating — inspecting computed styles through a browser extension, measuring element widths via JavaScript — and it did eventually find the root cause and fix it. But not before several wrong turns patching symptoms. It tried constraining grid columns, adding width overrides, and clamping overflow — all reasonable-sounding fixes that didn't address the real problem, which was that the CSS Grid was unnecessary in the first place. It got there, but it took iterations that I had to sit through and evaluate. And I got lucky — it just as easily could have spiraled into a Rube Goldberg machine of workarounds that masked the issue without fixing it. That's the thing about AI-assisted debugging: when it works, it's fast. When it doesn't, you're watching an intern with infinite confidence dig a very sophisticated hole.
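The width-measuring step is simple enough to sketch. This is my approximation of the idea, not the exact script Claude ran: compare each element's bounding box against the viewport and flag anything sticking out. In a browser you'd feed it `getBoundingClientRect()` results; the logic itself is just a filter.

```typescript
// An element's horizontal bounding box, as reported by
// getBoundingClientRect() in the browser.
interface Box {
  name: string;
  left: number;
  right: number;
}

// Return the names of boxes that extend past the viewport on either side.
function findOverflowing(boxes: Box[], viewportWidth: number): string[] {
  return boxes
    .filter((b) => b.right > viewportWidth || b.left < 0)
    .map((b) => b.name);
}

// In a browser console this would be driven by something like:
//   const boxes = Array.from(document.querySelectorAll('*')).map((el) => {
//     const r = el.getBoundingClientRect();
//     return { name: el.tagName, left: r.left, right: r.right };
//   });
//   findOverflowing(boxes, document.documentElement.clientWidth);
```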

Operational knowledge filled the gaps — for now. Claude didn't know that my CDK stack needed specific environment variables. It tried npm when my repos use yarn — more than once. It didn't anticipate CloudFront's log delivery delay, or that zombie Next.js processes would pile up on sequential ports. Some of these mistakes — like the package manager mix-up — are the kind of thing that better context retention will fix soon enough. Others, like knowing which env vars a stack needs or recognizing when a process is orphaned, come from having run software in production and cleaned up the messes. I suspect the gap will narrow, but it's real today.

Taste is still a human concern. When I asked Claude to replace the Skills section on my resume, it could generate prose, but the decision to replace a buzzword grid with a narrative about my career arc — and the judgment that this would better represent me to colleagues — was mine. AI can execute on a creative direction. It can't tell you which direction is right for your audience.

What Does Change

That said, how much a single motivated engineer can accomplish in a short stretch of time is about to increase dramatically. And I think the implications of this are widely misunderstood.

The laptop becomes optional. This post itself was written by prompting Claude and iterating through PR review comments — most of which I typed on my phone via Telegram while sitting on the couch. The words aren't all originally mine, but I wrote the prompts and the PR comments, so I'm claiming credit. (If you want to see the raw back-and-forth, check the PR discussion — it's all there.) I deployed infrastructure changes, checked CloudWatch metrics, and reviewed diffs without opening a laptop. A fleeting idea on a Sunday evening used to require firing up a dev environment. Now you can just pull the slab of plastic, glass, and silicon out of your pocket and start building. For people who like to build things, that's a big deal.

Side projects become viable again. Every engineer I know has a graveyard of half-finished projects. Not because the ideas were bad, but because the distance between "I know what to build" and "it's built" involves dozens of hours of mechanical work that's hard to justify when you have a family, a job, and approximately zero spare time. That distance just got a lot shorter.

The "full-stack" engineer becomes real. I'm not even sure what I am primarily anymore. An operating systems engineer? A builder of security-related things? I don't know what any of that means, exactly, and CSS has always been the part where I slow down and start Googling. Having a tool that can translate "make this not overflow on mobile" into the right combination of max-width, overflow, and box-sizing declarations — while I verify the result — means I'm less constrained by the boundaries of my deepest expertise, wherever those happen to be this year.

Infrastructure work loses its activation energy. (Full disclosure: Claude came up with this metaphor and I liked it enough to keep it. For the non-chemists: activation energy is the minimum energy needed to start a reaction. The reaction might be thermodynamically favorable — worth doing — but if the barrier to getting started is too high, it just doesn't happen.) Adding CloudFront access logging to two sites required maybe 20 lines of CDK code total. But without a tool like this, the actual work involves reading CDK docs, remembering the S3 bucket ownership requirements for CloudFront log delivery, and testing the deployment. That's an hour of context-switching for something that delivers value in minutes.
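Those 20-ish lines of CDK look roughly like this (a sketch with placeholder names, assuming aws-cdk-lib v2; the bucket ownership setting is the non-obvious part, since CloudFront standard logging needs ACLs enabled on the destination bucket):

```typescript
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';
import * as s3 from 'aws-cdk-lib/aws-s3';

// CloudFront's standard log delivery writes via ACL, so the bucket
// must allow object-writer ownership (ACLs enabled).
const logBucket = new s3.Bucket(this, 'LogBucket', {
  objectOwnership: s3.ObjectOwnership.OBJECT_WRITER,
});

new cloudfront.Distribution(this, 'Site', {
  defaultBehavior: {
    origin: new origins.S3Origin(siteBucket), // siteBucket defined elsewhere
  },
  logBucket, // providing a bucket turns logging on
  logFilePrefix: 'access-logs/',
  logIncludesCookies: false,
});
```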

Maintenance becomes approachable. The codenames project hadn't been meaningfully updated in over a year. The frontend was on old versions of Next.js and React, the backend was running on an EC2 instance I was paying for but rarely thinking about, and there was Storybook config for a project with maybe four components. This is now the second time in this post I'm confronting a past decision that made absolutely no sense, so I'm starting to see a pattern. Cleaning all of that up in a single session — upgrading dependencies, migrating to serverless, removing dead code — is the kind of housekeeping that compounds in value but never feels urgent enough to prioritize.

The 10x Engineer, Revisited

The "10x engineer" has always been a controversial concept. But if there's any truth to it, it was never about typing speed. It was about leverage — the ability to identify the highest-impact work and execute on it efficiently.

AI doesn't create 10x engineers. It lowers the floor. An engineer who already has good judgment about what to build, how to structure it, and where the risks are can now move at a pace that was previously only possible with a team. The bottleneck shifts from "can I implement this?" to "should I implement this?" — which, frankly, is where it always should have been.

But there's a tension here. Lowering the floor also lowers the barrier to entry, and that's a genuinely good thing for the world — more people building software means more problems getting solved. But I don't know what happens when an entire generation of engineers never needs to learn certain fundamentals. I've been reasonably productive over a 15+ year career and I've written exactly one line of assembly. That worked out fine — the abstractions I relied on were solid enough that I rarely needed to go deeper. But I've also felt the pain of not knowing fundamentals. My worst professional mistakes — and my worst side project decisions (see: the custom rich text renderer) — almost always trace back to not understanding something well enough and building on top of it anyway.

If AI makes it even easier to build without understanding, does that produce better engineers or more confident ones? I don't have a clean answer. What I do know is that the engineers I respect most have always been the ones who know why things work, not just how to make them work. That instinct — the compulsion to understand rather than just ship — might matter more than ever in a world where shipping is trivially easy.

I think AI is going to help a lot of people. I also think it's going to hurt a lot of people. The grift is a symptom of the genuine promise of the technology — there's real gold in the hills, which is exactly why so many people are selling fake maps. Between the hype, the misunderstanding, and frankly the greed, real damage will be done. Jobs will be displaced before new ones emerge to replace them. People will build systems they don't understand and ship them to users who'll pay the price. That's not a reason to reject the technology. But it's a reason to be honest about it.

I genuinely believe we — humanity, not just engineers — will be many times more productive. But not by the ridiculous multiples that many are pitching for the sake of acquiring other people's money, for reasons both noble and nefarious. For software engineers specifically, I don't think we're going to see fewer of us. I think we're going to see engineers who are less constrained by the mechanical parts of the job, more focused on the parts that actually require human judgment, and dramatically more productive as a result. But only if they bring the judgment. The tool doesn't supply that part.

That's not replacement. That's what I mean by unlocking.