we used to develop software differently

not too long ago, software development was a tight, manual craft. one developer, one tester. file-by-file. line-by-line. the scope of what a single person could build and own was small by necessity.

then came git. then open source exploded. suddenly you were standing on the shoulders of giants — pulling in libraries, frameworks, entire ecosystems that someone else had already built and battle-tested. a single developer could now own significantly larger slices of a system. the blast radius of one person’s output kept growing.

then stackoverflow happened. knowledge that used to live in the heads of a few senior engineers, buried in mailing lists or paid certification books, became freely searchable and openly discussed. the gap between a junior and a senior narrowed — not in judgment or intuition, but in raw information access.

then came AI. first as a chat interface. people started throwing problems at chatgpt instead of scrolling through old threads. it felt like pair programming with someone who had read every stackoverflow post ever written. slowly but surely, “let me google that” became “let me ask claude / chatgpt”.

where we are now

today, people aren’t just asking AI questions — they’re handing it the keyboard. cursor, windsurf, copilot. entire features being written by agents with a human occasionally glancing over to approve a diff. the term “vibe coding” exists now, which would have sounded absurd five years ago. you describe what you want, the AI writes it, you run it, it mostly works.

what used to take a team of five is now being done by one person and a few AI agents working in parallel. the economic pressure here is real — not just for big companies cutting costs, but for small ones finally able to build things that used to require funding and a whole engineering org.

is this a bubble?

every time something new disrupts the way we work, someone calls it a bubble. and sometimes they’re right. but the gartner hype cycle is not the same as “it’s going to zero.”

my take: this is not a bubble. not in the way the dot-com era was. the difference is that something has fundamentally and irreversibly changed in how software gets written. models aren’t going to get worse. the tooling isn’t going to regress. the productivity gains are already real and already shipped.

the hype will stabilize. some companies riding the wave purely on narrative will fall. but the underlying shift — AI as a core part of the development loop — that’s permanent.

the airline captain

here’s the analogy i keep coming back to.

think about what it meant to fly a commercial plane in the early days of aviation. you needed a large crew. there was a pilot and co-pilot, yes — but also a dedicated navigator charting the course manually, a radio operator handling communication, often a flight engineer monitoring instruments, and backup pilots on longer hauls. a flight was a coordinated effort of many specialists doing distinct, irreplaceable jobs.

then technology started eating those roles.

radio communication got standardized and simplified — the dedicated radio operator became unnecessary. navigation got computerized — the navigator’s role was absorbed by instruments the pilot could read directly. better autopilot systems meant fewer people needed in the cockpit at all times. one by one, roles that used to require dedicated humans were handled by better tools. the crew shrank.

but here’s what didn’t happen: the pilot didn’t disappear.

even today, with planes that can take off, cruise, and land on autopilot — even with autoland systems certified for zero-visibility approaches — there is still a captain in that seat. required by regulation, yes, but also by reality. because the captain isn’t just there to move control surfaces. they’re there for everything the autopilot wasn’t designed to handle. the flock of birds. the unexpected hydraulic failure. the runway incursion. the judgment call that no system could have anticipated.

fly-by-wire and vibe coding

there’s another layer to this worth digging into: fly-by-wire.

in older aircraft, when a pilot moved the yoke, they were physically connected to the control surfaces — cables, pulleys, hydraulics. you moved the yoke, the surface moved. direct mechanical cause and effect.

fly-by-wire broke that connection. now when a pilot inputs a command, they’re expressing an intent to a computer. the computer translates that intent into the precise combination of movements across dozens of surfaces, accounting for stability, load, flight envelope limits — things no human could compute in real time. the pilot says “i want to climb and turn right,” the computer figures out exactly how.

sound familiar?

this is basically what vibe coding is. you express intent — “build me a form that validates email and submits to this endpoint” — and the model figures out the implementation. you’re not writing the code, you’re piloting the system.
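to make that concrete, here's roughly the kind of thing a model might hand back for that prompt. this is a hypothetical sketch — the endpoint path, the regex, and the function names are all assumptions of mine, not the output of any specific tool:

```typescript
// sketch of what "validate email and submit to this endpoint"
// might expand into. the endpoint url and validation regex are
// illustrative assumptions, not anyone's real api.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function isValidEmail(email: string): boolean {
  // trims whitespace, then requires something@something.tld
  return EMAIL_RE.test(email.trim());
}

async function submitEmail(email: string): Promise<Response> {
  if (!isValidEmail(email)) {
    throw new Error("invalid email address");
  }
  // hypothetical endpoint — whatever "this endpoint" was in the prompt
  return fetch("/api/subscribe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email }),
  });
}
```

the point isn't the code itself — it's that you described the outcome and never touched the implementation.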

the thing is, pilots of fly-by-wire aircraft still deeply understand aerodynamics. they still know what the plane is doing and why. they can recognize when the system is doing something unexpected. they can intervene. they have the mental model even if they’re not manually executing every step.

that’s the bar for software engineers going forward.

what this means for us

the roles that are purely mechanical — writing boilerplate, translating a spec into obvious code, copying patterns from one file to another — those are going the way of the radio operator. not immediately, not all at once, but directionally, yes.

what doesn’t go away is the person who understands the system. who can debug the thing the AI wrote but doesn’t understand. who can recognize when the output looks right but is subtly wrong. who can make architectural decisions that require context no prompt can fully capture. who can be accountable when production breaks at 2am.

the captain still has to know how to fly the plane.

i’ll admit there’s a version of the future where that bar keeps getting higher — where what we call “understanding the system” today becomes what “moving the control surfaces” was in 1950. a mechanical task, eventually automated. we started by saying AI could write small functions but not full features. then it could write full features but not full applications. now there are demos of full applications being built autonomously. it’s not unreasonable to keep asking: what’s next?

i don’t have a clean answer to that. but i think the honest thing is to hold both truths at once — the near-term reality, where humans with strong fundamentals are still very much needed, and the longer-term uncertainty, where we should probably stay curious and stay adaptable rather than getting too comfortable.

the captain is still flying. but the cockpit keeps changing.