We mourn our craft

I didn’t ask for this and neither did you.

I didn’t ask for a robot to consume every blog post and piece of code I ever wrote and parrot it back so that some hack could make money off of it.

I didn’t ask for the role of a programmer to be reduced to that of a glorified TSA agent, reviewing code to make sure the AI didn’t smuggle something dangerous into production.

And yet here we are. The worst fact about these tools is that they work. They can write code better than you or I can, and if you don’t believe me, wait six months.

You could abstain out of moral principle. And that’s fine, especially if you’re at the tail end of your career. And if you’re at the beginning of your career, you don’t need me to explain any of this to you, because you already use Warp and Cursor and Claude, with ChatGPT as your therapist and pair programmer and maybe even your lover. This post is for the 40-somethings in my audience who don’t realize this fact yet.

So as a senior, you could abstain. But then your junior colleagues will eventually code circles around you, because they’re wearing bazooka-powered jetpacks and you’re still riding around on a fixie bike. Eventually your boss will start asking why you’re getting paid twice your zoomer colleagues’ salary to produce a tenth of the code.

Ultimately if you have a mortgage and a car payment and a family you love, you’re going to make your decision. It’s maybe not the decision that your younger, more idealistic self would want you to make, but it does keep your car and your house and your family safe inside it.

Someday years from now we will look back on the era when we were the last generation to code by hand. We’ll laugh and explain to our grandkids how silly it was that we typed out JavaScript syntax with our fingers. But secretly we’ll miss it.

We’ll miss the feeling of holding code in our hands and molding it like clay in the caress of a master sculptor. We’ll miss the sleepless wrangling of some odd bug that eventually relents to the debugger at 2 AM. We’ll miss creating something we feel proud of, something true and right and good. We’ll miss the satisfaction of the artist’s signature at the bottom of the oil painting, the GitHub repo saying “I made this.”

I don’t celebrate the new world, but I also don’t resist it. The sun rises, the sun sets, I orbit helplessly around it, and my protests can’t stop it. It doesn’t care; it continues its arc across the sky regardless, moving but unmoved.

If you would like to grieve, I invite you to grieve with me. We are the last of our kind, and those who follow us won’t understand our sorrow. Our craft, as we have practiced it, will end up like some blacksmith’s tool in an archeological dig, a curio for future generations. It cannot be helped, it is the nature of all things to pass to dust, and yet still we can mourn. Now is the time to mourn the passing of our craft.

52 responses to this post.

  1. Michael A Breeden

    Well, there is still the problem that the code must be reviewed by a senior developer. It just needs to be, and probably always will be. So the question is how people get to be senior developers if the AIs can do all the entry-level stuff just fine.

    Really, it is far more than that. Our capitalist economy is dependent on labor. The economic loop: labor → wages → income → consumption → revenue for the company … and back to labor. All parts are needed for the economy, a system that was built. Capital will always want to replace labor, to cut costs, but that will break the economy. Capitalism self-destructs. Survival is not built into corporations, mostly short-term profit and competitive advantage. Expect: acceptance, then panic, then re-branding (“cognitive workers are essential”), then protection (bailouts and regulation) when automation becomes a threat to order by embarrassing the cognition workers (and company officers). When recommendations from AI outperform CEOs, lawyers, etc., then who gets bailed out?

    I’m pretty far along at solving this problem, but it will take a little longer. It is a far larger problem than it looks. The real problem, ultimately, will probably be overproduction.

    • Nolan Lawson

      “Well, there is still the problem that the code must be reviewed by a senior developer.”

      I used to think this is true, but I’ve come to believe the next step is to have AI code reviewers. (And refactorers, and security auditors, and performance benchmarkers, etc.) I cover this a bit in AI tribalism. This is the future that projects like Ralph and Gas Town are pointing toward.

      If you’re right and we’re staring down the barrel of a civilizational collapse, then I don’t necessarily disagree with you, but I don’t think it changes my conclusion much. If we’re due for a reshuffling of job titles then I have no idea what comes next, but it probably looks very different from the past.

      • Michael A Breeden

        Who said civilizational collapse? The last time something like this happened was the invention of the printing press. It made a mess, but it did not cause civilization to collapse. You either need to fix capitalism or replace it. Both things can be done, but really you need to understand what capitalism is. Capitalism, meaning using capital as a tool of production, is needed in all systems, from theocracies to monarchies to democracies to socialism. The thing is that “capitalism” has been given another meaning: ownership. It is the reason for the term “ownership class.” It is easy to see the problem with that, because then capital is used as a means of control, as a bludgeon or to buy the law. The end result is wealth concentration like we see today. Automation will just make that more true. Wealth concentration will make that more true. Then, when enough of the wealth is controlled by the ownership class that families are unaffordable, the system changes, either by adjustment of law and social contract or the pitchforks come out. Many of the zillionaire class have been saying for years that taxes have to be higher.

        The thing is that the ownership class isn’t so much the problem; many of them create wealth or guide it to production by capital investment. Unfortunately, part of the ownership class is the “control class”: the ones that want to use wealth as a method of control are exhibiting a dominance instinct, which has no off switch. They are the ones that have bought the law and fight for control of the working classes. They dominate wealth to do it, and they are the reason greed can be such a danger: not because of desire for wealth, but desire for control. Wealth is how they gain control. Somehow their control must be removed. It is best if it is done by law, that is, taxation, because that preserves order. Nations, businesses, and families thrive in order and investment.

        Historically, though, what has happened is that people do not have the “wealth,” the resources, to survive, due to capital accumulation, and the pitchforks come out: bad for everyone. That control class will call any attempt to limit their control “communism.”

  2. George Dorn

    Posted by George Dorn on February 7, 2026 at 9:37 AM

    “This is inevitable” is industry propaganda, though. This is far from inevitable.

    First, it is entirely likely that the better models simply cannot be economically viable without a technological breakthrough reducing energy use and hardware costs, and when the giant pool of free investment money finally runs out and subsidies are ended, that harsh truth will crush the coding agent market.

    Second, we are generating tech debt at a rate faster than any time in history. Back in 2007, Greg Jorgensen wrote “Introduction to Abject-Oriented Programming.” It was meant as satire at the time, but read that and tell me it doesn’t perfectly describe how Claude writes code. We’re fighting a losing battle trying to keep code quality high; there was already pressure to compromise (thus the celebration of “10x devs”) and most orgs are going to be willing to make that compromise if it lets them ship software on time for once. What happens when that tech debt finally catches up?

    Third, regulation is still not impossible. Hard to imagine in the current political situation, but when the pendulum swings back and everybody’s electricity bill has doubled, the political will to put an end to the industry’s externalities might materialize.

    And all of that is assuming this investment bubble, 17x that of the dotcom era, somehow doesn’t pop. Who’s going to pay to run the models if it does?

    I say all this as someone who has used Anthropic’s models daily for the last six months, both personally and professionally, who works at a YC-backed startup that was funded specifically because of the AI pitch deck. I’ve hand-written maybe a hundredth of the code I’ve committed in that time. I see the potential in the tools, but the industry is nowhere near ready to handle the level of deployment we have now, and neither are the economy, the power grid, or the climate. My only hope is that in the aftermath of the upcoming crises, we take a long hard look at the dangers and start focusing on mitigation, not expansion.

    • Nolan Lawson

      I agree with a lot of what you said. Claude currently produces a lot of duplicate, verbose code, which is ironic because this is exactly what makes life hard for the next agent. But as I mentioned in another comment, I think this can be solved by additional steps of AI code reviewers, AI bug hunters, AI tech-debt-payer-downers, etc. I’ve already started using this technique in my work, and it shocks me how throwing more AI at the problem really does solve it. This is why Gas Town and Ralph scare me so much – if you’re trying to one-shot or one-conversation everything, you’re going to produce junk, but that doesn’t mean it’s impossible not to produce junk.

      When the tech debt catches up, it’s definitely going to be interesting! See my essay The collapse of complex software. The smart companies won’t wait for this to happen, but I do think many will.

      Regulation is maybe the only thing that can stop this future; you’re right. I do think though that the energy costs will go down over time, since there’s every incentive to build more efficient models and hardware (we’re still reusing chips meant for gamers, recall). I have a coworker who uses Kimi K2.5 which can run on your own hardware if you want, although he uses a service since it’s faster. I expect though that we’ll run into a Jevons Paradox where even if the energy costs go down, we’ll just use more of it, so who knows how high the energy bill will go.

  3. Wim

    I’m happy that I’m no longer in the game. I can enjoy my level of knowledge and nostalgia.

  4. Justin Van Winkle

    Posted by Justin Van Winkle on February 7, 2026 at 11:31 AM

    “They can write code better than you or I can” – no they can’t. They can code fast, and maybe they can code better than you, but they haven’t passed me quite yet. Good luck to you.

  5. Yann

    “I don’t celebrate the new world, but I also don’t resist it.”

    This part really saddens me the most…

  6. findingscheme

    Posted by findingscheme on February 7, 2026 at 11:54 AM

    LLMs are not better at programming than me. If they were, I wouldn’t be inspecting their output with a fine-toothed comb, I’d be rubber stamping it. I wouldn’t be constantly amending Markdown files telling them not to do things no experienced programmer would do. If you are mourning this craft, why are you choosing to play into a false narrative crafted by billionaires trying to secure an exit before the bubble bursts?

    • Nolan Lawson

      I hear you, but please read my other replies in these comments.

      • Roamer

        Posted by Roamer on February 7, 2026 at 1:20 PM

        if you’re suggesting the answer to crappy AI code is more AI reviewers then maybe you’re part of what’s making it inevitable. people who are willing to accept more bad food than less good food. people who judge code by the kilo.

  7. Morty

    Posted by Morty on February 7, 2026 at 12:46 PM

    These LLMs we see nowadays have lots of flaws (e.g., lack of memory), and they couldn’t be fixed totally even in 10 years unless someone figures out a new algorithm, a new primitive solution. Indeed, it would take years for AI researchers and engineers to practically invent a brand-new AI model that could really beat humans at any intellectual skill. That would be the AI that we should be scared of.

    And yes, perhaps lots of us are going to have a bad ending.

    • George Dorn

      Posted by George Dorn on February 7, 2026 at 1:55 PM

      Yeah, it’s easy to underestimate just how much the LLM jump was completely unexpected, unplanned for, and, critically, unrepeatable. That first jump was huge, even on consumer-grade hardware, compared to what was before. But every time we 10x the size of the model (or the number of co-working models), we don’t get anywhere near 10x improvement. So the tech sector is spending obscene amounts of money for each modest increment, and the diminishing returns are already underwater. And that’s just training, nevermind the cost of hardware and execution…

      There are some interesting developments around the edges, but the big companies are aiming mostly at more data and larger models, and training for benchmarks to encourage investment, and that way is a dead end.

  8. Adam Stankiewicz

    Posted by Adam Stankiewicz on February 7, 2026 at 12:56 PM

    We’re none and all of these now, as a collective, but I hear you. I am alive and experiencing the very things you say. Heartbreaking.

  9. nathan7

    Posted by nathan7 on February 7, 2026 at 2:13 PM

    i want to puke on your face after i’ve read “They can write code better than you or I can, and if you don’t believe me, wait six months.”

  10. I

    Yeah, look, I sympathise with the existential panic – I feel it too – but… we’re just not there yet. In fact I don’t think we’re ever going to be. The ‘synthetic ceiling’ is real.

  11. Dahmonium

    Posted by Dahmonium on February 7, 2026 at 2:49 PM

    man, i am feeling this and mourn with you.

    almost 47, and as a developer i’ve seen a lot that came and went. this thing is different and too big to ignore or fight against. true, if you can’t beat ’em, join ’em. but the mourning is not only for my soon-to-be-archaic craft. i also mourn for us as humans. Our crafts are the first victims of this new apex predator.

    • Nolan Lawson

      Yep, I remember when ChatGPT first came out, and I was talking with some coworkers in a bar about it, and I basically said that I had worked in natural language processing, and I had seen all the silly Markov models and next-word predictors, and I admitted this thing felt different, but it must just be the same thing. I felt uneasy about it, though. In retrospect I might have just been coping.

  12. blacklemon67

    Posted by blacklemon67 on February 7, 2026 at 3:12 PM

    Hey Nolan, I want to put this out there for you:

    I still use Pinafore as my main fediverse client. Not Semaphore, not Enaphore, and not any other fork that was made since. Admittedly, part of why I still use it is because I don’t want to go through the admittedly easy process of logging into my accounts again. But the other reason is sentimentality.

    Pinafore, in my opinion, is the best fediverse client. I think that because it is the product of being very mindful about what should or shouldn’t be used in the code, what features should or shouldn’t be implemented, and even just the simple fact that the person who worked on it cared. You showed that you cared by maintaining it. By adding accessibility features and performance improvements. I think you showed care through your labor by making Pinafore run well on KaiOS—I still remember that, I think that was cool as hell.

    I still remember the first of just two small commits that I contributed to pinafore. This was to allow you to switch instances quicker by adding a “star” button to each item in the instance list. You took the time to review it, even fixing my code after merging, and made sure I was credited in the release notes. I’ve contributed to a number of FOSS projects, usually drive-by commits, and very few of them have been as welcoming as how you’d managed pinafore. That sticks with a person.

    Sure, maybe in six months I could vibe-code my own pinafore. Maybe I could do that now. But I would never have met you, and you would never have met me. The craft gave us the opportunity to work together, however briefly, on something shared. I think that matters.

    I got to see what kind of developer I want to be in you. I want to value performance and accessibility. I want my open source projects to be welcoming and enriching to whoever contributes. I want to learn from others what their wants and needs are for software, and to work with them to make that shared vision real.

    I just don’t see how we’ll ever be able to achieve this if all we’re doing is toiling, quietly and alone, in front of an LLM’s text prompt. I don’t know how we can ever learn what is valuable, what we should even type into that text prompt, if we haven’t already done some of that menial labour ourselves. I don’t know how we could ever learn what matters if we don’t talk to each other, work with each other, raise each other up, etc., etc.

    All I really want to say to you is this: don’t lose sight of what code actually is. It is a model of reality. It’s a representation of our wants and needs. You can’t make a model of reality if you don’t know what reality is, and we only have two eyes and two ears, and there’s much more to reality than we can ever observe or experience ourselves. We have to learn from each other, learn how to listen, learn how to distill all those wants down into code, into those sets of instructions for the silicon to run.

    That is the craft. And you can’t do it alone.

    Note: this is my second time trying to make this comment. Hopefully it doesn’t get eaten by wordpress again. Also, hopefully, both comments don’t appear side-by-side making me look like a double-posting dink.

    • Nolan Lawson

      Thank you for this comment; I really hope you turn it into your own blog post or something because it is really beautifully expressed. For what it’s worth, my first experience with (collaborative) open source was PouchDB and Dale Harvey, and I’ll never forget how he welcomed anyone in the IRC server who wanted to ask questions or get advice on how to write a good pull request. I’ve tried to model all my open source projects on that ever since (with varying degrees of success), so I’m really happy to hear that you had a good experience.

      Pinafore was an experiment at building the perfect web application – no compromises on performance, accessibility, security, etc. This is part of the reason that I didn’t want to hand it off to someone else – I was afraid that they would compromise on any one of these things and ruin my vision. Of course this is the exact opposite of how most software is written today (especially in the AI age): everything is disposable, “good enough,” etc. I think we will see more of this, but I also see the potential of someone building a Pinafore by siccing enough AI agents on a small enough codebase, giving them a benchmark harness, etc.

      That said, the human element is certainly lost. Maybe eventually we will have armies of AI agents summoned by Mickey Mouse sorcerer’s-apprentice types who learn to collaborate with each other as we have. I don’t know. I do know that I feel that something has been lost, and that’s why I wrote this post.

      And BTW you did not double-comment! No worries.

  13. Denizcan Billor

    Posted by Denizcan Billor on February 7, 2026 at 10:13 PM

    Just remember you are not the first, and you won’t be the last, to feel their craft disappear.

    remember what it means to be an engineer:

    “An engineer is someone who turns understanding of reality into reliable, repeatable outcomes that make the world work better”

    i tried to write some of my thoughts because i see people starting to react.

    This is the best time to be a builder, and the types of things you can build… we’re just starting to see the possibilities.

    https://www.linkedin.com/feed/update/urn:li:activity:7426078758291435520/

  14. Robert Marecek

    Posted by Robert Marecek on February 8, 2026 at 1:22 AM

    My hope is:

    “The machine will be able to do everything; it will be able to solve all the problems it is given, but it will never be able to invent a problem.”

    (allegedly Einstein)

  15. Derek Frederick

    Posted by Derek Frederick on February 8, 2026 at 10:56 AM

    I fully agree, but will embracing these tools even be enough to preserve our car payments, mortgages, and family security? Will copying & pasting feature requests from product be enough to warrant a 6-figure salary or even any salary at all? After a year or so of “no known incidents”, code TSA agents may no longer be seen as a necessary expense. I mourn for our craft, but I mourn even harder for my family’s future.

  16. J P Hankinson

    Posted by J P Hankinson on February 8, 2026 at 12:34 PM

    One solution is to use the exact same approach employed by the likes of Nvidia/AMD/Intel – create ecosystems that are so proprietary and locked down with no public reference that AIs can’t help as they have no context and nothing to train on. Obviously this assumes you control said environment and ensure that none of your proprietary IP leaks and that you can create a viable business model around that technology to keep yourself employed or paid.

    I’d also say that the mileage you get from AI really depends on the type of development you do. If I’m building out APIs and web stuff using common tooling like Go/TypeScript/Python etc., then yes – I can generate at least 80+% of the code exclusively in AI and just spend my effort ensuring overall architecture and consistency. At the end of the day, though, writing that sort of code, although it’s what 90% of the industry does, is really the bottom of the cognitive barrel – you’re really just an API-glue mechanical turk – and if AI kills that, so be it.

    I bet Claude/ChatGPT and co. wouldn’t be able to help you one bit with writing code for some defense-only, highly-NDA’d type of system – and for languages it has had little exposure to, it can be downright useless.

    I still don’t quite get how people manage to vibe-code whole systems with it, unless they’re so trivially non-complex, as I find it routinely makes simple logic mistakes – even with the latest 5.3 or Opus models. I had one today: it decided that a 32-bit dword value needed to be ANDed with bit(3) and promptly informed me that it should be someValue &= 3 (or the ASM equivalent, and eax, 3)… clearly bit(3) = 8.

  17. fpereiro

    Posted by fpereiro on February 8, 2026 at 1:51 PM

    Thank you for writing this. I am also mourning the old world, which seems to be slipping away on a timescale of months.

    At the same time, it is also the birth of something new. I’m even a bit excited about it, which is weird, because I’m mourning. But I will always look back fondly on what we’re now leaving behind. I guess it’s now our generation’s turn to know how that feels.

    • Nolan Lawson

      Yes, I agree: a new craft is forming, which will probably share a lot of similarities with the old craft. I know a lot of people who are excited about that and have little sentimentalism for the old ways, but that’s just not me. I’m glad to know I’m not the only sentimental one. 🙂

  18. Lens_r

    I am 23, the author of the Lensor Compiler Collection.

    I hate AI in programming. I program for fun, as a hobby, and a robot will never replace the relationship I have with my projects.

    It’s not all oldsters!!

  19. fantajeon

    Posted by fantajeon on February 8, 2026 at 5:28 PM

    vibe-coding → vibe-working → vibe-crafting

    From writing code for work, to working in flow, to crafting for the pure joy of it on weekends — when coding stops being a task and becomes an art.

  20. Akop

    Posted by optimisticinquisitivelyf76c131c79 on February 8, 2026 at 10:29 PM

    I’m still not convinced that LLMs actually boost productivity; in reality, I’m seeing more and more that they just provide an illusion of productivity. I’m not delivering any faster, and my colleagues are feeling the same pain. It’s impossible to keep up with the cognitive load an LLM dumps on you after every prompt. I’ve tried everything: sub-agents, skills, OpenSpec, code reviewers (copilot and cursor bugbot: those ones are actually helpful sometimes). But none of it solves the core issue: when you receive the output, you’re forced to work top-down. You lose the ability to truly grasp the essence of the code, because you only really understand something when you build it bottom-up.

    Slop code ends up in prod, only to be rewritten manually later (even if nobody, including you, caught it during review), but we’re the ones held responsible for it. It feels like a rigged game: you’re forced to stop doing your job to manage some drunk alien who hands you work that you have to review and vouch for. Sure, juniors who have no idea what they’re doing will be ‘faster.’ But we’ve seen this before, like the hiring booms for cheap labor where devs cranking out tons of code for $500/month were considered ‘productive,’ even though the code was unmaintainable. Same thing here. At work, I’m constantly trying to rein it in, literally holding Opus 4.6 by the hand, because the moment you lose focus, disaster strikes.

    Out of curiosity, I tried to >FULLY< ‘vibe-code’ a pet project, prioritizing speed over control. It works, but the code is a nightmare. Even with a predefined architecture, the LLM ignored every instruction: DB calls in controllers, insane code duplication, side effects, and blurred responsibilities. But hey, 10k lines of code a day!

    I haven’t lost hope that this hype will die down, the circus will leave town, and we can finally get back to actual work.

    • Nolan Lawson

      I lost hope that “the circus would leave town” (great metaphor BTW) when I saw Ralph and Gas Town and realized you could just plug more of these things into loops and solve such problems. I tried it with code correctness, DRYness, security, accessibility, performance, etc., and it worked every time. (“Look for bugs,” “run benchmarks and make it faster,” etc.) The people having a really bad time with LLMs are the ones just building features and not putting their PRs through multiple layers of agent review (yes multiple, just give it the same instructions over and over). They’re like a software company that only focuses on features and never tech debt, with predictable results. My AI tribalism post touches on this.

      Of course you’re right that you still need true code understanding even with all of the above, and maybe there’s a case to be made that you have to get your hands dirty at first just to understand the codebase, but… I dunno, I feel very much like John Henry losing to the machine at this point.

      • bebraw

        Posted by bebraw on February 9, 2026 at 12:16 AM

        I think one of the interesting questions is what do future senior developers look like now that the focus is shifting. What kind of things should we teach for future juniors?

        My personal feeling is that you still have to understand what’s happening underneath at least to some degree (classic wisdom from Spolsky states that you should understand one level below the abstraction you are working with) so there will still be value in being able to evaluate code.

        That said, when work moves more from authoring to reviewing/curating, likely we’ll put more value on tools and technologies enabling more effective curation. It wouldn’t surprise me if programming languages that put authoring before understandability would fall out of favor for this reason. Languages with strong verifiability properties will likely grow in popularity, however.

        I think the big shift has to do with the fact that we are moving from software development to software engineering and that can be a tough shift for many.

  21. Aaron Pedersen

    Posted by Aaron Pedersen on February 9, 2026 at 1:40 AM

    Having only worked in the industry for 2 years before genAI became the go-to, I feel a bit hopeless that I will never have the opportunity to make the mistakes and do the reps to get truly good at software engineering. The pressure is high to just ship features ASAP.

    Same with learning a new language. The language has been abstracted over by just prompting and making design specs.

    I would love to hear from other devs how we can go forward mastering our craft without being left behind.

    • Nolan Lawson

      I would tell you not to lose hope. I work with a junior with less experience than you, and she is more adept with AI tools than I am – you might say “AI-native.” I watch her code, and it’s utterly alien to me how AI pervades the terminal, the IDE, the browser. She skipped a lot of the struggles that I had, sure, but 90% of those struggles might not be relevant anymore. You can “get the reps” on the skills that actually matter now (systems thinking, architecture, design, taste), and you have a hyper-intelligent buddy to guide you along the way. Keep on keeping on!

  22. Unknown

    […] under two rules: humans must not write code, and humans must not read code. Nolan Lawson published “We Mourn Our Craft,” an elegy for programming as a human […]


  23. […] Nolan Lawson (Hacker News, Mastodon): […]

  24. Joshuah Rainstar

    Posted by Joshuah Rainstar on February 9, 2026 at 9:39 AM

    I grew up in the 90s and early 2000s and went to college when they still taught cobol and c++ was a horror show and python was still in version 2, and numpy and scipy and pytorch were just babies.

    While you may be nolstalgic for real code and real coders, from my perspective( and I know i’m smarter than you because my head is physically bigger than yours), programming was, and is, alien to human nature. Humans basically understand a logical sequence of tasks and decisions, and translating that into the “how” is an art, not a science. Every programming language we ever made is full of quirks that our quirky little brain obfuscates but which cost productivity and legibility. The best languages we have are those that do what they do predictably and consistently for their application and are coincidentally and not consequentially widely adopted.

    The nature of the programmer is as important as that of the programming language itself. Disciplined behaviors and rigid adherence to design principles cover a multitude of sins.

    What the nerd/geek world does not want to admit is that it is a temple, an occultic tradition of consulting oracles and arranging bones, wizardry as it were, and thus it is not strictly limited by IQ. So, you have myriad midwits coding in C and thus you have a lot of the bugs or vulnerabilities of modern computer operating systems and the software that runs on them, and you have visionary frameworks like the earlier dotnet and asp.net and anything else from that era that has now been fully obseleted yet was concocted entirely without the intervention of AI.In light of that, AI does not seem too bad. When it can rewrite our entire world in a spark-ada themed strictly typed rust-like language that’s devoid of terenary operators and requires both indentation and encapsulation for syntax flow, where legibility communicates correctness, where all important calls are themselves well-documented and well-labeled apis in library systems sweetly organized for human consumption, then we will actually be far better coders than we are even today.We will tell the machine in SenseTalk what we want it to do, and it will render us in Assembly the optimal instructions to do it.

    Your brain needs to hold the whole of what you’re tuning and working with in your mind to design it. Can most people do this? No. So they abandon complex systems, because they must allocate mental budget to trivial details and because they think in terms of the hardware platform, and then their code is replaced with something a million times less efficient, written by an actual child, due to a corporate takeover necessitating a new framework.

    Will there still be virtue in fine-tuning, fiddling with the CPU’s machine language? Perhaps. But the moment you start using libraries, that responsibility rightfully lies with the library designer, not the one who uses it. The cost-complexity of today’s programs lies in obfuscated cost, and not typically in the coding patterns the developer uses, except in compiled languages where arrays must be manually iterated over and the structured iteration is itself hostile to vectorization. And even there, it’s mainly because people don’t THINK logically: they stick branching operators inside looping constructs and don’t do a final refactor. Because people are lazy.

    So, in conclusion, there’s nothing we’re leaving behind that we will truly miss. The art will continue for those enthusiastic enough to pursue it, and for the rest of the world, what they need is AI that respects convention and intelligently applies optimizations. They do not need to code. They need to leave coding the hell alone. They always did, and it is because they didn’t that we’re not yet a planet-hopping galactic species.

    • Posted by tranquilmiracleba536132b3 on February 9, 2026 at 1:07 PM

      I think what you’re saying is all true to a large extent. But dude – this is an elegy. You can’t just tell people not to mourn.

  25. Posted by MST-05 on February 9, 2026 at 12:49 PM

    I applaud you for writing this, Nolan.

    I, and many people I know, have wrestled with this over the past year and more. As someone who knows neural networks and understands the mathematics of the transformer model, I always had a nagging suspicion that the best model efficacy across languages would really come from programming languages – you have a more tightly scoped lexical grammar, and “tone” is not really a thing so much as “best practices”. The objective functions are tighter, often binary (compile vs. no compile, test passes vs. doesn’t pass, etc.).

    Yet, I secretly hoped they wouldn’t be as good as they have become, or that it would take a while longer. Because deep down, I really enjoy coding. Sure, it’s “never been most of the job,” as people keep saying these days like a broken record, but it’s always been incredibly stimulating for me – figuring out flow control, learning nuances of syntax, tackling a hard problem, and feeling the satisfaction of going through the friction and coming out on the other end feeling like you learned something. There’s so much to be said about the act of writing vs. merely reading; the former has been shown to really help with learning over the latter.

    These days I also probably write 70–80% of my code with Claude. I can manage the remaining 20–30% and still continue to meet the new speed expectations the industry has become acclimated to. It may be the case that I can keep up with it. For some more personal OSS I have time for outside of work, I still prefer hand-crafted code, with an LLM as a judge.

    But all this to say – “mourn” is the correct word. Because even as I write this, I feel sad. I know this wasn’t the world that I asked for, especially not this fast. But here it is. I’m somewhat hopeful even if I’m not quite “excited”. I know I can continue my craft to an extent, but it won’t be what’s rewarded economically. And there’s some comfort in knowing that for what it’s worth.

    All this to say, one strange thing about this whole time-compressed paradigm shift is that it actually led me to confront my own mortality, as hilariously over-the-top as that may sound. I’m also in my 40s, and my youngest, most energetic years are not that far behind me. I recall a lot of the things you mentioned – the struggle of finding bugs, the satisfaction of fixing them, and also the hope of building software as a future in the aftermath of the Great Financial Crisis. Those are all so recent in my memory, and yet they’re actually quite far away now. People will come and go. Technology will come and go even faster as it is supplanted.

    Perhaps it’s time for me to enjoy what I have, the people I’ve met along the way, and look optimistically into the future.

    • Nolan Lawson's avatar

      Thank you for your beautiful sentiment; I share a lot of this perspective. You’re right that my thoughts did turn to mortality (“all things must pass”-type stuff) and yeah, it is because of a feeling of mortality, or the gears of history turning to replace one world with another. A little sappy, and plenty of my colleagues do not feel this way and are just excited by the new possibilities, but I’m a sentimental person by nature.

      One interesting thing about coding: I can’t recall where I read it, but someone pointed out that code is different from art, poetry, music, etc., in that good code is usually idiomatic – if you’re not sure how to write a for-loop, write it the same way that ~75% of people do it, to avoid surprises or WTFs/minute. Whereas good art/poetry/music is usually surprising – of course you can borrow motifs or themes from other works, but really we’re looking for novelty (most of the time). So maybe our field is uniquely vulnerable to LLMs. Not a comforting thought, but it would explain a lot.

  26. Posted by zoeissleeping on February 10, 2026 at 6:31 AM

    Really great post. I am one of those zoomer colleagues that *is* embracing AI tools with open arms. Though nothing nearly as extreme as using AI as a therapist or lover (that idea could make a whole blog post on its own, but I digress). I’ve been programming long enough to remember when code *was* written by hand (it might surprise, but I’ve been programming for literally more than a third of my entire life), and if you couldn’t implement something, either you figured out how to or it didn’t get written. Maybe it’s rose-tinted lenses, but I hold those memories closely.

    It’s hard not to get caught up in the idea that everyone is “doing circles around you” with AI tools, especially in a professional environment, and the truth is, you either *have* to evolve or die. But if you’re writing things for yourself, truly for yourself, don’t think too deeply about how everyone on twitter is “using claude code to be a 10x developer 🚀.” I fear the day that programming *by hand* becomes riding a horse to get around when the automobile exists, though I don’t think that will be for a while.

    Truthfully, I am a cynic, and I *do* believe that AI tools will push people who refuse to change out of the industry, for better or, more likely, for worse. Even as I embrace this faster-moving, ever-evolving future, I too will mourn our craft along with you and many others. Programming, for me, has defined a lot of who I am as a person; the people I’ve met and the communities I’m in have made me *me*. I worry to think that culture could change, that I might be among the last few to truly experience it. But the world keeps spinning, with or without my thoughts and ideas.

  27. Andrew Murphy's avatar

    Nolan, this might be the best thing I’ve read about what it actually feels like to be a software engineer right now. The image of holding code like clay. The late-night debugging sessions. The pride of the artist’s signature on a GitHub repo. You’ve put words to something thousands of us are feeling but can’t quite articulate, and you did it without falling into either breathless techno-optimism or dismissive cynicism. That’s hard to do. Most people writing about this stuff pick a side and dig in. You just told the truth.

    Where I disagree is the ending. “The sun rises, the sun sets, I orbit helplessly around it, and my protests can’t stop it.” That’s beautiful writing but it’s also a choice to be a passenger. The knowledge we’ve built over decades doesn’t become worthless. It becomes the foundation for a different kind of value.

    I wrote a longer response to your piece through the lens of the five stages of grief, because that’s what I’m seeing play out with the people I talk to about this. Denial, anger, bargaining, depression, and then an acceptance that isn’t resignation but evolution.

    If you’re interested: The Five Stages of Losing Our Craft.

    • Nolan Lawson's avatar

      I really enjoyed your piece, thanks for sharing. To be sure, this is an elegy, and I let myself get morose and sentimental because I was trying to give myself space to process my grief. I know exactly the kinds of senior developers you’re talking to because I was one of them (in many ways, this piece is a conversation with myself a ~year ago). In my case, I can say that trying to force certain tools on engineers from the top-down doesn’t work (it certainly didn’t for me – they just resist, same as if you forced a certain IDE on them).

      The engineers I know who are most happy with the new AI tools are the ones who, unlike me, have not accepted passivity, but have tried to master the new tools. You’re right that I’ve been a bit passive; for instance, I’ve only really tried Claude Code so far. Whereas I know engineers who say things like “I tried Claude and Codex but now I prefer OpenCode with Kimi so I’m not locked in.” Or “I tried Ralph and Beads but now I’ve built my own orchestration system and issue tracker that matches my workflow.” That’s not me, but those engineers seem much happier than me! So one of my goals this year is to learn more from them.

      My piece is also directed at another kind of engineer you don’t really mention, which is the AI holdout. These folks have strong moral or ethical qualms against GenAI. For them my “sun rises” paragraph is basically me quoting Paul Simon and saying “Who am I to blow against the wind?” In other words, the toothpaste is out of the tube, the horse is out of the barn, it’s a lost cause, etc. I get no pleasure from saying it, and plenty of these holdouts are my dear friends who strongly disagree with me, but it’s how I feel. You’re right that it’s not helpful to advocate for passivity, but I was mostly just trying to describe my own headspace in the moment.

  28. Max's avatar

    Posted by Max on February 12, 2026 at 6:21 AM

    Personally, I don’t mind if AI makes traditional programming a thing of the past. What it does for me is enable me to write all the programs I’ve always wanted to but never had time for.

    On the other hand, if, by introducing this, many people get laid off and companies can be run by very few, I have an interesting idea on how to fix that…

    Tax companies based on one thing: the ratio of profit to the number of employees (while also guaranteeing a fair pay range). The fewer employees, the higher the taxes will be. If there’s a 1-man billion-dollar company, well, maybe we tax them 99% and they walk away with 1% of a billion a year – not bad. If they have thousands of employees, maybe they get to keep 80% of that billion. I know I’m just throwing numbers around, but you get the point… some way to tie people to employers, the same way payroll reimbursement to companies was a better idea than just handouts to people, as it kept them all connected.

    So, maybe companies will end up with hordes of people who don’t do that much, and maybe some of them are paid to stay home for long periods on standby – or paid to think-tank, or to study and stay sharp. Maybe they shovel the driveway – who cares, really, as long as they get paid. And the more their people get paid, the less the government taxes the company.

    Is that unrealistic?

  29. Unknown's avatar
  30. […] Nolan Lawson wrote a sad but pragmatic piece about mourning the craft of software development. […]

  31. Peter's avatar

    Posted by Peter on February 20, 2026 at 4:55 AM

    Nolan, I’m a college student, 22, in my 4th year of college, writing my undergrad thesis right now, with somewhere around 4–5 years of programming experience, living in a third-world country, with just around 6 months of industry experience (an internship). And while what I express here may hold little to no weight, I still want to put a little bit of my thoughts here regardless.

    I take programming as a craft, not just as a means to an end. And that extends to how I approach my training, my learning, my everything. I worry about writing clean code, I worry about how other developers will react to maintaining projects I’ve put my hands on, I try to learn multiple things beyond just pure programming and coding, and so on and so forth. Basically, I center my whole life, career, and living around this thing.

    And to see that, before I can even join the industry, AI has taken over and is slowly making what I do “obsolete”, in a way, it’s… discouraging, so to speak. It’s not that I absolutely despise AI and the capabilities it brings to the table, not really. Hell, I myself have used AI to learn, to debug, and sometimes to create stuff. I’ve tasted the whole acceleration part, and the part where AI just creates unnecessary friction. And at this point, I think I have a good grasp of when to use what. And usually, the first solution isn’t AI.

    But what I despise is that programming and tech – the act of writing, maintaining, and fixing software, systems, and everything that entails, the very things I center my whole existence around, my whole craft, the only thing I know I am good at – is now being seen as “replaceable” by AI. That we suddenly need to “evolve” into this new world, into a state that feels to me like watching from the sidelines. I don’t want that to happen. I love being in the field. And I mourn that, at this point, I may need to learn how to manage and orchestrate rather than to write and debug.

    In a way, it’s like the difference between being a soccer player and being a soccer team manager. Being a player means you’re in control: you need to learn how to control the ball, how to read the opponent, how to shoot, how to perform tricks, all that. But now it feels like the industry is leaning towards us being team managers rather than players. And for some, that might be fine. For others, it’s totally not fine. The skills might transfer, but the feeling will always be different.

    In the end, all I’m saying is that I also mourn the craft. It will slowly change, slowly erode, and we still need to evolve. Like you said, our protest won’t change a thing. Although, in a way, I’m not mourning what came before; I am not in the industry remotely long enough to mourn what came before. I mourn what comes after: the possibilities lost, the world I thought I would be in, gone in a flash.

    But on another note, I am glad to meet someone out there who takes this as a craft too. It’s a rare commodity where I am now; although maybe it is a rare commodity no matter where you are in the world. So at least there’s that, for me.

    So yeah. Keep on doing what you’re doing. I’m mourning this with you. We’ll just see where the world goes, I guess. Hopefully to a place where the old and the new worlds can coexist peacefully.

    • Nolan Lawson's avatar

      Thanks for your thoughts, Peter. I really feel you. I am also someone who has historically sweated the details – meticulous on code reviews, spotting little inconsistencies and pointing them out, etc. I do feel that you’re right and somehow, now, the small details start to matter a lot less. I fear that we’re entering an era where software is disposable enough that care and attention toward the little things won’t be valued – the metaphor I would use is a home-cooked meal versus a factory where you control the proportion of ingredients, how the machines put it together, etc.

      I’m trying not to despair, though – it feels like a new craft is emerging, and it will likely have its own craftsmanship. The earliest glimmers I’m seeing of this are the people who purposefully stitch together agents of different capabilities or models in a deliberate way (this one is a code reviewer, this one is a refactorer, etc.). It’s not stirring the soup and tasting it before you decide to add more oregano, but it is the guy in the factory with a clipboard deciding how to arrange the assembly lines. I have to hope that the new craft still has something redeeming to it!

  32. […] We mourn our craft (3min) Nolan Lawson also goes in this direction. He explains that he didn’t sign up to become “a glorified TSA agent, reviewing code to make sure the AI didn’t smuggle something dangerous into production”. Developers must adopt these tools to stay employable, while privately mourning the lost joy, identity, and shared humanity of crafting software line by line. I can’t help but wonder if this wasn’t the case for most innovations: people mourning the handcraft. […]

  33. Martijn Jacobs's avatar

    Hi Nolan. I wanted to thank you for writing the post as you did. I had a hard time trying to find out what I’m feeling during this fundamental change in our profession, a change which is going on everywhere around us.

    Is it because of fear? Is it because it’s going so fast? Is it because of ethical reasons? Is it because we just accept this is happening without any significant protest? Is it because of capitalist insanity?

    All these propositions are true (for me), but because of your post I figured out the real feeling I have, and that is grief.

    So I’ll mourn with you: mourn what made our craft so special, mourn how our hand-written code was both artistic and efficient, for both man and machine. Mourn an era which has come to an end. It’s a sad time.

    • Nolan Lawson's avatar

      Thank you Martijn, I feel very similarly. It’s a lot of feelings balled up into one, but mostly it is grief. That puts things in perspective, though – there are stages of grief, and the grief eventually ends.

  34. […] about how AI is replacing white-collar workers in real time and what that means for society. We mourn our craft – senior developer Nolan Lawson writes not a manifesto of resistance but an elegy: we are the last generation […]
