Joy & Curiosity #75
Interesting & joyful things from the previous week
Where’s software going? Is software… dead? Or will there be more software than we ever thought possible? Or is it going to disappear, into the agents? Or is it going to grow and grow and then truly eat the world? Who’s going to create it?
There are few things right now that I find more fascinating than these questions. Of course, I don’t have answers and I don’t think anyone has. Guesses, sure. Theories, absolutely. Anecdotes? Here’s some.
Geoffrey Litt, standing in a hotel gym, asked Claude for a workout plan and got an app that guides him through the plan. Huh. Then Ryan Florence threw away his workout app and just asked ChatGPT’s voice mode to guide him through a workout. Where’s the software gone?
A couple weeks back I thought: maybe I should set up Clawdbot and hook it up to our shopping list in Todoist and then my wife and I can use a group chat to manage that list. We could even use voice messages: hey pal I’m in the car woops wait a second … yeah we’re out of paper towels. That’d be cool, right? But then: wait, why would I need Todoist? State could just live in that conversation or on Clawdbot’s disk, right? And then: but sometimes I do want a better UI than a group chat, don’t I? But when and why?
This week I was this close to typing something into the Slack search bar. I already had some keywords and combinations of keywords ready to go. I had already put the cursor in when I remembered that we have agg, an internal tool that Tim blind-coded and that connects to Slack and Google Workspace and whatnot, and so I asked Amp: hey, didn’t so and so say that they migrated this thing and now we all need to? Amp via agg found it in five seconds. No keyword, no UI. Okay.
As Alex says: “It feels like a maxim is emerging - if your software is useful to agents, your product is going to be 10x more valuable than before, but if your software is built for humans, you’re dead.” And Sahil Lavingia says that “gh is the new GitHub.”
But there is still software, isn’t there? I’m typing this through software. And I had Amp create many hundreds of lines of personal software for me, but that software is so personal that I won’t release it, because why bother? The cost of generalizing it is higher than the cost of creating it. So you won’t ever see it. Invisible software.
Say that I do release some software that took me an hour to create. Or let’s say six hours. A small useful app, with some heft to it. You know what I mean. A good workout tracker. Or a little menu bar app. Or a browser extension. Say I sell it for $5. Won’t a hundred competitors be able to recreate what I did in thirty minutes? Prices will go to zero. Why bother?
Last anecdote. I’ve been meaning to create a little booklet. A physical thing, printed professionally. Weeks ago I had Nano Banana and ChatGPT tag-teaming and they created the logo that’d go on the front. Then work stalled because I couldn’t be bothered to look up the dimensions the print company needs and CMYK and PDFs and all of that and ugh, please. So I sent exactly that to ChatGPT: here’s the URL of the product description, here’s the logo in 4 formats, here’s the mockup someone (wink wink) created, please help me man. It ran for 15, 20 minutes and gave me a PDF. I uploaded it to the printer’s website, following the 6 steps ChatGPT outlined for me, got an error, told ChatGPT about the error and asked for some adjustments, got a new PDF, uploaded it, got the green checkmark, put my credit card in and now the booklet’s on its way.
I then checked what ChatGPT did, in agent mode, and turns out: it wrote a lot of code. It essentially created the PDF I needed by writing Python. Many, many lines of Python. And now they’re gone and no one would’ve seen them if I hadn’t looked.
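I didn’t keep the script and I’m only guessing at what it actually looked like, but I imagine something in this ballpark: a minimal sketch, assuming reportlab, with the file names, trim size, and bleed made up for illustration.

```python
# A guess at the kind of throwaway script an agent writes for this: a
# print-ready PDF with bleed and CMYK color. Assumes reportlab is installed;
# "logo.png", the trim size, and the bleed are made-up placeholders.
from reportlab.lib.colors import CMYKColor
from reportlab.lib.units import mm
from reportlab.pdfgen import canvas

BLEED = 3 * mm                       # extra margin the printer trims off
TRIM_W, TRIM_H = 105 * mm, 148 * mm  # A6 trim size, purely illustrative

page_w, page_h = TRIM_W + 2 * BLEED, TRIM_H + 2 * BLEED
c = canvas.Canvas("booklet-cover.pdf", pagesize=(page_w, page_h))

# Background tint, specified in CMYK so the print shop doesn't complain about RGB
c.setFillColor(CMYKColor(0.05, 0.02, 0, 0.02))
c.rect(0, 0, page_w, page_h, stroke=0, fill=1)

# Place the logo inside the trim area, fitted into a box, aspect ratio preserved
c.drawImage("logo.png", BLEED + 20 * mm, BLEED + 55 * mm,
            width=65 * mm, height=40 * mm,
            preserveAspectRatio=True, mask="auto")

c.showPage()
c.save()
```

That kind of thing: a few dozen lines, run once, then gone.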
So, where’s the software going?
We at Amp think the coding agent is dead. Or maybe we should’ve said it’s solved. Or that the text editor is dead. Point being: what we have right now isn’t the future. There’s more to build. And this is the model that made us realize it: GPT-5.3-Codex.
Don’t believe us? Say it to our face. Most of the Amp team is in Singapore this week. Join us on Thursday. (I’m writing this at the airport.)
Harness engineering: leveraging Codex in an agent-first world, on the OpenAI blog. This is some of the best writing on agents hitting the real world and where this ride is going. You should read the whole thing, but this bit in particular stayed with me: “As Codex’s throughput increased, many conventional engineering norms became counterproductive. The repository operates with minimal blocking merge gates. Pull requests are short-lived. Test flakes are often addressed with follow-up runs rather than blocking progress indefinitely. In a system where agent throughput far exceeds human attention, corrections are cheap, and waiting is expensive.”
How will OpenAI compete? by Benedict Evans. Great, as always.
Chris Lattner took a close look at the C compiler produced by Claude Code. I have to admit that I started reading with the expectation that it was going to be about the compiler internals and what the AI got right and what it got wrong. And yes, that’s in there, but there’s more: thoughts about AI in general, about IP law, about the shifting role of software engineers, about AI use at Modular. “Lower barriers to implementation do not reduce the importance of engineers; instead, they elevate the importance of vision, judgment, and taste. When creation becomes easier, deciding what is worth creating becomes the harder problem. AI accelerates execution, but meaning, direction, and responsibility remain fundamentally human.”
Entertaining and interesting: How does Docusign have 7,000 employees?
Can Opus 4.6 do Category Theory in Lean? You know me: I don’t understand any of the formulas in there and when I read “endofunctor” I do that Homer Simpson stare, but still (or maybe because of that?) I found this very fascinating. “When this layer becomes trivial, we get to spend our time on the parts that actually matter: choosing the right abstractions, seeing the connections between structures, deciding what’s worth formalizing in the first place. The proof assistant becomes less of a bureaucratic obstacle and more of a genuine thinking tool. We get to build higher.” When category theory and formal specification languages become mainstream due to AI, call me.
AI fatigue is real and nobody talks about it: “When each task takes less time, you don’t do fewer tasks. You do more tasks. Your capacity appears to expand, so the work expands to fill it. And then some. Your manager sees you shipping faster, so the expectations adjust. You see yourself shipping faster, so your own expectations adjust. The baseline moves. Before AI, I might spend a full day on one design problem. I’d sketch on paper, think in the shower, go for a walk, come back with clarity. The pace was slow but the cognitive load was manageable. One problem. One day. Deep focus.” I find this very fascinating to think about, because it’s true, isn’t it? Back in the olden days, say in 2024, you could have a full day of programming in which you did nothing but program and yet there would still be moments of mindless execution that let you recover from moments of high concentration and focus. Now, with the mindless execution being done by the mindless, there’s nothing left to act as a buffer between the intense moments. Except maybe distraction.
A “Matt Levine style explanation of how OAuth works” given by Blaine, who, 19 years ago, “wrote the first sketch of an OAuth specification”. We need more explanations like this!
This post has a lot of really interesting thoughts on where software as a business is going. This part here, on building financial software, is illustrative of some trends, I think: “Zero custom parsers. Zero industry-specific classifiers. Why? Because frontier models already know how to navigate a 10-K. They know that Home Depot’s ticker is HD. […] Frontier models already know how to parse SEC filings from their training data. They understand the structure of a 10-K, where to find revenue recognition policies, how to reconcile GAAP and non-GAAP figures. You don’t need to build a parser. The model IS the parser. Feed it a 10-K and it can answer any question about it. […] The data isn’t worthless. But the ‘making it searchable’ layer, which is where a lot of the value and pricing power lived, is collapsing.” Replace ‘searchable’ with other abilities and you see how it applies to more than just software to navigate SEC filings. And then, of course, there’s competition: “The critical insight is that competition doesn’t increase linearly—it explodes combinatorially. You don’t go from 3 incumbents to 4. You go from 3 to 300. And that’s what craters pricing power. Before LLMs, each vertical had 2-3 dominant players commanding premium prices because the barriers to entry were insurmountable. That math changes completely when 50 AI-native startups can offer 80% of the capability at 20% of the price.” We already had five thousand TODO apps. What’s the next category of software in which there’ll be five thousand alternatives, selling for $5.99?
Similarly: “if your product isn’t a system of record that ai tools can be built on top of, you’re increasingly hard to justify keeping”. But then the question is: how hard is it to reproduce that system of record? Todoist: easy. Your company’s pay slips? Hard. Analytics? Performance data? Monitoring? Errors? Tickets?
Sean Goedecke compared how the recently released “fast” modes by OpenAI and Anthropic differ: Two different tricks for fast LLM inference. Interesting stuff, especially since he has now collected and responded to some of the comments he got.
I Sold Out for $20 a Month and All I Got Was This Perfectly Generated Terraform. This is some real stuff — some true stuff. I love the honesty and the humility. I love the “band of Eastern European programmers who chain smoke during calls and whose motto is basically ‘we never miss a deadline’” and I love this part here: “I also just have trouble with the idea that this is my career and the thing I spend my limited time on earth doing and the quality of it doesn’t matter. I delight in craftsmanship when I encounter it in almost any discipline. I love it when you walk into an old house and see all the hand crafted details everywhere that don’t make economic sense but still look beautiful. I adore when someone has carefully selected the perfect font to match something. […] When I asked my EVE friend about it on a recent TeamSpeak session, he was quiet for awhile. I thought that maybe my moral dilemma had shocked him into silence. Then he said, ‘You know what the difference is between you and me? I know I’m a mercenary. You thought you were an artist. We’re both guys who type for money.’”
15,597 tok/s. Holy shit. And here’s how they did it. Are there any physical or theoretical limits that would stop someone from doing the same for, say, GPT-5.3-Codex in a few years?
The Only Moat Left Is Money: “The effort is gone. Effort was the filter. I launched something last week. 14 people signed up — no ads, just a couple of posts. 14 real people who didn’t have to. That number is tiny and it felt like something. Then I sat down to think about what it would take to grow it and I couldn’t look at that math for very long. The people winning mostly had a head start. Or they have money. Usually both. When creation was hard, skill was the differentiator: you had to actually be good to make something worth showing. Now the barrier is near zero, so you need reach. Reach costs money or it costs years. Probably both.” I’m not sure I believe that effort doesn’t count anymore, but the game is changing, which is fascinating and scary and exciting and crazy.
Child’s Play, subtitled: tech’s new generation and the end of thinking. Excellent.
The Software Development Lifecycle Is Dead. Not too sure about the specifics, but you know me: I agree.
“I built an agent for researching, coding, and running generative art animations for 16-segment displays. Will open source code and hardware design files soon.”
Andy Coenen, who built the wonderful isometric nyc, on The Software Industrial Revolution. It’s very, very good. To pick just one of the parts worth picking: “The old golden age is over, and it ain’t coming back - no more ‘rest and vest’, no more ping-pong offsites and five-star catered lunches. But a new ‘golden age’ is coming - no more nights staring red-eyed at empty stack overflow issues, no more weeks of alignment meetings to ship a prototype. I believe it’s never been a better time to build - not just software but anything you can dream of. The world is yours if you embrace this new reality and learn how to really use these tools”. The other part worth mentioning is the one about “personal apps”: sure, yes, grandma won’t use AI to write her own sudoku app, but, as Andy suggests here, there are so many other people — professionals! — who sure would love to build better research tools for themselves.
I’m pretty sure this just changed how I think about intelligence: why aren’t smart people happier? (That little “what if you booted up an AI in ancient Greece?” thought experiment is fun too.)
“Two old engineers were talking of their lives and boasting of their greatest projects. One of the engineers explained how he had designed the largest bridge ever made.”
Robin Sloan on how far AI can expand: flood fill vs. the magic circle. Interesting to think about, but I can’t help but wonder: does it matter that AI can’t touch the physical world, when your career is 99% digital and you’re looking at a screen a lot?
Jason Fried was on the David Senra podcast. What a perspective this guy has. Inspiring.
I’ve seen many, many, many stand-up specials over the years, because I enjoy stand-up comedy a lot and very earnestly believe it’s one of the highest art forms we humans have created. Yes, I’m serious. I’m German. If there’s one thing I don’t joke around about it’s comedy. But a stand-up special that makes me actually laugh out loud is a rare one. Kevin Nealon’s latest special Loose in the Crotch did that. I nearly spit out food. God damn did I fall in love with that special. I’ve watched it twice since Tuesday. I know it’s not everyone’s cup of tea and if you don’t like it you should keep that to yourself. But let me know if you do.
“This Fab Faux recording of most of side two of Abbey Road is a live, in the studio performance for a two camera video shoot. In the end, there were only three minor guitar fixes and each section was recorded in no more than three takes (most were two). There are NO added overdubs within this performance. The audio is pure.” Uploaded fourteen years ago. I think I started watching this video in 2010, when it was uploaded to Vimeo. Treat yourself.


