I’ve had two Peak San Francisco moments this week:
I saw four Waymos get stuck at an intersection, after a human-driven bus got stuck there. The Waymos didn’t dare to cross and just inched forwards and backwards, while the shouting from the humans waiting for them got louder and louder. Then a taxi driver got out of his car, stood in front of the Waymos, and waved them across the street, shouting “go, Waymo, go!” It worked. The waved-at Waymo got its courage together and started driving. (And to answer two questions you might have: he didn’t have a special wave, he waved like you’d wave at a cow if you want it to move, if you can imagine that, city people; no, I do not know whether a human took over remotely.)
Being in the Sourcegraph office, in the middle of San Francisco, shipping Amp releases, while watching the GPT-5 release livestream, turning internal model codenames into the official ones, waiting for the public API to be accessible.
I’ll be in SF until Friday. Let’s see what this week will bring.
This article here, about “how we built Bluey’s world”, containing “tales from original series art director, Catriona Drummond”, is probably the most Joy & Curiosity entry this week. It’s… lovely. My kids love Bluey, I love Bluey, I think it’s an incredibly well-made, charming, joyful show that’s full of heart, and when reading the article I went “ah, yes, that explains it.” The references she pulls out make you realize, again, how much any of it comes down to influences; the parts about lighting are impressive; the thing about the kettle is so obvious in hindsight but gave me so many ideas for how to direct ideas in the future… Highly recommend you read it.
“It's a physical display sitting in my office in Wisconsin, USA. Each pixel is a cube of wood, painted on two sides, that rotates to be on or off. There are 40 columns and 25 rows, for a total of 1000 pixels - thus the name, kilopixel.”
Cynthia Dunlop, author of Writing For Developers: Blogs That Get Read, interviewed me, the guy writing this sentence, about writing blogs, which I took to mean “you can also talk about your newsletter”. You can read it here: Thorsten Ball on Technical Blogging. I’m usually not a big procrastinator (too much guilt) but I kept Cynthia waiting and waiting until I finally sat down one morning and wrote the whole thing down in an hour and, to my surprise, this was actually a lot of fun to write. I like writing (well, long story) and I like writing about writing.
I shipped an orb in the Amp CLI. I love it. I’ve stared at it for… well, no need to use numbers here. I also love the word: orb. Orb, orb, orb. Still, I was surprised by how many other people loved it. Sure, who wouldn’t love an orb? That’s what you tell yourself, but when someone else says “I love the orb” you still are surprised. (Get your orb now.)
As a response to the orb, someone then linked to this: Donut math: how donut.c works. I can’t find who linked to it or where, but: very neat.
Did you know that you can talk to an LLM via DNS? Like this: “dig @ch.at ‘what is golang’ TXT +short”. I didn’t. Very cool. They also allow you to use curl and SSH, very neat: ch.at.
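Under the hood this is just an ordinary DNS TXT query, which you can hand-roll with nothing but Python’s standard library. A rough sketch of the idea — the packet layout is plain RFC 1035, but treat the parsing here as a naive sketch (it assumes the common case of a single compression pointer per answer name), not a real DNS client:

```python
import socket
import struct

def build_txt_query(name: str, txn_id: int = 0x1234) -> bytes:
    """Build a minimal DNS query packet asking for a TXT record."""
    # Header: id, flags (RD set), 1 question, 0 answer/authority/additional
    header = struct.pack(">HHHHHH", txn_id, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels; "what is golang" is one 14-byte label
    qname = b"".join(
        bytes([len(label)]) + label.encode() for label in name.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 16, 1)  # QTYPE=TXT(16), QCLASS=IN(1)
    return header + question

def parse_txt(resp: bytes) -> list[str]:
    """Very naive extraction of TXT strings from a DNS response."""
    ancount = struct.unpack(">H", resp[6:8])[0]
    i = 12
    while resp[i] != 0:          # skip the echoed question name
        i += resp[i] + 1
    i += 5                       # null byte + qtype + qclass
    texts = []
    for _ in range(ancount):
        if resp[i] & 0xC0 == 0xC0:   # compressed name pointer (common case)
            i += 2
        else:
            while resp[i] != 0:
                i += resp[i] + 1
            i += 1
        rtype, _rclass, _ttl, rdlen = struct.unpack(">HHIH", resp[i:i + 10])
        i += 10
        rdata = resp[i:i + rdlen]
        i += rdlen
        if rtype == 16:          # TXT: length-prefixed character strings
            j = 0
            while j < rdlen:
                ln = rdata[j]
                texts.append(rdata[j + 1:j + 1 + ln].decode(errors="replace"))
                j += 1 + ln
    return texts

def ask(question: str, server: str = "ch.at") -> str:
    """Send the query over UDP port 53 and join the TXT answer strings."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(5)
        s.sendto(build_txt_query(question), (server, 53))
        resp, _ = s.recvfrom(4096)
    return "".join(parse_txt(resp))

# Usage (needs network access):
#   print(ask("what is golang"))
```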
Simon Willison’s write-up on the GPT-5 release is good and gives a better overview than most of the official OpenAI documentation (which does contain all this information, but not on a single page).
We also posted something: Model Evaluation. It contains some first impressions of GPT-5 as a coding agent from yours truly.
The GPT-5 release created at least one very, very good meme: vibechart.
I can’t comment on the math, because it goes over my head, but the writing kept me reading the thing until the end: Attention Is Off By One. “This is what’s been happening in LLMs – for reasons that are only partially understood, Transformer models contain these outlier weights and are emitting Black Swan mega-activations that are much, much, much larger, like orders of magnitude larger, than their peers. But no one can get rid of them; the megalodons seem to be critical to the operation of these models, and their existence is contrary to everything we thought we knew about neural networks prior to building ones that worked so well.”
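The fix the post proposes is tiny: add a 1 to the softmax denominator so an attention head can put weight on nothing at all instead of being forced to distribute 100% somewhere. A toy sketch of the difference (my own illustration of the idea, not code from the post):

```python
import math

def softmax(xs):
    """Standard softmax: weights always sum to exactly 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def softmax1(xs):
    """The post's 'softmax1': exp(x_i) / (1 + sum_j exp(x_j)).

    The extra 1 (an implicit exp(0) term) lets all weights collapse
    to ~0 when every score is very negative, i.e. the head can abstain.
    """
    m = max(max(xs), 0.0)  # keep the implicit exp(0) term in the same scale
    exps = [math.exp(x - m) for x in xs]
    s = math.exp(-m) + sum(exps)
    return [e / s for e in exps]
```

With scores like `[-10, -10, -10]`, standard softmax is forced to hand out a flat 1/3 each, while softmax1 lets the total attention drop to nearly zero — which, per the post, is what removes the pressure to emit those outlier mega-activations.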
OpenAI released open weight models this week. Here’s Simon Willison’s very good writeup. I used Amp to build an agent for gpt-oss-20b, to see what it can do. It’s based on my How to Build an Agent post and I didn’t write a single line by hand. It works, but I haven’t really tested it (busy week). What I can say is that the new Harmony format from OpenAI is a bit confusing. Maybe I need to spend more time with it, but it felt weird having to use libraries to parse the agent’s response (or writing my own, of course). The agent is on GitHub, if you want to take a look.
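To give a sense of why parsing comes up at all: as I understand it, Harmony wraps each message in special tokens, roughly `<|start|>role<|channel|>name<|message|>content<|end|>`, with the model splitting its output across channels like `analysis` and `final`. Treat the exact token names here as assumptions on my part — the real parsing lives in OpenAI’s openai-harmony library — but a quick-and-dirty sketch looks like:

```python
import re

# Assumed Harmony message shape (verify against OpenAI's harmony docs):
#   <|start|>assistant<|channel|>final<|message|>...<|end|>
# The final message may end with <|return|> instead of <|end|>.
MSG = re.compile(
    r"<\|start\|>(?P<role>.*?)"
    r"(?:<\|channel\|>(?P<channel>.*?))?"
    r"<\|message\|>(?P<content>.*?)"
    r"(?:<\|end\|>|<\|return\|>|$)",
    re.S,
)

def parse_messages(raw: str) -> list[dict]:
    """Split a raw Harmony-formatted completion into role/channel/content."""
    return [
        {"role": m["role"], "channel": m["channel"], "content": m["content"]}
        for m in MSG.finditer(raw)
    ]
```

Which is exactly the weird part: with most chat APIs you get structured JSON back, and here the structure is something you (or a library) have to recover from the token stream.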
Andy Hertzfeld took notes at Alan Kay's talk at the Creative Think seminar, July 20, 1982, and it has some good lines in it: “Turn up your nose at good ideas. You must work on great ideas, not good ones.” (this is the line that made me find the URL), “Better is the enemy of best”, “content over form, go for fun.”
I’ve been on the Internet since, I think, 1997. I’ve had unsupervised access to the very same Internet since the year 2000. I’ve seen more things on the Internet than I can remember and some of it I very much shouldn’t have seen. And yet. And yet this made me go “uhhhh what”: “It’s not discussed publically very often, but the main use-case for fine-tuning small language models is for erotic role-play, and there’s a serious demand. Any small online community for people who run local models is at least 50% perverts.”
Missed this in July (ages ago) but, man, this is so interesting and so inside baseball: windsurf gets margin called.
I also missed this, in 2020, when Alexandr Wang wrote: Hire people who give a shit. It’s easy to dismiss, but I don’t think one should. I like this post. Hiring is very hard and hiring for “who gives a shit” seems, in hindsight, when looking back at a lot of hiring decisions, like an obvious-but-very-hard-to-pull-off-but-good-if-you-can idea.
Apple released Embedding Atlas. I now want to play around with it to see whether you can see the “king - queen = man - woman” equation.
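The arithmetic behind that equation is just vector subtraction: if the embedding space has learned a “gender” direction, then king − queen and man − woman should point the same way. A toy illustration with hand-made two-dimensional vectors (real embeddings, the kind Embedding Atlas visualizes, learn these axes from data rather than having them written down):

```python
import math

# Hand-made toy embeddings over two made-up axes: (royalty, gender).
E = {
    "king":  [1.0, 1.0],   # royal + male
    "queen": [1.0, -1.0],  # royal + female
    "man":   [0.0, 1.0],
    "woman": [0.0, -1.0],
}

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# "king - queen = man - woman": both differences lie along the gender axis
lhs = sub(E["king"], E["queen"])
rhs = sub(E["man"], E["woman"])
```

In this toy the two difference vectors are identical; in a real model you’d check whether their cosine similarity is merely high, which is what makes tools for eyeballing the space fun.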
“That understands how capitalism doles out prizes for visibility and speed—even when those very forces stifle the creative process and are at odds with what it takes to make something daring, unruly, and true”
Attention Is Off By One is from 2023 btw.