I'd say it isn't, but I'd be saying it about the existing applications (and near-future stuff that currently sorta-kinda works), not the tech-bro delusions that wind up on billboards and cause aging suits to both sweat and drool.
AI is the kind of bubble where the tech involved is halfway to magic. And it'll run on your local hardware. No matter how hard the offices-and-capital side crashes and burns, that's not going anywhere.
Right now is the worst that image AI will ever be again.
LLMs might stumble, because the big-iron approach seems to make a difference for them, but there are local versions and they do roughly the same things. That's going into video games, for a start, and probably turning every single NPC into a verbal chatbot that can almost hold a conversation.
The universe of low-stakes, high-dollar applications for AI is so small that I can’t think of anything that belongs in it.
... entertainment is a bajillion-dollar industry where abject failure is routine and hilarious. There will be a boom like there was for CGI, only with even worse treatment of the tiny companies performing miracles. Workers getting screwed while revenue floods in is not the same thing as a bubble bursting. Unfortunately.
I'm disappointed in Doctorow for asserting this technology will remain big and complex and expensive. When has that ever stayed true? Saying it'll always take forest-eating big iron sounds like predicting computers will only ever be affordable to the five richest kings of Europe. This whole neural-network revival kicked off precisely because consumer hardware made training feasible. Sure, more training means better models, and more computers mean more training, but even Google's gigawatts are being poured into glorified video-game hardware.
If all the creative and academic zeal is left to work with mundane single machines, guess where all the advancements will happen.