There was a time when making a phone call cost money, and making a long distance call cost even more. It doesn’t anymore, and I think the cost of generative AI will follow the same trend.
AI startups are all losing money, but that’s not the whole story. One of the surprising statements I’ve heard from the frontier AI labs is that their inference is generally profitable.
Sam Altman says "We're profitable on inference".

Jensen Huang in a podcast interview:
… I’m so pleased that these tokens are now profitable, that people are generating, I heard somebody, or heard today that open evidence, speaking of them, 90% gross margins. I mean, those are very profitable tokens. […] Cursor, their margins are great. Claude's margins are great. For the enterprise use of Open AI, their margins are great.
Training a model costs a staggering amount of money, but inference — generating content using that model — appears to be profitable much of the time. On top of this, the cost of tokens is going down rapidly. Newer models are better, but also cheaper to use.

Token usage is going up, however. More advanced reasoning models burn through far more tokens than the final output you see suggests.
But there are also alternative models that are even cheaper to use than the frontier models. Small models can be trained, either from scratch or distilled from bigger, more sophisticated models, and tailored to specific tasks. These small models won’t compete with ChatGPT or Claude, but they can be very useful on constrained tasks. Meta has a custom model they use for generating ads. It would not be very useful for writing code, but it excels at generating ad copy. Meta runs this in their own data centres, so I expect it is far cheaper than what they would pay Google or OpenAI for a similar service.
Most companies won’t be able to do that yet, but I believe we will see smaller models trained for specific tasks as the cost comes down and the infrastructure emerges to do this more cheaply.
I keep seeing and hearing people say the AI labs are losing money, so this can’t last. I think they are pattern matching against startups like Uber and DoorDash, which initially subsidized prices in an attempt to gain market share and undercut the incumbent taxis. They have since raised prices, so there is some expectation the AI startups will, too. Maybe they will, but the delivery startups needed human bodies to drive things around. The AI labs are constrained on electricity and chips, which is a much better problem to have: you can scale those up eventually, and in the meantime the efficiency of the hardware and software keeps improving.
If you’re assuming generative AI costs will rise, consider what it would look like if they become cheaper, or even free.