Follow-up on AI Value vs Costs
In response to yesterday's post, a reader pointed out that the cost equation for our hypothetical cartoon-AI product is even more naïve than initially discussed:
As a consumer with a $20/month ChatGPT subscription, OpenAI is heavily subsidizing me. However, any scaling business will have to pay for API access and tokens or face usage limits. This means that once I start churning out cartoons by the thousands, ChatGPT won't let me do that with my cheapo subscription...
This is true for AI products in general, and it will be interesting to see how the pricing dynamics evolve. With traditional SaaS, we've grown used to certain price anchors for consumer versus business tiers: There's the free plan, an entry plan for ~$9/month, a slightly higher tier for $19/month, and then the business and enterprise plans for, say, $99 and up. It's safe to assume that database, storage, and compute costs per user are well below those price points, leaving ample profit margin.
Querying an LLM for every user request significantly changes these pricing dynamics. Costs are falling, but are they falling fast enough? If not, any company whose product relies heavily on AI will have to charge accordingly (like OpenAI's $200/month Pro plan...), and it's unclear whether consumers will accept that.
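To make the difference concrete, here's a back-of-envelope sketch. Every number in it is hypothetical (the token prices, the request volume, the per-user infrastructure cost); the point is only the shape of the comparison, not the specific figures.

```python
# Back-of-envelope unit economics: a traditional SaaS tier vs. an
# LLM-backed product where every user request incurs token costs.
# All numbers below are made up for illustration.

def monthly_margin(price: float, cost_per_user: float) -> float:
    """Gross margin per user per month."""
    return price - cost_per_user

# Traditional SaaS: db/storage/compute per user (hypothetical).
saas_cost = 0.50   # $/user/month
price = 9.00       # entry-tier price anchor

# LLM-backed product (hypothetical token pricing): assume $5 per
# million input tokens, $15 per million output tokens, and a user
# making 300 requests/month at ~1k tokens in and ~1k tokens out each.
requests = 300
tokens_in = requests * 1_000
tokens_out = requests * 1_000
llm_cost = tokens_in / 1e6 * 5.00 + tokens_out / 1e6 * 15.00

print(f"SaaS margin at ${price:.0f}: ${monthly_margin(price, saas_cost):.2f}")
print(f"LLM cost per user:   ${llm_cost:.2f}")
print(f"LLM margin at ${price:.0f}:  ${monthly_margin(price, llm_cost):.2f}")
```

With these made-up numbers, the LLM-backed product's per-user cost eats most of the $9 price point, while the traditional SaaS keeps almost all of it. Heavier usage, or larger prompts, flips the margin negative entirely.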
Going back to market risk, we need to be damn sure that we're solving a problem that's painful enough to justify the price we'll need to charge.