A cost analysis estimates OpenAI spends ~$65/month in compute per ChatGPT Plus subscriber, driven largely by Sora video generation.
An independent analyst published a cost breakdown estimating that each $20/month ChatGPT Plus subscriber costs OpenAI roughly $65 in compute — a $45/user monthly loss. The math centers on Sora's video generation costs, which are compute-intensive enough to make the product economically unsustainable at current pricing. The analysis uses publicly available GPU pricing and inference cost estimates. OpenAI has not confirmed or disputed the figures.
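The headline numbers reduce to simple unit economics. A minimal sketch, with the analyst's figures hard-coded as assumptions (OpenAI has confirmed none of them):

```python
# Analyst's estimates -- assumptions, not OpenAI-confirmed figures.
REVENUE_PER_USER = 20.0   # ChatGPT Plus monthly price, USD
COMPUTE_PER_USER = 65.0   # estimated monthly compute cost per subscriber, USD

monthly_loss = COMPUTE_PER_USER - REVENUE_PER_USER
cost_multiple = COMPUTE_PER_USER / REVENUE_PER_USER

print(f"Loss per subscriber: ${monthly_loss:.0f}/month")   # $45/month
print(f"Compute costs {cost_multiple:.2f}x the subscription price")  # 3.25x
```

At those numbers the subscription would need to more than triple, or per-user compute fall by ~70%, just to break even.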
Video generation consumes compute on a scale text inference does not. If you're building on Sora's API or planning any diffusion-based video feature, your cost model is probably off by an order of magnitude. This analysis quantifies what many have hand-waved: generated video is brutally expensive per second of output.
Run a real cost estimate for any video generation feature you're scoping: call the Sora or Runway API with a 5-second clip request, log the response time and billed token count, then multiply the per-clip cost by your expected monthly usage volume to get a hard monthly burn number before writing a line of product code.
Run: curl https://api.openai.com/v1/video/generations -H "Authorization: Bearer $OPENAI_KEY" -H "Content-Type: application/json" -d '{"model": "sora", "prompt": "a calm ocean wave at sunset", "duration": 5}' (note the double quotes around the Authorization header — single quotes would stop the shell from expanding $OPENAI_KEY)
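Once you have a measured per-clip cost from a few real calls, the burn projection is one multiplication. A minimal sketch; the $0.50-per-generated-second default below is a placeholder for your measured figure, not a published price:

```python
def monthly_burn(clips_per_month: int, seconds_per_clip: float,
                 cost_per_second: float) -> float:
    """Projected monthly video-generation spend in USD.

    cost_per_second should come from your own logged API calls --
    the value passed in the example below is an assumption.
    """
    return clips_per_month * seconds_per_clip * cost_per_second

# Example: 10,000 five-second clips at an assumed $0.50/generated second.
print(f"${monthly_burn(10_000, 5, 0.50):,.0f}/month")  # $25,000/month
```

Plugging in even conservative measured costs usually makes the go/no-go decision obvious before any product code exists.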