ChatGPT In Trouble: OpenAI may go bankrupt by 2024, AI bot costs company $700,000 every day

TechNews @radiation.party

207 comments
  • If ChatGPT only costs $700k per day to run and they have a $10b war chest, then assuming there are no other overhead or development costs, OpenAI could run ChatGPT for roughly 39 years. I'm not saying the premise of the article is flawed, but seeing as those are the only two relevant data points presented in this (honestly poorly written) article, I'm more than a little dubious.
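
    Back-of-the-envelope check of that figure, using only the two numbers the article gives (the variable names are mine):

    ```python
    # Rough runway estimate from the article's two reported figures.
    daily_cost = 700_000           # reported ChatGPT operating cost per day, USD
    war_chest = 10_000_000_000     # reported Microsoft investment, USD

    runway_days = war_chest / daily_cost   # ~14,286 days
    runway_years = runway_days / 365       # ~39.1 years
    print(f"{runway_days:,.0f} days, about {runway_years:.1f} years")
    ```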

    But, as a thought experiment, let's say there's some truth to the claim that they're burning through their stack of money in just one year. If things get too dire, Microsoft will just buy 51% or more of OpenAI (they're going to be at 49% anyway after the $10b deal), take controlling interest, and figure out a way to make it profitable.

    What's most likely is that OpenAI will keep finding ways to cut costs, like caching common query responses for free users (and possibly even entire conversations, if common follow-up prompts show up); a rough sketch of that kind of caching is below. They'll likely iterate on their infrastructure to cut the cost of running new queries, then charge enough for their APIs to start making serious money. Needless to say, I do not see OpenAI going bankrupt next year; I think they'll be profitable within 5-10 years. Microsoft is not dumb, and they will not let OpenAI fail.
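
    To make the caching idea concrete, here is a minimal sketch of exact-match response caching. It is purely illustrative and assumes nothing about OpenAI's actual infrastructure; run_model, the normalization step, and the cache size are made up for the example:

    ```python
    from functools import lru_cache

    def run_model(prompt: str) -> str:
        # Stand-in for the expensive model call; this is the step that costs real GPU time.
        return f"(model response to: {prompt})"

    def normalize(prompt: str) -> str:
        # Collapse case and whitespace so near-identical free-tier prompts share one cache entry.
        return " ".join(prompt.lower().split())

    @lru_cache(maxsize=100_000)
    def cached_answer(normalized_prompt: str) -> str:
        # Only reaches the model on a cache miss; repeat queries are served from memory.
        return run_model(normalized_prompt)

    def answer(prompt: str) -> str:
        return cached_answer(normalize(prompt))

    print(answer("What is the capital of France?"))
    print(answer("what is the  capital of France?"))  # cache hit, no model call
    ```

    A production system would presumably match semantically rather than exactly and cache whole conversation prefixes, but the cost argument is the same: repeated queries stop costing GPU time.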

  • Because I distrust this kind of technology in general, and it would surely add to the dystopian, anti-consumer, anti-workforce agenda big tech is currently pushing. I work in desktop publishing, and about 3/4 of the jobs in that field would be cut the moment AI could replace them for a fraction of the cost.

  • The thing about all GPT models is that they predict a word's usage from how frequently it appears, which means the only way to get good results is to run them on cutting-edge hardware designed specifically for that job, while the model itself is almost a TB in size. Meanwhile, diffusion models are only gigabytes in size and run on an ordinary GPU, yet still produce masterpieces, because they already know what each word is associated with.
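
    For rough scale on the "almost a TB" point (my own back-of-the-envelope numbers, not from the article): GPT-3 is publicly documented at 175 billion parameters, so the weights alone are hundreds of gigabytes depending on numeric precision, while a typical ~1B-parameter diffusion model fits in a few GB:

    ```python
    # Approximate weight-storage size for a 175B-parameter model at different precisions.
    params = 175_000_000_000
    for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
        print(f"{name}: {params * bytes_per_param / 1e9:,.0f} GB")   # 700 / 350 / 175 GB

    # For comparison, a ~1e9-parameter diffusion model at fp16 is only about 2 GB.
    print(f"diffusion (fp16): {1_000_000_000 * 2 / 1e9:,.0f} GB")
    ```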

  • It would help if they offered more payment options than just credit cards, which aren't really popular in many countries.

  • Sounds like poor management. I don't see how that is anyone else's problem, though.
