
ChatGPT has received both good and bad press since its inception, and has even been banned in some places. But have you ever wondered how much it costs to run a sophisticated AI-powered bot of this scale? Well, a research firm called SemiAnalysis did the math.
The firm estimates that it takes about $700,000 per day to run ChatGPT, which breaks down to about 36 cents per query. Most of that cost goes toward the hardware infrastructure needed to run AI systems. SemiAnalysis writes in a blog post:
Estimating the costs of ChatGPT is a difficult proposition due to the many unknown variables. We created a cost model that shows ChatGPT costs $694,444 per day in hardware compute costs. OpenAI requires ~3,617 HGX A100 servers (28,936 GPUs) to serve ChatGPT.
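For context, those server and GPU counts line up with the standard 8-GPU HGX A100 configuration. The snippet below is just a rough back-of-envelope check on the quoted figures, not SemiAnalysis's actual cost model; the 8-GPUs-per-server assumption is ours.

```python
# Back-of-envelope check on the figures quoted above.
# Assumption: 8 A100 GPUs per HGX A100 server (the standard 8-GPU HGX config).
# This is simple arithmetic on SemiAnalysis's published outputs, not its model.

GPUS_PER_HGX_A100 = 8

servers = 3_617            # HGX A100 servers, per SemiAnalysis
daily_cost_usd = 694_444   # hardware compute cost per day, per SemiAnalysis

gpus = servers * GPUS_PER_HGX_A100
print(f"GPUs: {gpus:,}")                                        # 28,936 -- matches the quote
print(f"Cost per GPU per day: ${daily_cost_usd / gpus:,.2f}")   # ~$24
print(f"Cost per GPU per hour: ${daily_cost_usd / gpus / 24:,.2f}")  # ~$1
```

In other words, the estimate works out to roughly a dollar per GPU-hour across nearly 29,000 A100s running around the clock.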
ChatGPT is one of the fastest-growing technologies ever, accumulating more than 100 million active users in January, just two months after its launch. For reference, it took TikTok nine months and Instagram two and a half years to reach the same milestone. Dylan Patel, chief analyst at SemiAnalysis, told Insider that this initial estimate is based on GPT-3, and that the newer GPT-4 model could be even more expensive to run.
The Information (paywalled) recently reported that Microsoft, a prominent backer of OpenAI, is working on a dedicated AI chip that is expected to reduce the cost of operations. OpenAI also launched its $20-per-month premium subscription, ChatGPT Plus, earlier this year to bring in some revenue.
SemiAnalysis also says that if the ChatGPT model were used to power Google's existing search business, it would eat $36 billion of the company's profits in estimated LLM costs alone.
"Currently deploying ChatGPT into every search performed by Google would require 512,820.51 A100 HGX servers with a total of 4,102,568 A100 GPUs. The total cost of these servers and networking exceeds $100 billion in Capex alone, of which Nvidia would receive a large share," the firm writes.
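The same 8-GPU-per-server assumption reproduces the Google-scale GPU count, and the quoted figures also imply a per-server price tag. The sketch below is again only arithmetic on the numbers in the quote; the per-server configuration is an assumption on our part.

```python
# Scaling the same arithmetic to the quoted Google-search deployment figures.
# Assumption: 8 A100 GPUs per HGX A100 server, as above.

GPUS_PER_HGX_A100 = 8

servers_for_google = 512_821   # ~512,820.51 servers, rounded up
gpus_for_google = servers_for_google * GPUS_PER_HGX_A100
print(f"GPUs: {gpus_for_google:,}")   # 4,102,568 -- matches the quote

capex_usd = 100e9                     # ">$100 billion in Capex" per the quote
print(f"Implied capex per server: ${capex_usd / servers_for_google:,.0f}")  # ~$195,000 (lower bound)
```

That implied figure of roughly $195,000 per server is only a lower bound, since the firm says the total exceeds $100 billion including networking.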
Via Insider, Digital Trends