
Mini Mills, DeepSeek, and Amazon’s Data Centers
In 1969, Nucor opened its first mini steel mill in Darlington, SC. Its small electric furnaces produced steel from scrap more cheaply and quickly than the big integrated mills' blast furnaces could make it from iron ore. Nucor disrupted the steel industry. Today, it's the largest and most diversified steel company in North America, with a $45 billion market cap. Mini mills now account for 70% of all steel produced in the U.S. Former industry giant U.S. Steel has only two integrated mills left, a market cap of $8 billion, and is seeking to be acquired. Sic transit gloria.
Will Chinese company DeepSeek disrupt the giant artificial intelligence (AI) models the way Nucor disrupted the giant integrated steel mills? What could that mean for Mississippi? Here’s what: Giant AI models are driving the growth of data centers. If DeepSeek disrupts their competitive advantage, the growth of data centers may stall. That would be bad for Mississippi.
Amazon is spending $10 billion to build two giant data centers near Canton. Compass Datacenters recently announced it will spend $10 billion to build hyperscale data centers in Meridian. Other companies could build more data centers here to access electricity generated from natural gas delivered by pipelines crossing the state. Data centers require vast amounts of electricity to provide computing power for giant AI models. The more giant AI models there are, the more data centers are needed.
The $20 billion in data center investments represents a significant stimulus for Mississippi’s economy. Entergy is investing an additional $2+ billion to provide electricity for Amazon’s data centers. Mississippi Power says it will supply 500 MW of power to Compass (estimated cost $500 million). This money circulates through the economy and multiplies. Some of it stays here—and that’s good for Mississippi.
OpenAI is a leading developer of giant AI models. Microsoft and others have invested approximately $100 billion in its development since 2016. OpenAI's ChatGPT is a popular app (I used it, DeepSeek, and other AI apps to get information for this article). Major players in giant AI models include Google's DeepMind, Anthropic, Meta AI, Cohere, IBM's Watsonx AI, and many others. Development costs for these AI giants may run into the trillions of dollars. The President's Stargate initiative proposes another $500 billion to ensure U.S. AI supremacy because of its perceived strategic value.
Giant AI models rely on special chips made by Nvidia. The U.S. has banned the sale of these chips to China. We smugly thought our technology and capital gave us an unassailable lead in AI—until last month when DeepSeek arrived on the scene. It jumped the technology and capital moats.
Sam Altman, cofounder of OpenAI, called DeepSeek a “Sputnik moment,” signaling a shift in the global AI race just as Sputnik did during the space race with the Soviet Union in 1957. DeepSeek highlights China’s growing competitiveness in AI. According to Altman, DeepSeek’s R1 model demonstrates impressive efficiency and achieves results comparable to OpenAI’s models at a fraction of the cost and computational resources. Altman said OpenAI must innovate and adapt its strategy accordingly.
DeepSeek was developed in Hangzhou, China, by Liang Wenfeng, a 2007 graduate of Zhejiang University. His company released its first AI model in November 2023 and became globally famous with the release of its R1 reasoning model in January 2025. The company claims R1 cost less than $6 million to develop! It is competitive for many applications with large language AI models that cost billions to develop—and it’s open source and free.
DeepSeek topped Apple’s App Store chart just days after its release. It caused a stock market dip led by tech icon Nvidia, which lost $600 billion in market cap on January 27—the largest single-day loss ever recorded.
How did DeepSeek achieve this? According to Informa TechTarget, its secret sauce is its model’s unique training approaches: Reinforcement Learning, Reward Engineering, Distillation, and Emergent Behavior Networks. These techniques are well-known in the AI world. Distillation may be key. It uses large AI models to train smaller, simpler ones. Critics say DeepSeek used large AI models developed by others to train its smaller models that focus on specific tasks and are easier to use—in effect accusing DeepSeek of stealing from larger models.
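For readers curious what distillation looks like in practice, here is a minimal, illustrative Python sketch (not DeepSeek's actual code). The teacher logits and student logits are made-up numbers; the idea is that a big "teacher" model's softened output probabilities become the training targets a small "student" model learns to match.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Convert raw model scores (logits) into probabilities.
    A temperature > 1 softens the distribution."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Hypothetical raw scores from a large "teacher" model for one input.
teacher_logits = [4.0, 1.0, 0.2]

# Softened teacher probabilities serve as the student's training targets;
# the softening exposes how the teacher ranks the "wrong" answers too.
soft_targets = softmax(teacher_logits, temperature=3.0)

# Hypothetical scores from a small "student" model on the same input.
student_logits = [3.0, 1.5, 0.5]
student_probs = softmax(student_logits, temperature=3.0)

# Training minimizes the gap (KL divergence) between the two distributions,
# nudging the cheap student toward the expensive teacher's behavior.
kl_gap = float(np.sum(soft_targets * (np.log(soft_targets) - np.log(student_probs))))
```

In a real training run this gap would be computed over millions of examples and used to update the student's weights, which is how a small, cheap model can inherit much of a giant model's capability.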
The New York Times and Emmerich Newspapers (Northside Sun) might say this is “the pot calling the kettle black.” Large language AI models train on copyrighted articles from newspapers and other publications without compensating their publishers. The New York Times is suing Microsoft and OpenAI for using its archives to train large AI models without permission or payment. Closer to home, Emmerich Newspapers is suing Japanese and Chinese companies for theft of copyright-protected content. Competition in this brave new digital world ain’t beanbag—fortunes are at stake, as is energy security (i.e., affordable and reliable electricity).
Speaking of affordable and reliable electricity: Data centers are not always a win-win for all Entergy and Mississippi Power customers. Entergy’s $2+ billion expenditure for a new power plant, transmission lines, and other infrastructure for Amazon’s data centers will likely raise rates for residential customers. That’s happened in other states with data center projects.
That happens because residential customers often pay part of the cost for power plants, transmission lines, and other infrastructure for data centers—even if they don’t directly benefit from them. Expedited data center project timelines increase costs further. Normally, the Mississippi Public Service Commission (PSC) determines a fair cap for residential customer cost-sharing; however, Mississippi’s Governor and legislature cut a secret deal exempting Amazon’s data center projects from normal PSC prudence reviews. If Compass’s data centers in Meridian weren’t given similar exemptions, Mississippi Power residential customers might catch a break.
Is there a giant AI bubble about to pop? Is DeepSeek today’s Nucor? Are large language models today’s equivalent of steel scrap—for DeepSeek to convert into mini models? Time will tell which models will be left standing.