DeepSeek: The best ChatGPT alternative or a hotbed of dubious claims?

DeepSeek whale logo on a geometric background.
(Image credit: DeepSeek)

The internet seemingly has a new favorite AI, and it's not the latest ChatGPT model from industry heavyweight OpenAI.

Soaring to the top of Apple's App Store, Chinese artificial intelligence chatbot DeepSeek has become the top-rated free productivity app, overtaking OpenAI's ChatGPT after a groundswell of popularity following the January 20 release of the DeepSeek-R1 "reasoning" model.

Beyond App Store leaderboards, claims surrounding DeepSeek's development and capabilities may be even more impressive. The company behind the LLM (Large Language Model) claims it cost less than $6 million to train its DeepSeek-V3 model and used limited hardware compared to its American contemporaries while achieving similar results.

However, while DeepSeek is proving popular with users and developers alike, mainly thanks to its favorable API pricing, all that glitters isn't gold when it comes to this app, and an air of controversy undercuts an otherwise successful launch of two highly capable AI models.

DeepSeek: What is DeepSeek?

DeepSeek was founded by Liang Wenfeng, a Chinese entrepreneur and co-founder of the High-Flyer hedge fund based in Hangzhou, Zhejiang, China. Originally, DeepSeek was intended to be an AGI (Artificial General Intelligence) research wing of High-Flyer, which has exclusively used AI in trading algorithms since 2021. However, since May 2023, DeepSeek has stood as its own company, with High-Flyer becoming one of its primary investors.

The company's DeepSeek LLM (Large Language Model) debuted in November 2023 as the open-source DeepSeek Coder and was followed by DeepSeek-V2 in May 2024. The company launched its latest DeepSeek-V3 model in December 2024 and has since seen a swell of popularity, with its mobile app racking up over 1.6 million downloads.

While the DeepSeek LLM is broadly similar to other popular chatbots like Google Gemini or ChatGPT, the app's free-to-use models are proving popular with users, and its developer-friendly API pricing is pushing it to the forefront of discussion.
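Part of that developer appeal is how little integration work the API demands. The sketch below is illustrative rather than official: it assumes DeepSeek's OpenAI-compatible endpoint, a placeholder API key, and the "deepseek-chat" / "deepseek-reasoner" model names as documented by DeepSeek at the time of writing, any of which could change.

```python
# Minimal sketch (not an official example): DeepSeek exposes an
# OpenAI-compatible API, so the standard `openai` Python client can be
# pointed at it. Base URL and model names follow DeepSeek's public
# documentation at the time of writing and may change.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",      # placeholder key from DeepSeek's platform
    base_url="https://api.deepseek.com",  # assumed OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",  # swap in "deepseek-reasoner" for the R1 reasoning model
    messages=[{"role": "user", "content": "In one sentence, what is a token?"}],
)

print(response.choices[0].message.content)
```

Because the interface matches OpenAI's, switching an existing app over is, in principle, a matter of changing the base URL, key, and model name rather than rewriting integration code.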

DeepSeek: Why is it important?

The old myth goes that during the space race of the 1960s, NASA spent millions in taxpayer dollars developing a space pen after realizing that ballpoint pens couldn't write in the zero-gravity environment of space. Meanwhile, their Soviet cosmonaut counterparts avoided such costs and headaches by simply using a pencil.

While none of that is true, it's a parable of thriftiness and practicality that makes for an excellent story.

Mirroring the legend of the space pen, however, DeepSeek has seemingly pulled off a similar feat of cost-effectiveness and practicality with its DeepSeek-V3 model, which it claims to have trained for less than $6 million, a fraction of the hundreds of millions spent by other companies chasing similar results, while achieving comparable levels of performance.

Not only that, but DeepSeek's recently released DeepSeek-R1 "reasoning" model is designed to simulate logical thought, sacrificing response speed for a more well-reasoned answer. It can achieve results equal to (if not better than) OpenAI's own "reasoning" model, o1, even as the company claims to be hamstrung by U.S. export restrictions on more powerful Nvidia GPUs.

DeepSeek: How much does it cost?

DeepSeek is free to use online via its web portal or on mobile (with both Android and iOS apps available).

However, the most groundbreaking impact of DeepSeek's emergence may be on the cost of AI for businesses, developers, and more, with the company's API pricing blowing the competition out of the water.

While OpenAI currently charges $15 per million tokens (the chunks of text that prompts and responses are broken into as a model processes them), DeepSeek charges only 55 cents per million tokens, a drop in charges for API users of up to 96 percent.
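That figure is simple arithmetic on the two prices quoted above; the back-of-envelope sketch below uses only those numbers, and real-world bills will vary with the model chosen, input versus output tokens, and caching.

```python
# Back-of-envelope check of the savings quoted above, using only the
# per-million-token prices cited in this article (illustrative figures,
# not a full price sheet).
openai_price_per_million = 15.00     # USD per million tokens, as quoted
deepseek_price_per_million = 0.55    # USD per million tokens, as quoted

savings = 1 - deepseek_price_per_million / openai_price_per_million
print(f"Relative saving: {savings:.1%}")  # prints "Relative saving: 96.3%"
```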

This cost difference could be game-changing for many professional users of AI and poses a significant risk to OpenAI's income, potentially forcing other companies to lower their prices to remain competitive.

DeepSeek's impact is already being felt in the markets, with several semiconductor names taking a hit, Nvidia chief among them. Following the release of DeepSeek's latest models, Nvidia's stock dropped 13.8% in Monday's pre-market trading, threatening to wipe almost $500 billion from the company's market capitalization, though the decline has since narrowed to around 11%.

DeepSeek: Controversy

If everything DeepSeek has to offer sounds too good to be true, that's potentially because some of DeepSeek's claims may be just that.

The performance of DeepSeek's V3 and R1 models cannot be disputed. Still, many questions remain about the company's actual pricing, its use of hardware, the cost of its training, and the sourcing of its training data.

The latter has already been the subject of some controversy. Several users reported that DeepSeek V3 would refer to itself as ChatGPT, potentially indicating that this model was trained on public data sets generated by OpenAI's GPT-4 model.

Speaking to TechCrunch, Mike Cook, a research fellow at King’s College London specializing in AI, backed these claims, stating, "Obviously, the model is seeing raw responses from ChatGPT."

Cook highlights that this may not be an intentional action by DeepSeek but also points out that the practice of training models on data generated by other models can be "very bad," likening it to "taking a photocopy of a photocopy" in the sense that the quality of outputs will degrade each time.

It's also possible that by adopting generated training data, DeepSeek will inherit the biases of the original model on top of the chatbot's own, which already enforce legally mandated censorship of narratives critical of the Communist Party of China (CCP), including the 1989 Tiananmen Square incident, the Hong Kong protests, the status of Taiwan, China's treatment of the Uighur people, and the occupation of Tibet.

This form of censorship only degrades trust in the platform, and founder Liang Wenfeng's ties to the CCP only heighten concerns about how user data may be used or how Chinese authorities could misappropriate the platform in the future.

Writing for Biometric Update, Anthony Kimery, former Editor-in-Chief and co-founder of Homeland Security Today, highlighted how the platform could "support disinformation campaigns aimed at destabilizing U.S. institutions."

DeepSeek's claims that it developed its models on less advanced hardware are also being questioned. Citi analyst Atif Malik states, "While DeepSeek's achievement could be groundbreaking, we question the notion that its feats were done without the use of advanced GPUs to fine-tune it and/or build the underlying LLMs the final model is based on through the distillation technique."

Malik's skepticism may carry further weight: while DeepSeek claims that its V3 model was trained using Nvidia H800 GPUs, Scale AI founder and CEO Alexandr Wang suggested in a recent CNBC interview that "DeepSeek has about fifty thousand H100s."

Those are the very GPUs the Biden administration blocked from export to China in 2023, with Wang continuing, "they can't talk about obviously because it is against the export controls that [the] United States has put in place."

Outlook

DeepSeek is a proven hit that will give companies like OpenAI something to think about as they work to retain their sizable user bases in the face of stiff competition.

However, it remains to be seen whether the new-car smell still lingering on DeepSeek's latest models is masking the odor of misinformation about how they were developed, and whether its pricing is sustainable in the long term.

Given the U.S.'s recent reaction to TikTok, it's hard to imagine that a company like DeepSeek will avoid serious scrutiny for much longer, especially as its models risk upsetting the apple cart for President Trump's plans to keep the United States the "world capital of AI."

Rael Hornby
Content Editor

Rael Hornby, potentially influenced by far too many LucasArts titles at an early age, once thought he’d grow up to be a mighty pirate. However, after several interventions with close friends and family members, you’re now much more likely to see his name attached to the bylines of tech articles. While not maintaining a double life as an aspiring writer by day and indie game dev by night, you’ll find him sat in a corner somewhere muttering to himself about microtransactions or hunting down promising indie games on Twitter.