In the fierce race for the AI GPU market, power-hungry AI accelerators are evolving at a rapid pace, driven by an ever-wider range of demanding AI applications. For years, NVIDIA has been the undisputed leader of that market, its GPUs powering a vast array of AI workloads. That may be about to change. Moore Threads, a young Chinese company positioning itself as an NVIDIA rival, recently announced the launch of its first AI GPU, the MTT-S4000, designed to power artificial intelligence applications and to challenge NVIDIA’s supremacy in the AI domain.
Moore Threads has already used the card to train a large language model (LLM) with three billion parameters, and the company says the MTT-S4000 can handle larger models with ease. For comparison, the largest LLM trained to date runs on a behemoth of a computer and can spit out 12 billion words an hour. ‘Even if we do nothing new, the chip is a total game-changer, not only for AI but also for general use,’ the company adds. ‘We now have the potential to gain a large lead over the NVIDIAs of the world.’
With 16 GB of GDDR6 memory, PCIe 4.0 connectivity, and a 7 nm manufacturing process, the MTT-S4000 is designed to maximise performance within a peak power draw of 250 W. In an LLM training benchmark against unspecified NVIDIA hardware, the card’s performance was reported to be highly competitive, hinting at a shifting landscape in the AI GPU sector.
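To put those figures in context, here is a rough back-of-the-envelope sketch of what training a three-billion-parameter model asks of a 16 GB card. The per-parameter byte counts are standard rules of thumb for FP16 weights with an Adam-style optimiser, not figures from Moore Threads, and real frameworks shrink the footprint with tricks such as mixed precision, gradient checkpointing, offloading, or spreading the model across several cards.

```python
# Back-of-the-envelope memory estimate for training a 3B-parameter model.
# The per-parameter byte counts below are common rules of thumb for FP16
# training with an Adam-style optimiser, not figures from Moore Threads.
PARAMS = 3e9               # three billion parameters, as claimed for the demo
BYTES_WEIGHTS = 2          # FP16 weights
BYTES_GRADS = 2            # FP16 gradients
BYTES_OPTIMISER = 12       # FP32 master weights + Adam moments (4 + 4 + 4)

GIB = 1024 ** 3

weights_gib = PARAMS * BYTES_WEIGHTS / GIB
training_gib = PARAMS * (BYTES_WEIGHTS + BYTES_GRADS + BYTES_OPTIMISER) / GIB

print(f"weights alone:        ~{weights_gib:.1f} GiB")   # ~5.6 GiB
print(f"naive training state: ~{training_gib:.1f} GiB")  # ~44.7 GiB
print("card memory:           16 GiB (MTT-S4000, as announced)")
```

On these assumptions the model’s weights alone fit comfortably, but the full naive training state does not, which suggests the reported 3B-parameter run relied on memory-saving techniques or on multiple cards working together.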
The fact that the MTT-S4000 is being deployed to train large language models (LLMs) – perhaps the most demanding and ambitious real-world application for AI – shows how pervasive the technology is becoming, and how the stakes are rising in every area of human activity, from tech to medicine to finance. That, in turn, means ever-greater demand for fast and powerful AI GPUs. The MTT-S4000 represents the kind of development that could erode NVIDIA’s dominance of the AI GPU market, should Moore Threads’ future success prompt tech giants such as Alphabet, Microsoft, Amazon, and Meta to explore other options.
The MTT-S4000 is just another AI processing unit, and NVIDIA remains the name everyone thinks of, but the fact that Moore Threads’ card exists at all shows that this is a field ripe for disruption. That is good news for the industry, which gains more room to innovate – and good news for customers, who may find that one company’s AI solution suits them better than another’s.
Even as Moore Threads enters the market, it should be noted that NVIDIA still holds a dominant position in AI and deep-learning technology, backed by considerable effort and a demonstrated long-term commitment to staying at the top. The arrival of the MTT-S4000, however, is a clear sign that the ring can accommodate more contenders, and that NVIDIA may need to watch its back.
To sustain its leadership in AI, NVIDIA will likely push the limits of AI GPU technology even further. That ethos of innovation helps future-proof not only NVIDIA’s products but the industry as a whole, pushing into new areas and adapting constantly to the fast-changing needs of AI applications.
And as the AI space grows, expect the rivalry between NVIDIA and up-and-coming entrants such as Moore Threads to grow as well – the kind of competition that tends to accelerate the development of newer, more powerful, and more efficient AI processing units. Whatever applications that research and development ends up serving, it is good news for end-users and for anyone who benefits from the growth of AI.
NVIDIA, a giant of the computing world, has transformed the industry with innovations in GPU design, artificial intelligence, and other technologies. Through sustained, cutting-edge research and development, it has made a considerable impact on artificial intelligence, gaming, and professional visualization. As the world increasingly turns to AI, NVIDIA is using its resources to keep steering innovation in AI and computing.