DeepSeek-V3 Open-Source AI Model With Mixture-of-Experts Architecture Released

DeepSeek, a Chinese artificial intelligence (AI) firm, released the DeepSeek-V3 AI model on Thursday. The new open-source large language model (LLM) features a massive 671 billion parameters, surpassing Meta's Llama 3.1 model, which has 405 billion parameters. Despite its size, the researchers claimed the LLM is geared towards efficiency thanks to its mixture-of-experts (MoE) architecture, which activates only the parameters relevant to the task at hand, ensuring both efficiency and accuracy. Notably, it is a text-based model and does not have multimodal capabilities.

DeepSeek-V3 AI Model Released

The open-source DeepSeek-V3 AI model is hosted on Hugging Face. According to the listing, the LLM is geared towards efficient inference and cost-effective training. To achieve this, the researchers adopted the Multi-head Latent Attention (MLA) and DeepSeekMoE architectures.

Essentially, the AI model activates only the parameters relevant to the topic of the prompt, enabling faster processing and higher accuracy than typical dense models of this size. Pre-trained on 14.8 trillion tokens, DeepSeek-V3 also uses techniques such as supervised fine-tuning and reinforcement learning to generate high-quality responses.
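To make the idea concrete, here is a minimal sketch of top-k expert routing, the general mechanism behind MoE layers. The expert count, dimensions, and top-k value below are illustrative assumptions, not DeepSeek-V3's actual configuration.

```python
import torch
import torch.nn as nn

class TopKMoELayer(nn.Module):
    """Minimal mixture-of-experts layer: a router scores the experts for
    each token and only the top-k experts actually run, so most of the
    layer's parameters stay inactive for any given token."""
    def __init__(self, dim=512, n_experts=8, k=2):  # illustrative sizes
        super().__init__()
        self.router = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                          nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )
        self.k = k

    def forward(self, x):                            # x: (tokens, dim)
        scores = self.router(x)                      # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # keep top-k per token
        weights = weights.softmax(dim=-1)            # normalise the k weights
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e             # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = TopKMoELayer()
print(layer(torch.randn(4, 512)).shape)  # torch.Size([4, 512])
```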

The Chinese firm claimed that, despite its size, the AI model was fully trained in 2.788 million Nvidia H800 GPU hours. DeepSeek-V3's architecture also includes a load-balancing technique, first used in its predecessor, to minimise performance degradation.
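For V3, DeepSeek's technical report describes this balancing as auxiliary-loss-free: instead of adding a penalty term to the training loss, the router keeps a per-expert bias that is nudged down for overloaded experts and up for underloaded ones. A minimal sketch of that idea follows; the sign-based update rule and the step size are illustrative assumptions, not DeepSeek's published values.

```python
import torch

def update_router_bias(bias, expert_counts, update_rate=1e-3):
    """Nudge per-expert routing biases toward a uniform load: experts
    that received more tokens than average are biased down, the rest up.
    The step size and sign-based rule are illustrative only."""
    target = expert_counts.float().mean()   # ideal tokens per expert
    bias += update_rate * torch.sign(target - expert_counts.float())
    return bias

bias = torch.zeros(8)                                # one bias per expert
counts = torch.tensor([40, 5, 12, 9, 30, 2, 1, 1])   # tokens routed last step
bias = update_router_bias(bias, counts)
print(bias)  # experts 0 and 4 (overloaded) now carry a slightly lower bias
```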

Coming to performance, the researchers shared evals from internal testing and claimed that the model outperforms Meta's Llama 3.1 and Qwen 2.5 on the BIG-Bench Hard (BBH), Massive Multitask Language Understanding (MMLU), HumanEval, MATH, and several other benchmarks. However, these results have not yet been verified by third-party researchers.

One of the main highlights of DeepSeek-V3 is its massive size of 671 billion parameters. While larger models exist (Gemini 1.5 Pro, for instance, is reported to have one trillion parameters), such scale is rare in the open-source space. Prior to this release, the largest open-source AI model was Meta's Llama 3.1, with 405 billion parameters.

At present, DeepSeek-V3's code can be accessed via its Hugging Face listing under an MIT license, which permits both personal and commercial usage. The AI model can also be tested via the company's online chatbot platform, and those looking to build with it can access the API as well.
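DeepSeek documents its API as OpenAI-compatible, so a minimal request can be sketched with the standard openai Python client. The base URL and model name below follow DeepSeek's documentation at the time of writing and may change; the API key is a placeholder.

```python
from openai import OpenAI

# Base URL and model name follow DeepSeek's docs at the time of writing;
# treat both as assumptions that may change. The key is a placeholder.
client = OpenAI(api_key="YOUR_DEEPSEEK_API_KEY",
                base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",  # the endpoint served by DeepSeek-V3
    messages=[{"role": "user",
               "content": "In one sentence, what is a mixture-of-experts model?"}],
)
print(response.choices[0].message.content)
```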
