Repository brief
deepseek-ai/DeepSeek-V3
Cached analysis
Cached: 2026-03-29T22:26:40.066Z
deepseek-ai/DeepSeek-V3
DeepSeek-V3 is a large open-source MoE language model repository from deepseek-ai with very high community interest (102,414 stars, 16,610 forks). It includes model information, weights documentation, inference code, figures, and separate code/model licenses. The README describes a 671B-parameter model with 37B activated per token, trained on 14.8T tokens, with reported strong benchmark performance and stable training.
Stars: 102,414
Forks: 16,610
Default branch: main
Last pushed: 2025-08-28T03:24:37Z
Best maintained: None
Closest to upstream: 7ttp/DeepSeek-V3
Most feature-rich: VPTQ/DeepSeek-V3
Most opinionated: VPTQ/DeepSeek-V3
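The cached stats above correspond to fields in the GitHub REST API's repository object (`GET /repos/{owner}/{repo}`: `stargazers_count`, `forks_count`, `default_branch`, `pushed_at`). A minimal sketch of turning such a cached payload into the one-line brief shown here; the `repo_brief` helper is hypothetical, and the dict stands in for a real API response:

```python
# A cached GitHub API repository payload (stub mirroring the stats above;
# in practice this would be the parsed JSON from GET /repos/{owner}/{repo}).
cached_repo = {
    "full_name": "deepseek-ai/DeepSeek-V3",
    "stargazers_count": 102414,
    "forks_count": 16610,
    "default_branch": "main",
    "pushed_at": "2025-08-28T03:24:37Z",
}

def repo_brief(repo: dict) -> str:
    """Format the key repository stats as a one-line summary."""
    return (
        f"{repo['full_name']}: {repo['stargazers_count']:,} stars, "
        f"{repo['forks_count']:,} forks, default branch "
        f"{repo['default_branch']}, last pushed {repo['pushed_at']}"
    )

print(repo_brief(cached_repo))
```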
Forks
6 cached fork briefs