Cached analysis
cached 2026-03-30T12:08:39.157Z
jingyaogong/minimind
MiniMind is a highly popular open-source Python project for training a very small GPT-style language model from scratch. The repository emphasizes a full LLM training stack rather than inference alone, shipping code and data for pretraining, SFT, LoRA, RLHF/RLAIF, tool use, agentic RL, distillation, and evaluation, plus a minimal OpenAI-compatible server and chat UI. It is actively maintained, not archived, and shows strong adoption signals with 44,715 stars and 5,389 forks.
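Because the repo exposes an OpenAI-compatible server, a standard chat-completions request body should work against it. A minimal sketch of building such a payload, assuming a hypothetical local endpoint and model name (`http://localhost:8000/v1/chat/completions`, `minimind`) that may differ from the project's actual defaults:

```python
import json

# Assumed local endpoint; check the repo's server docs for the real port/path.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "minimind") -> bytes:
    """Build an OpenAI-style chat-completions request body as JSON bytes."""
    payload = {
        "model": model,  # hypothetical model name for illustration
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return json.dumps(payload).encode("utf-8")

# The resulting bytes can be POSTed with any HTTP client
# (e.g. urllib.request or requests) to BASE_URL.
body = build_chat_request("Explain LoRA in one sentence.")
```

Any client that speaks the OpenAI chat API (including the official `openai` SDK pointed at a custom `base_url`) should interoperate with such a server.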