Open Release of Grok-1
March 17, 2024

We are releasing the base model weights and network architecture of Grok-1, our large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.
This is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023. This means that the model is not fine-tuned for any specific application.
Read more at x.ai