Grok-1
We are releasing the base model weights and network architecture of Grok-1, our large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.
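In a Mixture-of-Experts layer, a gating network scores each token against a set of expert feed-forward networks, and only the top-scoring experts run for that token, so most parameters stay inactive on any given forward pass. The toy sketch below illustrates that routing pattern with numpy; the layer sizes, the single-matrix "experts", and the `moe_layer` function are illustrative assumptions, not Grok-1's actual implementation (the release notes describe 2 of 8 experts active per token).

```python
import numpy as np

def moe_layer(x, gate_w, expert_ws, k=2):
    """Toy MoE forward pass: route each token to its top-k experts and
    combine their outputs, weighted by renormalized softmax gate scores.
    Illustrative sketch only -- not Grok-1's real architecture."""
    logits = x @ gate_w                        # (tokens, n_experts) gate scores
    top = np.argsort(logits, axis=-1)[:, -k:]  # top-k expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = np.exp(logits[t, top[t]])
        scores /= scores.sum()                 # softmax over the k selected experts
        for w, e in zip(scores, top[t]):
            out[t] += w * (x[t] @ expert_ws[e])  # only k of n_experts ever run
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 4, 8, 3
x = rng.normal(size=(tokens, d))
gate_w = rng.normal(size=(d, n_experts))                 # gating network
expert_ws = rng.normal(size=(n_experts, d, d))           # one weight matrix per expert
y = moe_layer(x, gate_w, expert_ws)
print(y.shape)
```

With `k=2` and 8 experts, each token touches only a quarter of the expert weights, which is how a 314B-parameter model keeps per-token compute far below its total size.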