Grok-1 Is Here: It's The Largest Open-Source LLM With 314B Parameters
Elon Musk’s artificial intelligence company, xAI, has released the weights and architecture of its 314-billion-parameter Mixture-of-Experts model, Grok-1, under the Apache 2.0 license.
This follows Musk's pledge last Monday to make Grok freely available to the public. As someone who has been closely following developments in the AI space, I have to say this is a massive step forward for openness and accessibility.
What is Grok?
Grok-1 is a Mixture-of-Experts language model with 314 billion parameters, making it the largest open-source model currently available. For context, that’s nearly double the size of OpenAI’s GPT-3, which weighed in at 175 billion parameters and was considered a breakthrough when it was released in 2020.
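A quick primer on what “Mixture-of-Experts” means in practice: instead of one dense feed-forward block per transformer layer, an MoE layer holds several parallel “expert” networks plus a small router that sends each token to only a few of them. In Grok-1’s case, xAI reports 8 experts with 2 active per token, so only a fraction of the 314 billion parameters is used on any given forward pass. Here is a minimal, illustrative sketch of top-2 routing in plain NumPy; the layer sizes are invented for readability and are not Grok-1’s real dimensions.

```python
# Minimal top-2 Mixture-of-Experts routing sketch (illustrative only;
# dimensions are made up and far smaller than Grok-1's).
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 16, 64        # hypothetical hidden sizes
n_experts, top_k = 8, 2       # Grok-1 reportedly activates 2 of 8 experts

# Each expert is a small two-layer feed-forward network.
experts = [
    (rng.standard_normal((d_model, d_ff)) * 0.1,
     rng.standard_normal((d_ff, d_model)) * 0.1)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.1  # gating weights

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router                            # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                    # vectorized in real systems
        chosen = logits[t, top[t]]
        gates = np.exp(chosen - chosen.max())
        gates /= gates.sum()                       # softmax over chosen experts
        for gate, e in zip(gates, top[t]):
            w1, w2 = experts[e]
            hidden = np.maximum(x[t] @ w1, 0.0)    # ReLU feed-forward expert
            out[t] += gate * (hidden @ w2)
    return out

tokens = rng.standard_normal((4, d_model))  # 4 example token vectors
print(moe_layer(tokens).shape)              # (4, 16)
```

The takeaway is that total parameter count and per-token compute are decoupled: adding experts grows the model’s capacity without proportionally growing the cost of each forward pass, which is how a 314B-parameter model stays relatively affordable to run.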
In a chart shared by X user Andrew Kean Gao, you can see just how much larger Grok is than its competitors.
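The raw parameter count also translates directly into hardware requirements. As a back-of-the-envelope calculation (mine, not xAI’s), just holding the weights in memory at common precisions looks like this:

```python
# Rough memory footprint of 314B parameters at common precisions,
# plus the size ratio to GPT-3's 175B. Back-of-the-envelope only.
params = 314e9

for precision, bytes_per_param in [("bf16/fp16", 2), ("int8", 1)]:
    gigabytes = params * bytes_per_param / 1e9
    print(f"{precision}: ~{gigabytes:,.0f} GB of weights")

print(f"Grok-1 vs. GPT-3: {314 / 175:.2f}x")  # ~1.79x, i.e. nearly double
```

At bf16 that is roughly 628 GB of weights alone, far beyond any single consumer GPU, which is why actually running Grok-1 calls for a multi-GPU machine.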