Grok 1 by hpcai-tech


Tags: AutoTrain compatible, Custom code, PyTorch, Region: US
Model Card on HF 🤗: https://huggingface.co/hpcai-tech/grok-1

Grok 1 Benchmarks

[Benchmark chart for Grok 1 (hpcai-tech/grok-1): each score (nn.n%) shows how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").]

Grok 1 Parameters and Internals

Model Type: text-generation
Additional Notes: This PyTorch version was converted from the original JAX release; it uses parallelism techniques from the ColossalAI framework for accelerated inference, and a compatible tokenizer was contributed by Xenova and ArthurZ (see the loading sketch after this table).
LLM Name: Grok 1
Repository 🤗: https://huggingface.co/hpcai-tech/grok-1
Required VRAM: 9.7 GB
Updated: 2024-12-26
Maintainer: hpcai-tech
Model Files: 43 shards (one of 9.7 GB, forty-two of ~9.8 GB; ≈421 GB total)
Model Architecture: Grok1ModelForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.35.0
Vocabulary Size: 131072
Torch Data Type: bfloat16
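
Since the repo carries the "Custom code" tag and a Grok1ModelForCausalLM architecture, loading it through transformers requires trust_remote_code=True. Below is a minimal sketch, not taken from the model card itself, of loading and sampling from the checkpoint with the settings listed above (bfloat16, 8192-token context). The device_map="auto" sharding is an assumption: the bf16 weights total roughly 421 GB, so a single GPU will not hold them.

```python
# Minimal sketch (assumed usage, not from the model card): load
# hpcai-tech/grok-1 via Hugging Face transformers. trust_remote_code=True
# pulls in the repo's custom Grok1ModelForCausalLM implementation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "hpcai-tech/grok-1"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,      # repo ships custom modeling code
    torch_dtype=torch.bfloat16,  # matches "Torch Data Type" above
    device_map="auto",           # assumption: shard ~421 GB of weights across all visible GPUs
)
model.eval()

prompt = "The answer to life, the universe, and everything is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that device_map="auto" depends on the accelerate package; the ColossalAI-based parallel inference mentioned in the notes above lives in the maintainer's own example scripts and is not shown in this snippet.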


Original data from HuggingFace, OpenCompass, and various public git repos.
Release v20241227