| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| Starcoder2 15B Instruct V0.1 | — | 16K / 31.9 GB | 2678 | 72 |
| Dolphincoder Starcoder2 15B | — | 16K / 31.9 GB | 70 | 65 |
| Starcoder2 15B Instruct | — | 16K / 31.9 GB | 268 | 5 |
| Starchat2 15B Sft V0.1 | — | 16K / 31.9 GB | 206 | 3 |
| OpenCodeInterpreter SC2 15B | — | 16K / 31.9 GB | 103 | 3 |
| Starcoder2 15B OCI | — | 16K / 31.9 GB | 3 | 3 |
| Opencsg Starcoder2 15B V0.1 | — | 16K / 31.9 GB | 7 | 2 |
| Starcoder2 15B | — | 16K / 63.8 GB | 41081 | 496 |
| Starcoder2 15B 4bit | — | 16K / 9.5 GB | 6 | 0 |
| Speechless Starcoder2 15B | — | 16K / 31.9 GB | 119 | 2 |
| LLM Name | Starchat2 15B V0.1 |
|---|---|
| Repository | Open on 🤗 |
| Base Model(s) | |
| Model Size | 15b |
| Required VRAM | 31.9 GB |
| Updated | 2024-05-14 |
| Maintainer | HuggingFaceH4 |
| Model Type | starcoder2 |
| Model Files | |
| Model Architecture | Starcoder2ForCausalLM |
| Context Length | 16384 |
| Model Max Length | 16384 |
| Transformers Version | 4.39.0.dev0 |
| Tokenizer Class | GPT2Tokenizer |
| Padding Token | `<\|im_end\|>` |
| Vocabulary Size | 49154 |
| Initializer Range | 0.01275 |
| Torch Data Type | bfloat16 |
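The spec table above (bfloat16 weights, `<|im_end|>` padding token, 16384-token context) maps directly onto a standard `transformers` loading pattern. The sketch below is a minimal illustration, not maintainer-provided code: the repo id `HuggingFaceH4/starchat2-15b-v0.1` is inferred from the Maintainer and LLM Name rows, and the ChatML-style prompt format is an assumption suggested by the `<|im_end|>` token.

```python
def format_chatml(messages):
    """Render a message list in the ChatML style implied by the
    <|im_end|> padding token in the spec table above (an assumption,
    not maintainer-confirmed)."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")  # generation prompt
    return "\n".join(parts)


def generate_reply(messages, max_new_tokens=128):
    """Load the model and generate one reply. Imports stay inside the
    function because loading needs ~31.9 GB of VRAM per the table above."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "HuggingFaceH4/starchat2-15b-v0.1"  # inferred repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # matches the Torch Data Type row
        device_map="auto",           # shard across available devices
    )
    # Prefer tokenizer.apply_chat_template() when the repo ships a chat
    # template; format_chatml is a fallback for the same convention.
    prompt = format_chatml(messages)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

With `device_map="auto"`, the 31.9 GB of bfloat16 weights are spread across whatever GPUs are visible; a single 24 GB card would need the 4-bit variant listed in the alternatives table instead.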