| LLM Name | BigYi 15.75B 200K |
|---|---|
| Repository 🤗 | https://huggingface.co/Joseph717171/BigYi-15.75B-200k |
| Merged Model | Yes |
| Model Size | 15.75b |
| Required VRAM | 30.3 GB |
| Updated | 2025-04-27 |
| Maintainer | Joseph717171 |
| Model Type | llama |
| Model Files | |
| Model Architecture | LlamaForCausalLM |
| License | other |
| Context Length | 262144 |
| Model Max Length | 262144 |
| Transformers Version | 4.40.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<unk>` |
| Vocabulary Size | 64000 |
| Torch Data Type | bfloat16 |
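A minimal loading sketch based on the table above: the repository id, bfloat16 dtype, and LlamaTokenizer-compatible tokenizer come from the card, while the prompt, `device_map="auto"`, and generation settings are illustrative assumptions (the ~30.3 GB of bfloat16 weights need a correspondingly large GPU or multi-GPU setup).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Joseph717171/BigYi-15.75B-200k"  # repository from the card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    device_map="auto",           # assumption: shard across available devices
)

# Illustrative prompt; the model supports contexts up to 262144 tokens.
inputs = tokenizer("The Yi architecture", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```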
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Yi 9B 200K | 256K / 17.7 GB | 4607 | 75 |
| Yi Coder 9B Chat | 128K / 17.7 GB | 2217 | 202 |
| Yi Coder 9B | 128K / 17.7 GB | 2391 | 43 |
| Llm4decompile 9B V2 | 128K / 17.7 GB | 74 | 17 |
| CursorCore Yi 9B | 128K / 17.7 GB | 2 | 1 |
| Yi Coder 9B Chat Instruct TIES | 128K / 17.7 GB | 16 | 0 |
| Yi 1.5 9B 32K | 32K / 17.7 GB | 5140 | 18 |
| Faro Yi 9B DPO | 32K / 17.7 GB | 2011 | 29 |
| EVA Yi 1.5 9B 32K V1 | 32K / 17.7 GB | 13 | 13 |
| Faro Yi 9B | 32K / 17.7 GB | 1988 | 16 |