LLM Name | Sakura 13B LNovel V0.8 |
Repository 🤗 | https://huggingface.co/SakuraLLM/Sakura-13B-LNovel-v0.8 |
Model Size | 13b |
Required VRAM | 27.8 GB |
Updated | 2025-02-05 |
Maintainer | SakuraLLM |
Model Type | baichuan |
Model Files | |
Model Architecture | BaichuanForCausalLM |
License | apache-2.0 |
Model Max Length | 4096 |
Transformers Version | 4.33.3 |
Tokenizer Class | BaichuanTokenizer |
Beginning of Sentence Token | <s> |
End of Sentence Token | </s> |
Unk Token | <unk> |
Vocabulary Size | 125696 |
Torch Data Type | bfloat16 |
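
The metadata above (Baichuan architecture, custom tokenizer, bfloat16 weights, 4096-token context) implies the checkpoint is loaded through the standard `transformers` auto classes with remote code enabled. Below is a minimal loading sketch based on those listed values; the translation prompt is illustrative only and the exact template expected by this checkpoint should be taken from the repository's model card.

```python
# Minimal loading sketch (assumes transformers >= 4.33.3 and roughly 28 GB of VRAM,
# per the table above). Prompt format is illustrative, not the official template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "SakuraLLM/Sakura-13B-LNovel-v0.8"

# Baichuan-architecture checkpoints ship custom modeling/tokenizer code,
# so trust_remote_code must be enabled.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    device_map="auto",
    trust_remote_code=True,
)

# Example request (hypothetical prompt): translate a Japanese sentence to Chinese.
prompt = "将下面的日文文本翻译成中文：今日はいい天気ですね。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```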
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Tiny Random Baichuan2 13B | 0K / 0.1 GB | 128700 | 0 |
Baichuan2 13B Chat | 0K / 27.8 GB | 32732 | 425 |
Baichuan 13B Chat | 0K / 26.5 GB | 3356 | 631 |
ShieldLM 13B Baichuan2 | 0K / 27.8 GB | 7 | 3 |
Baichuan2 13B Base | 0K / 27.8 GB | 1180 | 78 |
Blossom V3.1 Baichuan2 13B | 0K / 27.8 GB | 5 | 1 |
HuatuoGPT2 13B | 0K / 29.1 GB | 39 | 6 |
Buffer Baichuan2 13B Rag 4bits | 0K / 9.9 GB | 6 | 0 |
Buffer Baichuan2 13B Rag | 0K / 27.8 GB | 15 | 1 |
... Efficient Training Of LLMs V1 | 0K / 29.1 GB | 34 | 1 |