| Property | Value |
|---|---|
| LLM Name | Etri Ones Solar |
| Repository | Open on 🤗 Hugging Face |
| Required VRAM | 42.9 GB |
| Updated | 2024-07-27 |
| Maintainer | leejaymin |
| Model Type | llama |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | ko |
| Model Architecture | LlamaForCausalLM |
| License | mit |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.34.1 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | `<unk>` |
| Vocabulary Size | 32000 |
| Torch Data Type | float32 |
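The 42.9 GB VRAM figure follows from the weight storage alone: a SOLAR-class model has roughly 10.7B parameters (an assumption — the card lists only the VRAM figure, not the parameter count), and the listed float32 dtype stores 4 bytes per parameter. A minimal sketch of the arithmetic, which also shows why loading in half precision roughly halves the requirement:

```python
def weight_vram_gb(n_params: float, bytes_per_param: int) -> float:
    """Estimate GB needed just to hold the model weights (excludes
    activations and KV cache, which add more at inference time)."""
    return n_params * bytes_per_param / 1e9

# ~10.7B parameters is an assumption for a SOLAR-derived llama model.
fp32 = weight_vram_gb(10.7e9, 4)  # float32: 4 bytes per parameter
fp16 = weight_vram_gb(10.7e9, 2)  # float16/bfloat16: 2 bytes per parameter
print(f"float32: {fp32:.1f} GB, float16: {fp16:.1f} GB")
```

The float32 estimate (~42.8 GB) matches the card's 42.9 GB figure; passing `torch_dtype=torch.float16` to `from_pretrained` would bring the weights down to roughly half that.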
| Best Alternatives | HF Rank | Context / RAM | Downloads | Likes |
|---|---|---|---|---|
| Reverse Instruct | 0.2 | 32K / 27 GB | 12 | 3 |
| Law Chat | 0.2 | 4K / 27 GB | 1373 | 27 |
| Phi 3 Orpo V8.16 | 0.2 | 4K / 7.6 GB | 243 | 0 |
| Tinyllama Python | 0.2 | 4K / 2.2 GB | 9 | 1 |
| Small Instruct | 0.2 | 4K / 2.9 GB | 1754 | 1 |
| Taiwan LLaMa V1.0 | 0.2 | 4K / 26 GB | 68 | 76 |
| Taiwan LLaMa V0.0 | 0.2 | 4K / 26 GB | 16 | 1 |
| Model 007 Preview | 0.2 | 4K / 138 GB | 16 | 1 |
| Taiwan LLaMa V0.9 | 0.2 | 4K / 26 GB | 15 | 0 |
| Kolong Llama V0.1 | 0.2 | 2K / 13.7 GB | 1748 | 0 |