| LLM Name | Gaja V2.00 DPO |
|---|---|
| Repository 🤗 | https://huggingface.co/damerajee/Gaja-v2.00-dpo |
| Model Size | 6.9b |
| Required VRAM | 13.9 GB |
| Updated | 2024-10-17 |
| Maintainer | damerajee |
| Model Type | llama |
| Model Files | |
| Supported Languages | en, hi |
| Model Architecture | LlamaForCausalLM |
| License | llama2 |
| Context Length | 4096 |
| Model Max Length | 4096 |
| Transformers Version | 4.38.2 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | \<unk\> |
| Vocabulary Size | 48064 |
| Torch Data Type | float16 |
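Given the details above (a `LlamaForCausalLM` model in float16 requiring roughly 13.9 GB of VRAM), a minimal loading sketch with the `transformers` library might look like the following. The prompt text and generation parameters are illustrative, not from the model card.

```python
# Minimal sketch: loading Gaja V2.00 DPO with transformers (>= 4.38.2).
# Assumes a GPU with ~14 GB of VRAM for the float16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "damerajee/Gaja-v2.00-dpo"  # repository listed in the card above
MAX_LENGTH = 4096                      # model max length from the card above


def load_model():
    """Load tokenizer and model as described in the card (float16 weights)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # card lists float16 as the torch dtype
        device_map="auto",          # place weights on the available GPU(s)
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_model()
    # Illustrative prompt; the model supports English and Hindi.
    prompt = "Translate to Hindi: How are you today?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The `device_map="auto"` argument lets `accelerate` shard or place the weights automatically; on a single 16 GB GPU the full float16 model fits in memory.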
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Bolna Lead Qualification | 16K / 13.8 GB | 3 | 3 |
| Fiufiu 71 | 4K / 13.8 GB | 673 | 0 |
| Fiufiu 87 | 4K / 13.8 GB | 501 | 0 |
| Sn37 V3 | 4K / 13.8 GB | 417 | 0 |
| Sn37 V5 | 4K / 13.8 GB | 319 | 0 |
| Happymeow | 4K / 13.8 GB | 3167 | 0 |
| SambaLingo Thai Chat | 4K / 13.9 GB | 558 | 36 |
| Subnet9key16 | 4K / 13.8 GB | 3225 | 0 |
| Nine Aa | 4K / 13.8 GB | 286 | 0 |
| Jw2 | 4K / 13.8 GB | 430 | 0 |