| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Openchat 3.5 16K GGUF | 22 | 39 | 3 GB |
| Openchat 3.5 16K GPTQ | 6 | 14 | 4 GB |
| Openchat 3.5 16K AWQ | 2 | 3723 | 4 GB |
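The VRAM figures above give a rough sense of how aggressively each variant is quantized. As a sketch (assuming the model's 7.2B parameter count and treating the listed sizes as decimal gigabytes, both approximations), the implied average bits per weight can be computed directly:

```python
# Approximate bits per weight implied by each quantized variant's size.
# Assumes ~7.2e9 parameters and decimal GB; the listed VRAM figures are
# rounded, so these are ballpark values only.
PARAMS = 7.2e9

def bits_per_weight(size_gb: float, params: float = PARAMS) -> float:
    """Convert a size in decimal GB to average bits per parameter."""
    return size_gb * 1e9 * 8 / params

for name, gb in [("GGUF", 3), ("GPTQ", 4), ("AWQ", 4)]:
    print(f"{name}: ~{bits_per_weight(gb):.1f} bits/weight")
```

The ~3.3 bits/weight for the GGUF file and ~4.4 for GPTQ/AWQ are consistent with typical 3-bit and 4-bit quantization schemes once per-block scaling overhead is included.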
| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| Openchat 3.5 0106 128K DPO | — | 128K / 14.4 GB | 607 | 2 |
| ...nstruct V1 Gdpr Finetuned V1.2 | — | 32K / 4.1 GB | 18 | 0 |
| Mixtral AI Cyber 3.0 | — | 32K / 14.3 GB | 620 | 0 |
| Em German Leo Mistral | — | 32K / 14.4 GB | 2948 | 63 |
| Multi Verse Model | — | 32K / 14.4 GB | 5728 | 34 |
| Go Bruins V2 | — | 32K / 14.4 GB | 936 | 30 |
| Go Bruins V2.1.1 | — | 32K / 14.4 GB | 1243 | 23 |
| Em German Mistral V01 | — | 32K / 14.4 GB | 107 | 18 |
| Phoenix | — | 32K / 14.4 GB | 25 | 16 |
| Navarna V0 1 OpenHermes Hindi | — | 32K / 14.4 GB | 14 | 16 |
| LLM Name | Openchat 3.5 16K |
|---|---|
| Repository | Open on Hugging Face |
| Model Size | 7.2B |
| Required VRAM | 14.4 GB |
| Updated | 2024-07-07 |
| Maintainer | NurtureAI |
| Model Type | mistral |
| Model Files | |
| Model Architecture | MistralForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.34.0 |
| Tokenizer Class | LlamaTokenizer |
| Vocabulary Size | 32002 |
| Initializer Range | 0.02 |
| Torch Data Type | float16 |
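The Required VRAM figure follows directly from the model size and data type: 7.2B parameters stored as float16 take 2 bytes each. A quick sanity check (using decimal gigabytes):

```python
# Sanity check: weight memory for 7.2B float16 parameters.
params = 7.2e9
bytes_per_param = 2  # float16 = 16 bits = 2 bytes
vram_gb = params * bytes_per_param / 1e9  # decimal GB
print(vram_gb)  # 14.4
```

This matches the 14.4 GB listed above; actual usage at inference time will be somewhat higher due to activations and the KV cache, especially near the full 32768-token context.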