Model | Likes | Downloads | VRAM
---|---|---|---
Mixtral 8x7B V0.1 GGUF | 415 | 10165 | 15 GB |
Mixtral 8x7B V0.1 GPTQ | 125 | 2284 | 23 GB |
...Hermes 2 Mixtral 8x7B DPO 4bit | 17 | 11 | 10 GB |
Mixtral 8x7b V0.1 AWQ | 10 | 883 | 24 GB |
...ixtral 8x7B DPO 5.0bpw H6 EXL2 | 7 | 5 | 29 GB |
... 8x7B DPO 3.7bpw H6 EXL2 Rpcal | 4 | 2 | 21 GB |
...ralRPChat ZLoss 3.5bpw H6 EXL2 | 4 | 1 | 20 GB |
...hon Mixtral V1.3.75bpw H6 EXL2 | 3 | 3 | 22 GB |
Mixtral 8x7B V0.1 Int8 GPTQ | 2 | 4 | 0 GB |
...ixtral 8x7B SFT 6.0bpw H6 EXL2 | 2 | 2 | 35 GB |
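As a rough illustration of how one of the GGUF quants listed above could be run locally, here is a minimal llama-cpp-python sketch. The file name is hypothetical; substitute whichever quantization level fits your VRAM budget from the table.

```python
from llama_cpp import Llama

# Hypothetical path to a downloaded Mixtral 8x7B v0.1 GGUF quant.
llm = Llama(
    model_path="./mixtral-8x7b-v0.1.Q4_K_M.gguf",
    n_ctx=32768,      # Mixtral's full context length
    n_gpu_layers=-1,  # offload all layers to the GPU if memory allows
)

out = llm("Mixtral is a sparse mixture-of-experts model that", max_tokens=64)
print(out["choices"][0]["text"])
```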
Best Alternatives | HF Rank | Context/RAM | Downloads | Likes
---|---|---|---|---
Smaug Mixtral V0.1 | 75.49 | 32K / 187.7 GB | 3213 | 12 |
BagelMIsteryTour V2 8x7B | 74.95 | 32K / 93.5 GB | 1350 | 16 |
BagelMIsteryTour 8x7B | 74.66 | 32K / 93.5 GB | 1350 | 4 |
Prokaryote 8x7B Bf16 | 74.53 | 32K / 93.5 GB | 2008 | 2 |
Mhm 8x7B FrankenMoE V1.0 | 74.01 | 32K / 93.5 GB | 1972 | 2 |
Typhon Mixtral V1 | 73.81 | 32K / 93.4 GB | 530 | 7 |
UNAversal 8x7B V1beta | 73.78 | 32K / 93.6 GB | 1927 | 8 |
Open Gpt4 8x7B V0.2 | 73.59 | 32K / 93.5 GB | 1998 | 9 |
Mixtral 8x7B Instruct V0.1 DPO | 73.44 | 32K / 93.6 GB | 1904 | 1 |
Franziska Mixtral V1 | 73.36 | 32K / 93.5 GB | 1402 | 3 |
LLM Name | Mixtral 8x7B V0.1 |
Repository | Open on 🤗 |
Merged Model | Yes |
Model Size | 46.7b |
Required VRAM | 93.6 GB |
Updated | 2024-06-24 |
Maintainer | mistralai |
Model Type | mixtral |
Model Files | |
Supported Languages | fr it de es en |
Gated Model | Yes |
Model Architecture | MixtralForCausalLM |
License | apache-2.0 |
Context Length | 32768 |
Model Max Length | 32768 |
Transformers Version | 4.36.0.dev0 |
Tokenizer Class | LlamaTokenizer |
Vocabulary Size | 32000 |
Initializer Range | 0.02 |
Torch Data Type | bfloat16 |
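Given the configuration above (MixtralForCausalLM, bfloat16, 32K context, gated license), a minimal loading sketch with Hugging Face Transformers might look like the following. It assumes the repo id is mistralai/Mixtral-8x7B-v0.1 and that the gated-model terms have already been accepted; at bfloat16 the full weights need roughly the 93.6 GB of VRAM listed above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"  # gated repo; accept the license on the Hub first

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch data type
    device_map="auto",           # shard across available GPUs
)

inputs = tokenizer("Mixtral 8x7B is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```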