Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2 by LoneStriker


Tags: Autotrain compatible, Endpoints compatible, EXL2, Mixtral, MoE, Quantized, Region: US, Safetensors, Sharded, TensorFlow

Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Model: Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2 (LoneStriker/Mixtral_34Bx2_MoE_60B-3.0bpw-h6-exl2)

Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2 Parameters and Internals

Model Type: MoE, causal language model

Use Cases
Areas: research, non-commercial applications
Applications: text generation, multilingual applications
Primary Use Cases: creative writing, storytelling, education
Limitations: not licensed for commercial applications

Supported Languages: English (high proficiency), Chinese (high proficiency)

Input/Output
Input Format: text prompt
Accepted Modalities: text
Output Format: generated text
Performance Tips: Run on a CUDA-capable GPU for best performance; see the loading sketch below.
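A minimal loading-and-generation sketch using the exllamav2 library, which EXL2 quants like this one target. The local model path is hypothetical, and exact class names can vary slightly between exllamav2 versions:

```python
# Minimal EXL2 inference sketch (assumes the exllamav2 package and a CUDA GPU).
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/Mixtral_34Bx2_MoE_60B-3.0bpw-h6-exl2"  # hypothetical local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate the KV cache as layers load
model.load_autosplit(cache)               # split weights across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, 128))
```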
LLM Name: Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2
Repository 🤗: https://huggingface.co/LoneStriker/Mixtral_34Bx2_MoE_60B-3.0bpw-h6-exl2
Model Size: 60B
Required VRAM: 23.8 GB
Updated: 2025-02-05
Maintainer: LoneStriker
Model Type: mixtral
Model Files: 8.6 GB (1-of-3), 8.6 GB (2-of-3), 6.6 GB (3-of-3)
Quantization Type: exl2
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 200,000
Model Max Length: 200,000
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64,000
Torch Data Type: bfloat16
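Fields such as the architecture, context length, and vocabulary size can be verified without downloading the 23.8 GB of weights by fetching the repo's config.json on its own; a sketch assuming the huggingface_hub package:

```python
# Fetch only config.json from the repo and check the fields listed above.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="LoneStriker/Mixtral_34Bx2_MoE_60B-3.0bpw-h6-exl2",
    filename="config.json",
)
with open(path) as f:
    cfg = json.load(f)

print(cfg["architectures"])            # expected: ["MixtralForCausalLM"]
print(cfg["max_position_embeddings"])  # expected: 200000
print(cfg["vocab_size"])               # expected: 64000
print(cfg["torch_dtype"])              # expected: "bfloat16"
```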

Best Alternatives to Mixtral 34Bx2 MoE 60B 3.0bpw H6 EXL2

Best Alternatives                      Context / RAM     Downloads  Likes
... 34Bx2 MoE 60B 2.65bpw H6 EXL2      195K / 21.2 GB    6          1
...i 34Bx2 MoE 60B 5.0bpw H6 EXL2      195K / 38.8 GB    5          1
...i 34Bx2 MoE 60B 6.0bpw H6 EXL2      195K / 46.2 GB    4          1
...l 34Bx2 MoE 60B 8.0bpw H8 EXL2      195K / 61.3 GB    4          2
...l 34Bx2 MoE 60B 2.4bpw H6 EXL2      195K / 19.3 GB    5          3
... 34Bx2 MoE 60B 4.65bpw H6 EXL2      195K / 36.2 GB    4          1
...l 34Bx2 MoE 60B EXL2 2.0bpw H8      195K / 18.1 GB    7          1
... 34Bx2 MoE 60B 2.65bpw H6 EXL2      195K / 21.2 GB    4          2
...l 34Bx2 MoE 60B 4.0bpw H6 EXL2      195K / 31.3 GB    5          2
...l 34Bx2 MoE 60B 6.0bpw H6 EXL2      195K / 46.2 GB    6          1
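The GB column scales roughly linearly with bits per weight: size ≈ parameter count × bpw / 8 bytes, plus overhead for embeddings and any layers kept at higher precision. A back-of-the-envelope check, taking ~60B as an approximation of the merged model's total parameter count:

```python
# Rough EXL2 file-size estimate: params * bits-per-weight / 8 bytes.
# 60e9 approximates the total parameter count of the merged MoE.
def est_gb(params: float, bpw: float) -> float:
    return params * bpw / 8 / 1e9

for bpw in (2.4, 2.65, 3.0, 4.0, 5.0, 6.0, 8.0):
    print(f"{bpw:.2f} bpw ~ {est_gb(60e9, bpw):.1f} GB")
# 3.0 bpw gives ~22.5 GB, consistent with the 23.8 GB VRAM listed above
# once cache and per-layer overhead are included.
```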


Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241227