Mixtral 7Bx2 MoE by cloudyu


Tags: Autotrain compatible, Endpoints compatible, Mixtral, Model-index, MoE, Region: us, Safetensors, Sharded, Tensorflow

Mixtral 7Bx2 MoE Parameters and Internals

LLM Name: Mixtral 7Bx2 MoE
Repository: https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE
Model Size: 12.9b
Required VRAM: 25.8 GB
Updated: 2024-09-07
Maintainer: cloudyu
Model Type: mixtral
Model Files: 9.9 GB (1-of-3), 10.0 GB (2-of-3), 5.9 GB (3-of-3)
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
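
The metadata above is enough to load the model with the Hugging Face transformers library. A minimal sketch, assuming the standard AutoModel API and the bfloat16 weights listed above; the prompt and generation settings are illustrative only:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cloudyu/Mixtral_7Bx2_MoE"

# AutoTokenizer resolves to LlamaTokenizer per the card above.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 matches the published Torch data type; device_map="auto"
# spreads the ~25.8 GB of sharded weights across available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Explain mixture-of-experts routing in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))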

Quantized Models of the Mixtral 7Bx2 MoE

Model                     Likes   Downloads   VRAM
Mixtral 7Bx2 MoE GGUF     2       164         4 GB
Mixtral 7Bx2 MoE GGUF     23      694         4 GB
Mixtral 7Bx2 MoE AWQ      2       57          7 GB
Mixtral 7Bx2 MoE GPTQ     8       22          7 GB
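
The GGUF quantizations run outside PyTorch via llama.cpp. A minimal sketch using the llama-cpp-python bindings; the local file name is hypothetical (use whichever quantization level you actually downloaded), and n_ctx is kept below the model's 32768-token maximum to save memory:

from llama_cpp import Llama

llm = Llama(
    model_path="./mixtral_7bx2_moe.Q4_K_M.gguf",  # hypothetical local path
    n_ctx=8192,       # the model supports up to 32768 tokens
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

out = llm("Q: What is a sparse mixture of experts?\nA:", max_tokens=128)
print(out["choices"][0]["text"])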

Best Alternatives to Mixtral 7Bx2 MoE

Best Alternatives           Context / RAM    Downloads   Likes
MixTAO 7Bx2 MoE V8.1        32K / 25.8 GB    8873        52
LogoS 7Bx2 MoE 13B V0.2     32K / 25.9 GB    4509        10
MultiMash11 13B Slerp       32K / 25.7 GB    30          0
MultiMash8 13B Slerp        32K / 25.7 GB    16          0
MultiMash9 13B Slerp        32K / 25.7 GB    19          0
MultiMash10 13B Slerp       32K / 25.7 GB    15          0
MixTaoTruthful 13B Slerp    32K / 25.7 GB    14          0
Multimash3 12B Slerp        32K / 25.7 GB    13          0
MultiMash 12B Slerp         32K / 25.7 GB    24          0
MultiMash7 12B Slerp        32K / 25.7 GB    26          0

Original data from Hugging Face, OpenCompass, and various public Git repos.