Mixtral 8x7B V0.1 by mistralai


Tags: autotrain-compatible, endpoints-compatible, safetensors, sharded, tensorflow, mixtral, moe, de, en, es, fr, it, region:us

Mixtral 8x7B V0.1 Benchmarks

Mixtral 8x7B V0.1 (mistralai/Mixtral-8x7B-v0.1)

Mixtral 8x7B V0.1 Parameters and Internals

Model Type: generative, sparse Mixture of Experts (top-2 routing is sketched below)
Additional Notes: Mixtral-8x7B is a pretrained base model and therefore does not have any moderation mechanisms.
Supported Languages: fr, it, de, es, en (all fluent)
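Mistral has documented that each Mixtral layer holds 8 expert feed-forward networks, of which a router activates 2 per token. The sketch below is a minimal, illustrative top-2 MoE layer in PyTorch; the class name `SimpleMoELayer` and the layer sizes are assumptions for demonstration, not Mixtral's actual implementation.

```python
# Minimal sketch of top-2 Mixture-of-Experts routing, in the spirit of
# Mixtral's 8-expert / 2-active design. Names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_EXPERTS = 8   # Mixtral uses 8 experts per MoE layer
TOP_K = 2         # 2 experts are active per token

class SimpleMoELayer(nn.Module):
    def __init__(self, hidden_size: int, ffn_size: int):
        super().__init__()
        self.router = nn.Linear(hidden_size, NUM_EXPERTS, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, ffn_size),
                nn.SiLU(),
                nn.Linear(ffn_size, hidden_size),
            )
            for _ in range(NUM_EXPERTS)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size). The router scores every token against every expert.
        logits = self.router(x)                        # (tokens, NUM_EXPERTS)
        weights, indices = logits.topk(TOP_K, dim=-1)  # keep the 2 best experts per token
        weights = F.softmax(weights, dim=-1)           # renormalize over the chosen 2
        out = torch.zeros_like(x)
        for k in range(TOP_K):
            for e in range(NUM_EXPERTS):
                mask = indices[:, k] == e              # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

layer = SimpleMoELayer(hidden_size=64, ffn_size=256)
print(layer(torch.randn(4, 64)).shape)  # torch.Size([4, 64])
```

Because only 2 of the 8 expert FFNs run per token, the 46.7B-parameter model computes with roughly 13B active parameters per token, which is what makes the sparse design cheaper at inference than a dense model of the same total size.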
LLM Name: Mixtral 8x7B V0.1
Repository: 🤗 https://huggingface.co/mistralai/Mixtral-8x7B-v0.1
Model Size: 46.7B parameters
Required VRAM: 93.6 GB
Updated: 2025-02-05
Maintainer: mistralai
Model Type: mixtral
Model Files: 8 files of 12.1 GB each, plus 19 sharded safetensors files (shards 1-18: 4.9-5.0 GB each; shard 19: 4.2 GB; 93.6 GB total)
Supported Languages: fr, it, de, es, en
Gated Model: Yes
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16
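These fields map directly onto a standard Transformers loading call, and the arithmetic behind the VRAM figure is simple: 46.7B parameters × 2 bytes per bfloat16 value ≈ 93.4 GB of weights, in line with the 93.6 GB requirement. A minimal sketch, assuming a recent `transformers` plus `accelerate` install, and that you have accepted the gated-repo terms on Hugging Face and authenticated (e.g. via `huggingface-cli login`):

```python
# Minimal loading sketch for mistralai/Mixtral-8x7B-v0.1 with Transformers.
# The repo is gated: accept the license on huggingface.co and log in first.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's Torch Data Type
    device_map="auto",           # spread ~93.6 GB of weights across available GPUs
)

# This is a pretrained base model (no chat template, no moderation),
# so prompt it with plain text rather than chat-style messages.
inputs = tokenizer("Mixtral is a sparse mixture-of-experts model that", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```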

Best Alternatives to Mixtral 8x7B V0.1

Best Alternatives                 Context / RAM     Downloads   Likes
Mixtral 8x7B Instruct V0.1        32K / 93.6 GB     1,614,163   4,282
Nous Hermes 2 Mixtral 8x7B DPO    32K / 93.6 GB     3,797       422
GritLM 8x7B KTO                   32K / 93.6 GB     4,216       3
Smaug Mixtral V0.1                32K / 187.7 GB    4,218       12
Sensualize Mixtral Bf16           32K / 93.6 GB     0           0
Dolphin 2.5 Mixtral 8x7b          32K / 93.6 GB     16,138      1,224
Skadi Mixtral V1                  32K / 93.5 GB     0           0
Franziska Mixtral V1              32K / 93.5 GB     0           0
Typhon Mixtral V1                 32K / 93.4 GB     0           0
Merge Mixtral Prometheus 8x7B     32K / 91.9 GB     9           2
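Download and like counts in the table above are a snapshot and drift quickly. A small sketch for pulling current numbers with the official `huggingface_hub` client (the two repo IDs are examples; any model from the table works):

```python
# Fetch live download/like counts, since the table above is a snapshot.
# Requires `pip install huggingface_hub`.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in [
    "mistralai/Mixtral-8x7B-v0.1",
    "mistralai/Mixtral-8x7B-Instruct-v0.1",  # top alternative in the table
]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```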



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227