LLM Explorer: A Curated Large Language Model Directory and Analytics

SauerkrautLM Mixtral 8x7B Instruct by VAGOsolutions

What open-source LLMs or SLMs are you in search of? 18,732 models are indexed in total.


Tags: Augmentation, Autotrain compatible, Conversational, Dataset: argilla/distilabel-mat..., De, DPO, En, Endpoints compatible, Es, Finetuned, Fr, German, Has space, Instruct, It, License: apache-2.0, Mistral, Mixtral, MoE, Region: us, Safetensors, Sharded, TensorFlow

SauerkrautLM Mixtral 8x7B Instruct Benchmarks

Rank the SauerkrautLM Mixtral 8x7B Instruct Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
SauerkrautLM Mixtral 8x7B Instruct (VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct)

Quantized Models of the SauerkrautLM Mixtral 8x7B Instruct

Model                              | Likes | Downloads | VRAM
...tLM Mixtral 8x7B Instruct GGUF  | 7     | 4         | 15 GB
...tLM Mixtral 8x7B Instruct GPTQ  | 3     | 436       | 23 GB
...utLM Mixtral 8x7B Instruct AWQ  | 2     | 736       | 24 GB
...utLM Mixtral 8x7B Instruct AWQ  | 0     | 69        | 27 GB
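For readers unsure how these quantized builds are consumed in practice, below is a minimal sketch using llama-cpp-python to run a GGUF quant. The file name, context size, and prompt are illustrative assumptions, not taken from the repositories above; GPTQ and AWQ builds would instead be loaded through transformers with the matching quantization backend installed.

```python
# Minimal sketch: running a GGUF quant of this model with llama-cpp-python.
# The file name below is a hypothetical placeholder -- substitute whichever
# quant file you actually downloaded (~15 GB for the smallest, per the table).
from llama_cpp import Llama

llm = Llama(
    model_path="sauerkrautlm-mixtral-8x7b-instruct.Q2_K.gguf",  # hypothetical path
    n_ctx=32768,      # the model's full 32K context; reduce to save memory
    n_gpu_layers=-1,  # offload all layers to GPU if VRAM allows
)

# Mixtral-Instruct models use the [INST] ... [/INST] prompt format.
out = llm("[INST] Was ist die Hauptstadt von Deutschland? [/INST]", max_tokens=64)
print(out["choices"][0]["text"])
```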

Best Alternatives to SauerkrautLM Mixtral 8x7B Instruct

Best Alternatives                  | HF Rank | Context/RAM   | Downloads | Likes
BagelMIsteryTour V2 8x7B           | 74.95   | 32K / 93.5 GB | 983       | 8
BagelMIsteryTour 8x7B              | 74.66   | 32K / 93.5 GB | 1343      | 4
UNAversal 8x7B V1beta              | 73.78   | 32K / 93.6 GB | 2066      | 8
Mixtral 8x7B Instruct V0.1 DPO     | 73.44   | 32K / 93.6 GB | 1851      | 1
Notux 8x7b V1                      | 72.97   | 32K / 93.6 GB | 2885      | 155
MPOMixtral 8x7B Instruct V0.1      | 72.8    | 32K / 93.6 GB | 2366      | 0
TenyxChat 8x7B V1                  | 72.72   | 32K / 93.6 GB | 1685      | 12
Mixtral 8x7B Instruct V0.1         | 72.62   | 32K / 93.6 GB | 1064919   | 2938
...uct Mixtral 8x7B V0.1 Dolly15K  | 72.44   | 32K / 93.6 GB | 2764      | 0
Chinese Mixtral Instruct           | 70.19   | 32K / 93.6 GB | 916       | 5
Note: a green score (e.g. "73.2") indicates that the model performs better than VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct.

SauerkrautLM Mixtral 8x7B Instruct Parameters and Internals

LLM Name: SauerkrautLM Mixtral 8x7B Instruct
Repository: Open on 🤗 Hugging Face
Model Size: 46.7b
Required VRAM: 93.6 GB
Updated: 2024-02-21
Maintainer: VAGOsolutions
Model Type: mixtral
Instruction-Based: Yes
Model Files (19 sharded safetensors): 4.9 GB (1-of-19), 5.0 GB (2-of-19), 5.0 GB (3-of-19), 4.9 GB (4-of-19), 5.0 GB (5-of-19), 5.0 GB (6-of-19), 4.9 GB (7-of-19), 5.0 GB (8-of-19), 5.0 GB (9-of-19), 4.9 GB (10-of-19), 5.0 GB (11-of-19), 5.0 GB (12-of-19), 5.0 GB (13-of-19), 4.9 GB (14-of-19), 5.0 GB (15-of-19), 5.0 GB (16-of-19), 4.9 GB (17-of-19), 5.0 GB (18-of-19), 4.2 GB (19-of-19)
Supported Languages: en, de, fr, it, es
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
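As a practical illustration of the internals listed above (bfloat16 weights, 32K context, the MixtralForCausalLM architecture), here is a minimal loading sketch with transformers. The prompt is illustrative, and the back-of-the-envelope VRAM figure in the comment simply multiplies the parameter count by two bytes per bfloat16 weight; multi-GPU and offloading setups will vary.

```python
# Minimal sketch: loading the full-precision checkpoint with transformers,
# matching the settings listed above. Requires transformers >= 4.36.
# Rough VRAM check: 46.7e9 params * 2 bytes (bfloat16) ~= 93.4 GB,
# consistent with the 93.6 GB "Required VRAM" listed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",           # shards the 19 safetensors files across devices
)

# apply_chat_template renders the Mixtral-Instruct [INST] format for us.
messages = [{"role": "user", "content": "Fasse den Mixture-of-Experts-Ansatz kurz zusammen."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```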
Original data from Hugging Face, OpenCompass, and various public git repos.
Release v2024022003