Mixtral SlimOrca 8x7B GPTQ by TheBloke


Tags: 4-bit · Autotrain compatible · Base model: Open-Orca/Mixtral-SlimOrca-8x7B · Dataset: open-orca/slimorca · GPTQ · License: apache-2.0 · Mixtral · MoE · Quantized · Region: US · Safetensors

Rank the Mixtral SlimOrca 8x7B GPTQ Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mixtral SlimOrca 8x7B GPTQ (TheBloke/Mixtral-SlimOrca-8x7B-GPTQ)

Best Alternatives to Mixtral SlimOrca 8x7B GPTQ

Best Alternatives                    HF Rank   Context / RAM     Downloads / Likes
Mixtral 8x7B V0.1 GPTQ               68.4      32K / 23.8 GB     1789125
...ixtral 8x7B Instruct V0.1 GPTQ    68.2      32K / 23.8 GB     639752127
Dolphin 2.5 Mixtral 8x7b GPTQ                  32K / 23.8 GB     17093
...Hermes 2 Mixtral 8x7B DPO GPTQ              32K / 23.8 GB     1146625
Dolphin 2.7 Mixtral 8x7b GPTQ                  32K / 23.8 GB     23818
...Hermes 2 Mixtral 8x7B SFT GPTQ              32K / 23.8 GB     2910
...nthia MoE V3 Mixtral 8x7B GPTQ              32K / 23.8 GB     1310
Open Gpt4 8x7B GPTQ                            32K / 23.8 GB     89
...maid V0.1 Mixtral 8x7b V3 GPTQ              32K / 23.8 GB     748
MixtralOrochi8x7B GPTQ                         32K / 23.8 GB     27
Note: on the source page, a score shown in green (e.g. "73.2") indicates that the model ranks better than TheBloke/Mixtral-SlimOrca-8x7B-GPTQ.

Mixtral SlimOrca 8x7B GPTQ Parameters and Internals

LLM Name: Mixtral SlimOrca 8x7B GPTQ
Repository: TheBloke/Mixtral-SlimOrca-8x7B-GPTQ (Hugging Face)
Model Name: Mixtral SlimOrca 8X7B
Model Creator: OpenOrca
Base Model(s): Open-Orca/Mixtral-SlimOrca-8x7B
Model Size: 6.1b
Required VRAM: 23.8 GB
Updated: 2024-07-07
Maintainer: TheBloke
Model Type: mixtral
Model Files: 23.8 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32002
Initializer Range: 0.02
Torch Data Type: bfloat16
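
The configuration above (GPTQ 4-bit weights, MixtralForCausalLM architecture, 32768-token context, LlamaTokenizer) corresponds to the standard Transformers GPTQ loading path. The snippet below is a minimal sketch, assuming a recent transformers release with GPTQ support (optimum plus auto-gptq or gptqmodel installed), accelerate for device_map="auto", and enough GPU memory for the roughly 23.8 GB of quantized weights; the prompt string is purely illustrative and does not apply any particular chat template.

```python
# Minimal sketch: load and run TheBloke/Mixtral-SlimOrca-8x7B-GPTQ.
# Assumes transformers with GPTQ support (optimum + auto-gptq or gptqmodel)
# and a GPU with enough memory for the ~23.8 GB of quantized weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Mixtral-SlimOrca-8x7B-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, vocab size 32002
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the quantized MixtralForCausalLM on available GPUs
)

# Illustrative prompt only; adapt to the prompt format the model expects.
prompt = "Explain what a mixture-of-experts model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```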


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801