WizardLM 2 8x22B Beige 5.0bpw H6 EXL2 by LoneStriker


Tags: Merged Model, Arxiv:2403.19522, 5-bit, Autotrain compatible, Base model: alpindale/wizardlm-..., Base model: fireworks-ai/mixtra..., Base model: migtissera/tess-2.0..., Base model: openbmb/eurux-8x22b..., Conversational, Endpoints compatible, Exl2, Instruct, Mixtral, MoE, Quantized, Region: us, Safetensors, Sharded, Tensorflow

WizardLM 2 8x22B Beige 5.0bpw H6 EXL2 (LoneStriker/WizardLM-2-8x22B-Beige-5.0bpw-h6-exl2)

Best Alternatives to WizardLM 2 8x22B Beige 5.0bpw H6 EXL2

Best Alternatives                      HF Rank   Context / RAM    Downloads   Likes
...es Mixtral 8x7B 2.4bpw H6 EXL2      68.2      32K / 14.3 GB    1           2
...es Mixtral 8x7B 3.0bpw H6 EXL2      68.2      32K / 17.8 GB    1           1
Notux 8x7b V1 3.5bpw EXL2              68.2      32K / 20.7 GB    1           2
Notux 8x7b V1 3.5bpw H6 EXL2           68.2      32K / 20.7 GB    1           1
...es Mixtral 8x7B 6.0bpw H6 EXL2      68.2      32K / 35.3 GB    1           1
...ixtral 8x7B Instruct V0.1 GGUF      68.2      32K / 17.3 GB    44          15
...8x22b Instruct Oh EXL2 2.25bpw      -         64K / 40.1 GB    4           1
...M 2 8x22B Beige 2.4bpw H6 EXL2      -         64K / 42.7 GB    6           0
...M 2 8x22B Beige 3.0bpw H6 EXL2      -         64K / 53.2 GB    15          0
...M 2 8x22B Beige 4.0bpw H6 EXL2      -         64K / 70.8 GB    86          0
Note: a green score (e.g. "73.2") means that the model outperforms LoneStriker/WizardLM-2-8x22B-Beige-5.0bpw-h6-exl2.

WizardLM 2 8x22B Beige 5.0bpw H6 EXL2 Parameters and Internals

LLM Name: WizardLM 2 8x22B Beige 5.0bpw H6 EXL2
Repository: Open on Hugging Face
Base Model(s): openbmb/Eurux-8x22b-nca, alpindale/WizardLM-2-8x22B, fireworks-ai/mixtral-8x22b-instruct-oh, migtissera/Tess-2.0-Mixtral-8x22B
Merged Model: Yes
Required VRAM: 88.5 GB
Updated: 2024-07-04
Maintainer: LoneStriker
Model Type: mixtral
Instruction-Based: Yes
Model Files: 8.6 GB (1-of-11), 8.6 GB (2-of-11), 8.6 GB (3-of-11), 8.6 GB (4-of-11), 8.6 GB (5-of-11), 8.6 GB (6-of-11), 8.5 GB (7-of-11), 8.6 GB (8-of-11), 8.6 GB (9-of-11), 8.6 GB (10-of-11), 2.6 GB (11-of-11)
Quantization Type: exl2
Model Architecture: MixtralForCausalLM
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
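The figures above are internally consistent, which is a useful sanity check before downloading ~88 GB of shards. A minimal sketch: the eleven shard sizes listed under "Model Files" sum to the "Required VRAM" figure, and the 5.0 bits-per-weight quantization level predicts roughly the same total, assuming a ~141B total parameter count for an 8x22B Mixtral-style MoE (that parameter count is an assumption, not stated on this page).

```python
# Sanity-check the listing's numbers. Shard sizes and the 5.0bpw figure are
# copied from the table above; the 141B parameter count is an assumption.

# The 11 shard sizes in GB, as listed under "Model Files".
shards = [8.6] * 6 + [8.5] + [8.6] * 3 + [2.6]
total_gb = round(sum(shards), 1)
print(total_gb)  # 88.5 -- matches the "Required VRAM" figure

# Rough size estimate from the quantization level:
# parameters * (bits per weight / 8) bytes.
PARAMS = 141e9   # assumed total parameters for an 8x22B Mixtral-style MoE
BPW = 5.0        # from the model name: 5.0bpw
est_gb = PARAMS * BPW / 8 / 1e9
print(round(est_gb, 1))  # ~88.1 GB, consistent with the shard total
```

Note that 88.5 GB is the weight footprint only; the KV cache for the full 65536-token context adds further VRAM on top of this.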


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801