Openbuddy Mixtral 7bx8 V18.1 32K Gptq by OpenBuddy


  4-bit   4bit   Autotrain compatible   Conversational   De   En   Fr   Gptq   It   Ja   Ko   Mixtral   Moe   Quantized   Region:us   Ru   Zh

Openbuddy Mixtral 7bx8 V18.1 32K Gptq Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Openbuddy Mixtral 7bx8 V18.1 32K Gptq (OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k-gptq)

Openbuddy Mixtral 7bx8 V18.1 32K Gptq Parameters and Internals

Model Type: text generation
Additional Notes: OpenBuddy models are provided "as-is" without any warranty.
Supported Languages: zh, en, fr, de, ja, ko, it, ru (full proficiency in each)
LLM Name: Openbuddy Mixtral 7bx8 V18.1 32K Gptq
Repository 🤗: https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k-gptq
Required VRAM: 24.8 GB
Updated: 2025-02-22
Maintainer: OpenBuddy
Model Type: mixtral
Model Files: 24.8 GB
Supported Languages: zh en fr de ja ko it ru
GPTQ Quantization: Yes
Quantization Type: gptq|4bit
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 36608
Torch Data Type: float16
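
Because the checkpoint is a 4-bit GPTQ quantization of a Mixtral mixture-of-experts model, it can typically be loaded directly with Hugging Face Transformers, which picks up the quantization settings embedded in the repository (the optimum and auto-gptq packages must be installed). The snippet below is a minimal sketch, not an official usage guide from the model card; the prompt text and generation parameters are illustrative placeholders.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository listed above; roughly 24.8 GB of VRAM is needed for the 4-bit weights.
model_id = "OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k-gptq"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # place the quantized weights on the available GPU(s)
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
)

prompt = "Explain in one paragraph what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))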

Best Alternatives to Openbuddy Mixtral 7bx8 V18.1 32K Gptq

Best Alternatives | Context / RAM | Downloads | Likes
...y Mixtral 22bx8 V21.1 65K Gptq | 64K / N/A | 5 | 0
LHK DPO V1 GPTQ 4bit | 32K / 7.8 GB | 5 | 1
Mixtral 8x7B V0.1 Int8 GPTQ | 32K / N/A | 12 | 2
Blue Orchid 2x7b GPTQ | 8K / 7.1 GB | 56 | 1
...oE V0.1 DPO F16 5.0bpw H6 EXL2 | 195K / 38.8 GB | 10 | 0
...oE V0.1 DPO F16 4.0bpw H6 EXL2 | 195K / 31.3 GB | 8 | 0
...2 Mixtral 8x22b 6.0bpw H8 EXL2 | 64K / 105.8 GB | 7 | 1
WizardLM 2 8x22 EXL2 4.0bpw | 64K / 70.9 GB | 8 | 1
...rdLM 2 8x22B Beige EXL2 5.0bpw | 64K / 88.4 GB | 17 | 0
...M 2 8x22B Beige 4.0bpw H6 EXL2 | 64K / 70.8 GB | 13 | 0
Note: a green score (e.g. "73.2") means that the model is better than OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k-gptq.
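
When weighing these alternatives by context length and quantization, those fields can usually be read from a repository's config.json without downloading any weights. The sketch below is an illustrative, hedged example using Transformers' AutoConfig; only the main model's repo ID is shown, and attribute names such as max_position_embeddings and quantization_config are the ones commonly present for Mixtral GPTQ checkpoints, so they may differ for other architectures.

from transformers import AutoConfig

# Fetch only the config (no weights) to inspect context length and quantization.
repo_id = "OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k-gptq"
config = AutoConfig.from_pretrained(repo_id)

context_length = getattr(config, "max_position_embeddings", None)  # expected: 32768
quantization = getattr(config, "quantization_config", None)        # present for GPTQ repos

print(f"{repo_id}: context={context_length}, quantization={quantization}")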

Rank the Openbuddy Mixtral 7bx8 V18.1 32K Gptq Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227