Openbuddy Mixtral 7bx8 V18.1 32K by OpenBuddy


Tags: Autotrain compatible, Conversational, De, En, Fr, It, Ja, Ko, Mixtral, Model-index, Moe, Region:us, Ru, Safetensors, Sharded, Tensorflow, Zh

Openbuddy Mixtral 7bx8 V18.1 32K Benchmarks

Openbuddy Mixtral 7bx8 V18.1 32K (OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k)

Openbuddy Mixtral 7bx8 V18.1 32K Parameters and Internals

Model Type: text-generation
Additional Notes: The model has limitations and may produce undesirable outputs. Caution is advised in critical situations.
Supported Languages: zh, en, fr, de, ja, ko, it, ru (proficiency unknown)
LLM Name: Openbuddy Mixtral 7bx8 V18.1 32K
Repository: https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k
Model Size: 46.7b
Required VRAM: 93.7 GB
Updated: 2024-12-26
Maintainer: OpenBuddy
Model Type: mixtral
Model Files: 4.9 GB (1-of-19), 5.0 GB (2-of-19), 5.0 GB (3-of-19), 4.9 GB (4-of-19), 5.0 GB (5-of-19), 5.0 GB (6-of-19), 4.9 GB (7-of-19), 5.0 GB (8-of-19), 5.0 GB (9-of-19), 4.9 GB (10-of-19), 5.0 GB (11-of-19), 5.0 GB (12-of-19), 5.0 GB (13-of-19), 4.9 GB (14-of-19), 5.0 GB (15-of-19), 5.0 GB (16-of-19), 4.9 GB (17-of-19), 5.0 GB (18-of-19), 4.3 GB (19-of-19)
Supported Languages: zh en fr de ja ko it ru
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 36608
Torch Data Type: bfloat16
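For reference, a minimal loading sketch using the standard Hugging Face Transformers API, based on the parameters listed above (MixtralForCausalLM architecture, bfloat16 weights, 32K context). It assumes transformers >= 4.38 and enough GPU/CPU memory for the listed ~93.7 GB of weights; in practice quantization or multi-GPU sharding may be required.

```python
# Minimal sketch: loading OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k with Transformers.
# Assumes transformers >= 4.38 (Mixtral support), accelerate installed for device_map,
# and sufficient memory for ~93.7 GB of bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the model config
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # shard across available GPUs / offload to CPU
)

prompt = "Explain mixture-of-experts models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The plain-text prompt above is purely illustrative and does not follow any particular chat template; consult the model card for the recommended prompt format.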

Best Alternatives to Openbuddy Mixtral 7bx8 V18.1 32K

Best Alternatives | Context / RAM | Downloads | Likes
Mixtral 8x7B Instruct V0.1 | 32K / 93.6 GB | 3161658 | 4237
Mixtral 8x7B V0.1 | 32K / 93.6 GB | 2968573 | 1655
Nous Hermes 2 Mixtral 8x7B DPO | 32K / 93.6 GB | 3144 | 420
Dolphin 2.5 Mixtral 8x7b | 32K / 93.6 GB | 19440 | 1221
GritLM 8x7B KTO | 32K / 93.6 GB | 3524 | 3
Smaug Mixtral V0.1 | 32K / 187.7 GB | 3554 | 12
Merge Mixtral Prometheus 8x7B | 32K / 91.9 GB | 26 | 2
XLAM 8x7b R | 32K / 93.6 GB | 1306 | 11
Sensualize Mixtral Bf16 | 32K / 93.6 GB | 0 | 0
Notux 8x7b V1 | 32K / 93.6 GB | 41 | 165
Note: in the original listing, a score shown in green (e.g. "73.2") indicates a model that scores higher than OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k.

Rank the Openbuddy Mixtral 7bx8 V18.1 32K Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 40,248 models are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217