MultiverseBuddy 15B MoE by allknowingroger

Tags: autotrain-compatible · endpoints-compatible · frankenmoe · lazymergekit · merge · mergekit · mixtral · moe · region:us · safetensors · sharded · tensorflow
Base models (merge): allknowingroger/MultiverseEx26-7B-slerp · OpenBuddy/openbuddy-mistral2-7b-v20.2-32k
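
The frankenmoe, lazymergekit, and mergekit tags indicate the model was assembled with mergekit's MoE mode rather than trained from scratch. The maintainer's actual recipe is not reproduced on this page, so the sketch below only shows what such a configuration typically looks like: the two base models are taken from the card, while gate_mode, dtype, and the gating prompts are illustrative placeholders.

```python
# Hypothetical mergekit-moe recipe for a 2x7B frankenMoE like this one.
# The source models come from the card; gate_mode, dtype, and the
# positive_prompts are placeholders, not the maintainer's actual values.
import pathlib
import subprocess

CONFIG = """\
base_model: allknowingroger/MultiverseEx26-7B-slerp
gate_mode: hidden          # route tokens using hidden-state similarity
dtype: bfloat16            # matches the Torch Data Type listed below
experts:
  - source_model: allknowingroger/MultiverseEx26-7B-slerp
    positive_prompts:
      - "general reasoning and conversation"
  - source_model: OpenBuddy/openbuddy-mistral2-7b-v20.2-32k
    positive_prompts:
      - "multilingual chat and long-context questions"
"""

pathlib.Path("moe-config.yaml").write_text(CONFIG)
# mergekit's MoE entry point reads the YAML and writes the merged model:
subprocess.run(["mergekit-moe", "moe-config.yaml", "./MultiverseBuddy-15B-MoE"], check=True)
```

With gate_mode: hidden, mergekit initializes each expert's router weights from hidden-state representations of the positive prompts, so the prompts loosely steer which expert handles which kinds of input.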

MultiverseBuddy 15B MoE Parameters and Internals

LLM Name: MultiverseBuddy 15B MoE
Repository: https://huggingface.co/allknowingroger/MultiverseBuddy-15B-MoE
Base Model(s): allknowingroger/MultiverseEx26-7B-slerp, OpenBuddy/openbuddy-mistral2-7b-v20.2-32k
Model Size: 7b
Required VRAM: 25.8 GB
Updated: 2024-10-17
Maintainer: allknowingroger
Model Type: mixtral
Model Files: 13 safetensors shards: 1-of-13 (1.9 GB), 2-of-13 through 12-of-13 (2.0 GB each), 13-of-13 (1.9 GB)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.40.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
MultiverseBuddy 15B MoE (allknowingroger/MultiverseBuddy-15B-MoE)
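
The architecture, tokenizer, context length, and dtype fields above map directly onto a standard transformers load. A minimal sketch, assuming transformers >= 4.40.1 (the version recorded in the card), the accelerate package for device_map="auto", and roughly 26 GB of free memory; the prompt is illustrative:

```python
# Minimal loading sketch based on the card's settings.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "allknowingroger/MultiverseBuddy-15B-MoE"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(       # MixtralForCausalLM per the card
    repo_id,
    torch_dtype=torch.bfloat16,  # matches "Torch Data Type: bfloat16"
    device_map="auto",           # spreads the ~25.8 GB of weights across devices
)

prompt = "Summarize what a mixture-of-experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```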

Best Alternatives to MultiverseBuddy 15B MoE

Best Alternatives | Context / RAM | Downloads | Likes
Multimaster 7B V6 | 32K / 142.5 GB | 3137 | 1
Mixtral 7B 8expert | 32K / 93.6 GB | 14949 | 264
Laserxtral | 32K / 48.3 GB | 4611 | 78
Mini Mixtral V0.2 | 32K / 25.8 GB | 76 | 3
Merged Model MoE | 32K / 53.3 GB | 8 | 1
Multilingual Mistral | 32K / 93.5 GB | 1025 | 2
Lumina 2 | 32K / 37.1 GB | 6 | 0
RogerWizard 12B MoE | 32K / 25.8 GB | 7 | 1
StarlingMaths 12B MoE | 32K / 25.8 GB | 7 | 0
MultiverseMath 12B MoE | 32K / 25.8 GB | 10 | 0
Note: a green score (e.g. "73.2") indicates that the model outperforms allknowingroger/MultiverseBuddy-15B-MoE.
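
The RAM figures in this table track weight size at the stored dtype, not peak inference memory. At bfloat16's 2 bytes per parameter, the card's 25.8 GB works out to roughly 12.9B parameters, consistent with a two-expert merge of 7B-class models once the shared non-expert layers are counted only once. A back-of-the-envelope check:

```python
# Back-of-the-envelope check: bf16 stores 2 bytes per parameter, so weight
# size in GB is roughly (params in billions) * 2. Peak inference memory is
# higher once the KV cache and activations are added.
BYTES_PER_PARAM_BF16 = 2

def weight_gb(params_billion: float) -> float:
    return params_billion * BYTES_PER_PARAM_BF16

# Working backwards from the card's 25.8 GB figure:
print(25.8 / BYTES_PER_PARAM_BF16)   # ~12.9B parameters
print(weight_gb(12.9))               # ~25.8 GB, matching "Required VRAM"
```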

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803