Merged Model MoE by EthanLiu1991


Tags: Merged Model, Autotrain compatible, Conversational, Endpoints compatible, Mixtral, Region: US, Safetensors, Sharded, Tensorflow

Merged Model MoE Benchmarks

nn.n%: a score showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Merged Model MoE Parameters and Internals

LLM Name: Merged Model MoE
Repository: 🤗 https://huggingface.co/EthanLiu1991/Merged_model_MoE
Merged Model: Yes
Model Size: 7b
Required VRAM: 53.3 GB
Updated: 2024-09-16
Maintainer: EthanLiu1991
Model Type: mixtral
Model Files (safetensors, two shard sets):
  Shards 1-of-13 through 13-of-13: 1.9 GB, 2.0 GB × 11, 1.9 GB (25.8 GB)
  Shards 1-of-15 through 15-of-15: 1.9 GB × 13, 2.0 GB, 0.8 GB (27.5 GB)
  Combined: 53.3 GB, matching the Required VRAM figure above
Model Architecture: MixtralForCausalLM
License: cc-by-nc-4.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
Merged Model MoE (EthanLiu1991/Merged_model_MoE)
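
Given the MixtralForCausalLM architecture, float16 weights, and 32768-token context listed above, the model can be loaded with the standard Transformers pattern. The sketch below is illustrative, not an official snippet from the maintainer; the prompt and generation settings are assumptions.

```python
# Minimal loading sketch for EthanLiu1991/Merged_model_MoE.
# The repo ID, architecture (MixtralForCausalLM), and dtype (float16) come from
# the card above; the prompt and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EthanLiu1991/Merged_model_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
    device_map="auto",          # requires `accelerate`; plan for ~53.3 GB of VRAM
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Pinning transformers>=4.41.2 matches the Transformers version listed on the card.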

Best Alternatives to Merged Model MoE

Best Alternatives           Context / RAM     Downloads   Likes
Multimaster 7B V6           32K / 142.5 GB    3152        1
Mixtral 7B 8expert          32K / 93.6 GB     15574       263
Laserxtral                  32K / 48.3 GB     4885        78
MultiverseBuddy 15B MoE     32K / 25.8 GB     6           0
Mini Mixtral V0.2           32K / 25.8 GB     63          3
Multilingual Mistral        32K / 93.5 GB     674         2
Lumina 2                    32K / 37.1 GB     5           0
RogerWizard 12B MoE         32K / 25.8 GB     1           1
StarlingMaths 12B MoE       32K / 25.8 GB     5           0
MultiverseMath 12B MoE      32K / 25.8 GB     6           0
Note: a green score (e.g. "73.2") means that the model performs better than EthanLiu1991/Merged_model_MoE.
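
Download and like counts on listings like this go stale quickly, but they can be re-checked against the Hub directly. Below is a minimal sketch using the huggingface_hub client; the second repo ID is an assumption inferred from the table's display name, since only the Merged_model_MoE repository is given on this page.

```python
# Re-check current download/like counts on the Hugging Face Hub.
# "cognitivecomputations/laserxtral" is an assumed repo ID for the
# "Laserxtral" row above; adjust to the actual repositories of interest.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in [
    "EthanLiu1991/Merged_model_MoE",
    "cognitivecomputations/laserxtral",  # assumed repo ID
]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")
```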

Rank the Merged Model MoE Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803