Manbasya 2x7b MoE by arlineka


Tags: 4-bit · AutoTrain compatible · AWQ · Endpoints compatible · Mixtral · MoE · Region: US · Safetensors

Manbasya 2x7b MoE Benchmarks

nn.n%: how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Manbasya 2x7b MoE (arlineka/manbasya_2x7b_MOE)

Manbasya 2x7b MoE Parameters and Internals

Model Type: causal language model

Input / Output
Input Format: text
Accepted Modalities: text
Output Format: generated text

LLM Name: Manbasya 2x7b MoE
Repository: 🤗 https://huggingface.co/arlineka/manbasya_2x7b_MOE
Model Size: 2b
Required VRAM: 7.1 GB
Updated: 2025-02-05
Maintainer: arlineka
Model Type: mixtral
Model Files: 7.1 GB
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
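
Since the model uses the standard MixtralForCausalLM architecture and LlamaTokenizer listed above, it loads through the usual Transformers auto classes. Below is a minimal sketch, assuming the transformers (>= 4.38, per the table) and accelerate packages are installed and a GPU with roughly 7.1 GB of free VRAM is available; the prompt string is illustrative only.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "arlineka/manbasya_2x7b_MOE"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the Torch Data Type listed above
    device_map="auto",          # requires accelerate; places weights on available devices
)

# Plain text in, generated text out, as described in the Input / Output section.
prompt = "Explain mixture-of-experts language models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))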

Best Alternatives to Manbasya 2x7b MoE

Best Alternatives                      Context / RAM   Downloads   Likes
...rc FusionNet 7Bx2 MoE 13B GPTQ      32K / 7.1 GB    93          1
...Grc FusionNet 7Bx2 MoE 13B AWQ      32K / 7.1 GB    92          3
MixTAO 7Bx2 MoE V8.1 AWQ               32K / 7.1 GB    77          1
...r Dolphin Mixtral 2x7b DPO AWQ      32K / 7.1 GB    3           1
...r Dolphin Mixtral 2x7b DPO AWQ      32K / 7.1 GB    73          9
... Dolphin Mixtral 2x7b DPO GPTQ      32K / 7.1 GB    17          10
Mixtral 7Bx2 MoE GPTQ                  32K / 7.1 GB    35          8
Mixtral 7Bx2 MoE AWQ                   32K / 7.1 GB    17          2
Blue Orchid 2x7b AWQ                   8K / 7.1 GB     29          1
Note: a green score (e.g. "73.2") means the model is better than arlineka/manbasya_2x7b_MOE.

Rank the Manbasya 2x7b MoE Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you searching for? 42,577 listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227