Minerva MoE 2x3B by ludocomito


Tags: autotrain-compatible, endpoints-compatible, base model: DeepMount00/Minerva-3B-base-RAG, base model: FairMind/Minerva-3B-Instruct-v1.0, frankenmoe, instruct, lazymergekit, merge, mergekit, mixtral, moe, safetensors, sharded

Minerva MoE 2x3B Benchmarks

Benchmark scores show how the model compares to the reference models: Anthropic's Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Minerva MoE 2x3B (ludocomito/Minerva-MoE-2x3B)

Minerva MoE 2x3B Parameters and Internals

Model Type: Mixture of Experts (MoE)
Additional Notes:
Minerva-MoE-2x3B is a Mixture of Experts model created with LazyMergekit by combining the DeepMount00/Minerva-3B-base-RAG and FairMind/Minerva-3B-Instruct-v1.0 models; a configuration sketch follows.
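As a concrete illustration of this kind of merge, the sketch below builds a mergekit-moe style configuration similar to what LazyMergekit generates. The choice of base model, the gate_mode, and the positive_prompts are assumptions for illustration, not the author's published recipe:

```python
# Illustrative sketch (not the author's actual config): a mergekit-moe
# configuration of the kind LazyMergekit generates for a 2-expert merge.
import yaml

config = {
    # Assumption: the RAG-tuned variant serves as the shared backbone.
    "base_model": "DeepMount00/Minerva-3B-base-RAG",
    "gate_mode": "hidden",   # route tokens by hidden-state similarity to the prompts below
    "dtype": "bfloat16",     # matches the published checkpoint dtype
    "experts": [
        {
            "source_model": "DeepMount00/Minerva-3B-base-RAG",
            "positive_prompts": ["answer from the given context"],  # illustrative
        },
        {
            "source_model": "FairMind/Minerva-3B-Instruct-v1.0",
            "positive_prompts": ["follow the instruction"],  # illustrative
        },
    ],
}

with open("minerva_moe.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# With mergekit installed, the MoE is then assembled with:
#   mergekit-moe minerva_moe.yaml ./Minerva-MoE-2x3B
```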
LLM Name: Minerva MoE 2x3B
Repository: https://huggingface.co/ludocomito/Minerva-MoE-2x3B
Base Model(s): DeepMount00/Minerva-3B-base-RAG, FairMind/Minerva-3B-Instruct-v1.0
Model Size: 5.1B parameters
Required VRAM: 10.2 GB
Updated: 2025-02-22
Maintainer: ludocomito
Model Type: mixtral
Instruction-Based: Yes
Model Files: 10.0 GB (shard 1 of 2), 0.2 GB (shard 2 of 2)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.40.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32768
Torch Data Type: bfloat16
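The VRAM figure follows directly from the parameter count: 5.1B parameters at 2 bytes per bfloat16 weight is roughly 10.2 GB, before activations and KV cache. Below is a minimal sketch of loading and prompting the model with the Hugging Face transformers API; the repository id and bfloat16 dtype come from the table above, while the prompt and generation settings are illustrative (the underlying Minerva models are Italian-language models):

```python
# Minimal sketch: load ludocomito/Minerva-MoE-2x3B with transformers.
# Assumes transformers >= 4.40.2 (the version the checkpoint was saved with)
# and roughly 10.2 GB of VRAM for the bfloat16 weights alone.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ludocomito/Minerva-MoE-2x3B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # resolves to LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(       # MixtralForCausalLM
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the published Torch data type
    device_map="auto",           # place shards on available GPU(s)/CPU
)

# Illustrative Italian prompt; generation settings are arbitrary.
prompt = "Qual è la capitale d'Italia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```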



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227