L3.1 MoE 4x8B V0.1 by moeru-ai


Tags: Autotrain compatible, Conversational, Endpoints compatible, Frankenmoe, Merge, Mergekit, Mixtral, Model-index, Moe, Region:us, Safetensors, Sharded, Tensorflow
Base model tags: argilla-warehouse/Llama-3.1-8B-MagPie-Ultra, ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.2, sequelbox/Llama3.1-8B-PlumCode, sequelbox/Llama3.1-8B-PlumMath (each listed both as a base model and as a merge source)

L3.1 MoE 4x8B V0.1 Benchmarks

Benchmark scores are reported as percentages (nn.n%) showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
L3.1 MoE 4x8B V0.1 (moeru-ai/L3.1-Moe-4x8B-v0.1)

L3.1 MoE 4x8B V0.1 Parameters and Internals

LLM Name: L3.1 MoE 4x8B V0.1
Repository: 🤗 https://huggingface.co/moeru-ai/L3.1-Moe-4x8B-v0.1
Base Model(s): argilla-warehouse/Llama-3.1-8B-MagPie-Ultra, sequelbox/Llama3.1-8B-PlumCode, sequelbox/Llama3.1-8B-PlumMath, ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.2
Model Size: 24.9B parameters
Required VRAM: 50.1 GB
Updated: 2025-02-05
Maintainer: moeru-ai
Model Type: mixtral
Model Files (11 safetensors shards): 4.9 GB (1-of-11), 5.0 GB (2-of-11), 4.9 GB (3-of-11), 5.0 GB (4-of-11), 5.0 GB (5-of-11), 4.9 GB (6-of-11), 5.0 GB (7-of-11), 5.0 GB (8-of-11), 4.9 GB (9-of-11), 4.4 GB (10-of-11), 1.1 GB (11-of-11)
Model Architecture: MixtralForCausalLM
License: llama3.1
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.45.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|begin_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
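
The Merge, Mergekit, and Frankenmoe tags indicate this checkpoint was assembled offline with mergekit's MoE mode: the four base models above become the experts of a Mixtral-style network rather than being trained together from scratch. A minimal sketch of what such a mergekit-moe config could look like follows; the shared base_model choice, gate_mode, and positive_prompts are illustrative assumptions, not the maintainer's published settings:

    # moe-config.yml (hypothetical; mergekit-moe schema)
    base_model: argilla-warehouse/Llama-3.1-8B-MagPie-Ultra  # assumed shared backbone
    gate_mode: hidden        # initialize routers from hidden states of the prompts below
    dtype: bfloat16
    experts:
      - source_model: argilla-warehouse/Llama-3.1-8B-MagPie-Ultra
        positive_prompts: ["chat", "general assistance"]
      - source_model: sequelbox/Llama3.1-8B-PlumCode
        positive_prompts: ["code", "programming"]
      - source_model: sequelbox/Llama3.1-8B-PlumMath
        positive_prompts: ["math", "step-by-step reasoning"]
      - source_model: ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.2
        positive_prompts: ["roleplay", "creative writing"]

Running mergekit-moe moe-config.yml ./out emits a MixtralForCausalLM checkpoint in which the attention, embedding, and norm weights come from the shared backbone while each expert contributes its own MLP weights, which is why the result loads under the Mixtral architecture listed above.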

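The 24.9B parameter count and roughly 50 GB of weights follow directly from this construction: only the per-layer MLP experts are replicated four times, while embeddings, attention, and the LM head stay shared. A back-of-the-envelope check in Python, assuming the published Llama-3.1-8B dimensions (hidden size 4096, MLP size 14336, 32 layers, GQA with 8 KV heads of head_dim 128, vocabulary 128256):

    # Rough parameter count for a 4x8B Llama-3.1 frankenMoE
    # (layer norms and router gates omitted; they are negligible)
    hidden, inter, layers, vocab = 4096, 14336, 32, 128256
    kv_dim = 8 * 128                                    # GQA: 8 KV heads, head_dim 128

    embed = 2 * vocab * hidden                          # input embeddings + LM head
    attn = layers * (2 * hidden * hidden + 2 * hidden * kv_dim)   # q/o plus k/v projections
    mlp_per_expert = layers * 3 * hidden * inter        # gate, up, down projections
    total = embed + attn + 4 * mlp_per_expert

    print(f"total: {total / 1e9:.1f}B parameters")      # -> 24.9B
    print(f"bf16:  {total * 2 / 1e9:.1f} GB weights")   # -> 49.9 GB, matching the shard total

With Mixtral's default top-2 routing, only two experts run per token, so roughly 13.7B parameters are active per forward pass even though all 24.9B must sit in VRAM.
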
Best Alternatives to L3.1 MoE 4x8B V0.1

Best Alternatives                          Context / RAM     Downloads   Likes
L3.1 ClaudeMaid 4x8B                       128K / 50.1 GB    8           7
L3.1 MoE 4x8B V0.2                         128K / 50.1 GB    11          2
Llama Salad 4x8B V3                        8K / 50.1 GB      4           5
L3 MoE 4X8B Grand Horror 25B               8K / 50.1 GB      7           0
L3 MoE 4x8B Dark Planet Rising 25B         8K / 50.1 GB      6           0
L3 MoE 4x8B Dark Planet Rebel FURY 25B     8K / 50.1 GB      5           0
OpenCrystal V4 L3 4x8B                     8K / 50 GB        3           2
L3 SnowStorm V1.15 4x8B B                  8K / 49.9 GB      29          9
Llama 3 Aplite Instruct 4x8B MoE           8K / 50 GB        137         38
L3 SnowStorm V1.15 4x8B A                  8K / 49.9 GB      20          1
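
All of the models above share the Mixtral architecture and load the same way; swapping the repo id is enough to try an alternative. A minimal sketch using the standard transformers API (the prompt and generation settings are illustrative; this assumes the tokenizer ships a Llama-3.1-style chat template, and from_pretrained fetches all 11 shards automatically, needing about 50 GB of GPU memory at bfloat16):

    # pip install torch transformers accelerate
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "moeru-ai/L3.1-Moe-4x8B-v0.1"   # or any repo id from the table above

    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.bfloat16,        # matches the checkpoint's Torch Data Type
        device_map="auto",                 # let accelerate place layers across GPUs
    )

    messages = [{"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=128)
    print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))

The full 131072-token context from the spec table is available without extra configuration; long prompts mainly add KV-cache memory on top of the weights.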


Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241227