L3 MoE 4x8B Dark Planet Rebel FURY 25B by DavidAU


  Autotrain compatible   Conversational   Endpoints compatible   Merge   Mergekit   Mixtral   Mixture of experts   Moe   Region:us   Safetensors   Sharded   Tensorflow

L3 MoE 4x8B Dark Planet Rebel FURY 25B Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
L3 MoE 4x8B Dark Planet Rebel FURY 25B (DavidAU/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B)

L3 MoE 4x8B Dark Planet Rebel FURY 25B Parameters and Internals

LLM Name: L3 MoE 4x8B Dark Planet Rebel FURY 25B
Repository 🤗: https://huggingface.co/DavidAU/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B
Model Size: 24.9B
Required VRAM: 50.1 GB
Updated: 2025-01-28
Maintainer: DavidAU
Model Type: mixtral
Model Files: 4.9 GB (1-of-11), 5.0 GB (2-of-11), 4.9 GB (3-of-11), 5.0 GB (4-of-11), 5.0 GB (5-of-11), 4.9 GB (6-of-11), 5.0 GB (7-of-11), 5.0 GB (8-of-11), 4.9 GB (9-of-11), 4.4 GB (10-of-11), 1.1 GB (11-of-11)
Model Architecture: MixtralForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.46.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|begin_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
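A quick back-of-the-envelope check (a sketch, not part of the original listing) shows the figures above are internally consistent: 24.9B parameters stored in bfloat16 (2 bytes per parameter) comes to roughly the listed 50.1 GB, and the 11 shard sizes sum to exactly that total.

```python
# Sanity check on the listed numbers (values taken from the table above).
params = 24.9e9            # model size from the listing
bytes_per_param = 2        # bfloat16 = 2 bytes per parameter
weight_gb = params * bytes_per_param / 1e9
print(round(weight_gb, 1))  # ~49.8 GB of raw weights

# The 11 sharded safetensors files listed under "Model Files":
shards = [4.9, 5.0, 4.9, 5.0, 5.0, 4.9, 5.0, 5.0, 4.9, 4.4, 1.1]
print(round(sum(shards), 1))  # 50.1 GB total across the shards
```

The small gap between the raw-weight estimate (~49.8 GB) and the listed 50.1 GB is plausibly metadata and rounding in the per-shard sizes; actual VRAM use at inference time will be higher once the KV cache and activations are included.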

Best Alternatives to L3 MoE 4x8B Dark Planet Rebel FURY 25B

Best Alternatives                        Context / RAM      Downloads / Likes
L3.1 MoE 4x8B V0.1                       128K / 50.1 GB     233
L3.1 ClaudeMaid 4x8B                     128K / 50.1 GB     87
L3.1 MoE 4x8B V0.2                       128K / 50.1 GB     112
Llama Salad 4x8B V3                      8K / 50.1 GB       45
L3 MoE 4X8B Grand Horror 25B             8K / 50.1 GB       70
...oE 4x8B Dark Planet Rising 25B        8K / 50.1 GB       60
OpenCrystal V4 L3 4x8B                   8K / 50 GB         32
L3 SnowStorm V1.15 4x8B B                8K / 49.9 GB       299
...ama 3 Aplite Instruct 4x8B MoE        8K / 50 GB         13738
L3 SnowStorm V1.15 4x8B A                8K / 49.9 GB       201
Note: green Score (e.g. "73.2") means that the model is better than DavidAU/L3-MOE-4x8B-Dark-Planet-Rebel-FURY-25B.

Rank the L3 MoE 4x8B Dark Planet Rebel FURY 25B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227