XDAN APUS4 MoE V3.1 0410 by xDAN2099


Tags: Autotrain compatible, Conversational, Endpoints compatible, Mixtral, MoE, Region: US, Safetensors, Sharded, TensorFlow

XDAN APUS4 MoE V3.1 0410 (xDAN2099/xDAN-APUS4-MoE-v3.1-0410)

Best Alternatives to XDAN APUS4 MoE V3.1 0410

Model                       Context   Required VRAM   Downloads   Likes
XDAN APUS4.0 MoE Initial    195K      227.5 GB        14          0
Helion 4x34B                195K      227.7 GB        3389        11
Astralis 4x34B              195K      227.7 GB        3388        3
APUS XDAN 4.0 MoE V2        32K       227.6 GB        161         0

XDAN APUS4 MoE V3.1 0410 Parameters and Internals

LLM Name: XDAN APUS4 MoE V3.1 0410
Repository: xDAN2099/xDAN-APUS4-MoE-v3.1-0410 (Hugging Face)
Model Size: 113.7B parameters
Required VRAM: 227.6 GB
Updated: 2024-05-20
Maintainer: xDAN2099
Model Type: mixtral
Model Files (24 sharded safetensors): 1-of-24: 9.9 GB, 2-of-24: 9.9 GB, 3-of-24: 9.8 GB, 4-of-24: 10.0 GB, 5-of-24: 10.0 GB, 6-of-24: 9.8 GB, 7-of-24: 9.9 GB, 8-of-24: 9.8 GB, 9-of-24: 9.8 GB, 10-of-24: 9.9 GB, 11-of-24: 9.8 GB, 12-of-24: 10.0 GB, 13-of-24: 10.0 GB, 14-of-24: 9.8 GB, 15-of-24: 9.9 GB, 16-of-24: 9.8 GB, 17-of-24: 9.8 GB, 18-of-24: 9.9 GB, 19-of-24: 9.8 GB, 20-of-24: 10.0 GB, 21-of-24: 10.0 GB, 22-of-24: 9.8 GB, 23-of-24: 9.9 GB, 24-of-24: 0.3 GB
Model Architecture: MixtralForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|im_start|>
Vocabulary Size: 64000
Initializer Range: 0.02
Torch Data Type: bfloat16
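As a quick sanity check (my own arithmetic, not taken from the listing): the 24 shard sizes under "Model Files" should add up to roughly the stated 227.6 GB of required VRAM, and 113.7B parameters stored in bfloat16 (2 bytes each) give a nearby estimate. The small gap between the two figures is plausibly rounding plus non-weight tensors in the checkpoint.

```python
# Shard sizes in GB, copied from the "Model Files" list above.
shards = [9.9, 9.9, 9.8, 10.0, 10.0, 9.8, 9.9, 9.8, 9.8, 9.9, 9.8, 10.0,
          10.0, 9.8, 9.9, 9.8, 9.8, 9.9, 9.8, 10.0, 10.0, 9.8, 9.9, 0.3]
shard_total_gb = sum(shards)

# bfloat16 stores each parameter in 2 bytes, so 113.7B parameters
# need about 113.7e9 * 2 bytes of weight memory.
params = 113.7e9
bf16_estimate_gb = params * 2 / 1e9

print(f"shard total:      {shard_total_gb:.1f} GB")   # -> 227.6 GB
print(f"bf16 estimate:    {bf16_estimate_gb:.1f} GB")  # -> 227.4 GB
```

Note this only covers the weights themselves; serving the model at the full 32768-token context also needs KV-cache memory on top of the 227.6 GB.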

Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024042801