Deepseek MoE 16B Base by vltnmmdv


Tags: Arxiv:1910.09700 · Autotrain compatible · Custom code · Deepseek with concentration · MoE · Region:us · Safetensors · Sharded · Tensorflow

Deepseek MoE 16B Base Benchmarks

Benchmark scores (shown as nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Model evaluated: Deepseek MoE 16B Base (vltnmmdv/deepseek-moe-16b-base)

Deepseek MoE 16B Base Parameters and Internals

LLM Name: Deepseek MoE 16B Base
Repository 🤗: https://huggingface.co/vltnmmdv/deepseek-moe-16b-base
Model Size: 16.4B parameters
Required VRAM: 65.6 GB
Updated: 2025-02-22
Maintainer: vltnmmdv
Model Type: deepseek_with_concentration
Model Files: 14 safetensors shards (5.0 GB each for 1-of-14 through 12-of-14, 4.8 GB for 13-of-14, 0.8 GB for 14-of-14; 65.6 GB total)
Model Architecture: DeepseekFixedForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.43.3
Vocabulary Size: 102400
Torch Data Type: float32
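
The float32 data type accounts for the listed VRAM requirement: 16.4 billion parameters at 4 bytes each is roughly 65.6 GB. Because the repository carries the Custom code tag and uses the non-standard deepseek_with_concentration model type, loading it through Transformers would normally require trust_remote_code=True. Below is a minimal loading sketch, assuming a standard Transformers (>= 4.43.3) environment; the dtype and device placement choices are illustrative assumptions, not settings documented on this page.

# Minimal loading sketch (assumptions: transformers >= 4.43.3, sufficient GPU/CPU memory).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "vltnmmdv/deepseek-moe-16b-base"

tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    trust_remote_code=True,      # needed for the custom DeepseekFixedForCausalLM class
    torch_dtype=torch.bfloat16,  # assumption: halves the 65.6 GB float32 footprint
    device_map="auto",           # spreads the 14 shards across available devices
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))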

Rank the Deepseek MoE 16B Base Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227