Deepseek Coder 33B Base by deepseek-ai


Tags: Autotrain compatible, Codegen, Endpoints compatible, Llama, Pytorch, Region:us, Safetensors, Sharded, Tensorflow

Deepseek Coder 33B Base Benchmarks

Deepseek Coder 33B Base (deepseek-ai/deepseek-coder-33b-base)

Deepseek Coder 33B Base Parameters and Internals

Model Type: code generation, code completion

Use Cases
  Areas: research, commercial applications
  Applications: project-level code completion, infilling tasks
  Primary Use Cases: code generation, code completion, repository-level code completion
  Limitations: not specified

Supported Languages: English (high proficiency), Chinese (high proficiency)

Training Details
  Data Sources: project-level code corpus
  Data Volume: 2 trillion tokens
  Methodology: Grouped-Query Attention
  Context Length: 16,384 tokens (16K)

Input / Output
  Accepted Modalities: text
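For reference, a minimal usage sketch for the plain code-completion use case, using the standard Hugging Face transformers API. The model ID is taken from the repository link below; the prompt and generation settings are illustrative, not prescribed by the model card.

```python
# Minimal sketch: left-to-right code completion with Hugging Face transformers.
# Assumes enough GPU memory for the bfloat16 weights (~66.5 GB, see below);
# device_map="auto" spreads the shards across the available GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-33b-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed below
    device_map="auto",
)

prompt = "# write a quick sort algorithm in python\ndef quick_sort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```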
LLM Name: Deepseek Coder 33B Base
Repository: https://huggingface.co/deepseek-ai/deepseek-coder-33b-base
Model Size: 33b
Required VRAM: 66.5 GB
Updated: 2025-02-05
Maintainer: deepseek-ai
Model Type: llama
Model Files (7 safetensors shards): 9.7 GB (1-of-7), 9.9 GB (2-of-7), 9.9 GB (3-of-7), 9.8 GB (4-of-7), 9.9 GB (5-of-7), 9.9 GB (6-of-7), 7.4 GB (7-of-7)
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
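The context length, vocabulary size, special tokens, and data type listed above can be checked directly against the published config and tokenizer. A small sketch, using standard LlamaConfig/tokenizer attributes; the expected values in the comments come from the table above:

```python
# Minimal sketch: read the published config/tokenizer and print the fields
# listed above (architecture, context length, vocab size, dtype, BOS/EOS).
from transformers import AutoConfig, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-33b-base"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print("architecture:  ", config.architectures)            # ['LlamaForCausalLM']
print("context length:", config.max_position_embeddings)  # 16384
print("vocab size:    ", config.vocab_size)                # 32256
print("torch dtype:   ", config.torch_dtype)               # bfloat16
print("bos / eos:     ", tokenizer.bos_token, tokenizer.eos_token)
```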

Quantized Models of the Deepseek Coder 33B Base

Model                          Likes   Downloads   VRAM
Deepseek Coder 33B Base GGUF   8       1545        14 GB
Deepseek Coder 33B Base GPTQ   2       43          17 GB
Deepseek Coder 33B Base AWQ    4       11          18 GB
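The GGUF variant can run on CPU or a single consumer GPU through llama.cpp bindings. A minimal sketch with llama-cpp-python, assuming a quantized file has already been downloaded locally; the file name below is illustrative, and the ~14 GB figure above corresponds to a low-bit quantization level:

```python
# Minimal sketch: run a GGUF quantization of the model with llama-cpp-python.
# The local file path is an assumption; pick whichever quantization level
# fits your hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./deepseek-coder-33b-base.Q4_K_M.gguf",  # illustrative path
    n_ctx=16384,      # full 16K context window
    n_gpu_layers=-1,  # offload all layers to GPU if one is available
)

out = llm(
    "# write a quick sort algorithm in python\ndef quick_sort(arr):",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```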

Best Alternatives to Deepseek Coder 33B Base

Best Alternatives               Context / RAM    Downloads   Likes
ReflectionCoder DS 33B          16K / 67 GB      416         24
Deepseek Wizard 33B Slerp       16K / 35.3 GB    8           0
Deepseek Coder 33B Instruct     16K / 66.5 GB    14146       482
WhiteRabbitNeo 33B V1           16K / 67 GB      1286        84
ValidateAI 3 33B Ties           16K / 66.5 GB    7           0
ValidateAI 2 33B AT             16K / 66.5 GB    10          0
Everyone Coder 33B Base         16K / 66.5 GB    55          18
Fortran2Cpp                     16K / 67.3 GB    7           3
F2C Translator                  16K / 67.3 GB    5           1
Llm4decompile 33B               16K / 66.5 GB    12          7
Note: a green score (e.g. "73.2") indicates that the model outperforms deepseek-ai/deepseek-coder-33b-base.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227