Deepseek Coder 1.3B Base by deepseek-ai


Tags: Autotrain compatible · Codegen · Endpoints compatible · Llama · PyTorch · Region: US

Deepseek Coder 1.3B Base Benchmarks

Deepseek Coder 1.3B Base (deepseek-ai/deepseek-coder-1.3b-base)

Deepseek Coder 1.3B Base Parameters and Internals

Model Type: code completion (state-of-the-art)
Use Cases:
Areas: research, commercial applications
Applications: code completion, code inference
Supported Languages: English (high proficiency), Chinese (high proficiency)
Training Details:
Data Sources: 87% code, 13% natural language
Data Volume: 2 trillion tokens
Methodology: multi-head attention, project-level code corpus, 16K context window, fill-in-the-blank (fill-in-the-middle) task
Context Length: 16,384 tokens
Model Architecture: Multi-Head Attention (Llama-style)
Input/Output:
Accepted Modalities: code
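Because the model was trained with a fill-in-the-blank (fill-in-the-middle) objective, it can complete a gap in the middle of a file rather than only continuing from the end. A minimal sketch of building such a prompt is below; the special-token spellings are taken from the DeepSeek Coder model card and should be verified against the tokenizer of the checkpoint you actually load.

```python
# Sketch: building a fill-in-the-middle (FIM) prompt for DeepSeek Coder.
# Token spellings are an assumption based on the DeepSeek Coder documentation;
# confirm them against the loaded tokenizer before relying on them.
FIM_BEGIN = "<｜fim▁begin｜>"
FIM_HOLE = "<｜fim▁hole｜>"
FIM_END = "<｜fim▁end｜>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before and after the gap; the model generates the middle."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

prompt = build_fim_prompt(
    prefix="def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n",
    suffix="\n    return quicksort(left) + [pivot] + quicksort(right)\n",
)
```

The model's output is then inserted between the prefix and suffix in the original file.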
LLM Name: Deepseek Coder 1.3B Base
Repository: 🤗 https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base
Model Size: 1.3b
Required VRAM: 2.7 GB
Updated: 2025-02-05
Maintainer: deepseek-ai
Model Type: llama
Model Files: 2.7 GB
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
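The 2.7 GB VRAM figure is consistent with simple arithmetic: weight memory is roughly the parameter count times the bytes per parameter, and bfloat16 stores each weight in 2 bytes. A back-of-the-envelope check (assuming ~1.35 billion parameters, since "1.3B" is a rounded figure):

```python
# Rough weight-memory estimate: parameters × bytes per parameter.
# The exact parameter count is an assumption; "1.3B" is rounded.
params = 1.35e9          # approximate parameter count
bytes_per_param = 2      # bfloat16 = 16 bits = 2 bytes
weight_gb = params * bytes_per_param / 1e9
print(f"{weight_gb:.1f} GB")  # prints "2.7 GB"
```

Activations, the KV cache, and framework overhead add to this at inference time, so treat 2.7 GB as a floor rather than a total budget.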

Quantized Models of the Deepseek Coder 1.3B Base

Model | Likes | Downloads | VRAM
Deepseek Coder 1.3B Base GGUF | 6 | 1570 | 0 GB
Deepseek Coder 1.3B Base GPTQ | 2 | 87 | 0 GB
Deepseek Coder 1.3B Base AWQ | 0 | 51 | 0 GB
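The quantized variants trade precision for size. As a rough guide (an estimate, not a figure from the table above), a 4-bit GPTQ or AWQ quantization stores about 0.5 bytes per weight plus some overhead for scales and zero-points:

```python
# Rough on-disk/VRAM size for a quantized 1.3B-parameter model.
# Parameter count and 10% overhead are assumptions for illustration.
params = 1.35e9

def quantized_gb(bits: float, overhead: float = 0.10) -> float:
    """Approximate size in GB for a given bit width, with fractional overhead."""
    return params * (bits / 8) * (1 + overhead) / 1e9

print(round(quantized_gb(4), 2))  # 4-bit: prints 0.74
```

GGUF files vary more, since the format supports many quantization levels (from 2-bit up to full precision) in one container.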

Best Alternatives to Deepseek Coder 1.3B Base

Best Alternatives | Context / RAM | Downloads | Likes
Deepseek Coder 1.3B Instruct | 16K / 2.7 GB | 34556 | 107
...c Deepseek Coder 1.3B Instruct | 16K / 5.4 GB | 132 | 0
CursorCore DS 1.3B LC | 16K / 2.7 GB | 120 | 0
CursorCore DS 1.3B SR | 16K / 2.7 GB | 120 | 0
CursorCore DS 1.3B | 16K / 2.7 GB | 118 | 0
Llm4decompile 1.3B V2 | 16K / 2.7 GB | 438 | 6
Speechless Coder Ds 1.3B | 16K / 2.7 GB | 1243 | 0
Hpc Coder V2.1.3B | 16K / 2.7 GB | 98 | 4
...1.3B Chat And Function Calling | 16K / 2.7 GB | 238 | 0
Deepseek Coder 1.3B Chat | 16K / 2.7 GB | 98 | 1



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227