Deepseek Coder 1.3B Base AWQ by TheBloke


Tags: 4-bit, AutoTrain compatible, AWQ, Base model: deepseek-ai/deepseek-coder-1.3b-base, Base model (quantized): deepseek-ai/deepseek-coder-1.3b-base, Codegen, Llama, Quantized, Region: US, Safetensors

Deepseek Coder 1.3B Base AWQ Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Deepseek Coder 1.3B Base AWQ (TheBloke/deepseek-coder-1.3b-base-AWQ)

Deepseek Coder 1.3B Base AWQ Parameters and Internals

Model Type 
deepseek
Use Cases 
Areas:
Research, Commercial applications
Primary Use Cases:
Code completion, Code insertion, Project-level code understanding
Limitations:
Limited to code and code-related natural-language input
Considerations:
Ensure usage aligns with the code model's focus on programming tasks.
Additional Notes 
Part of a series of code language models ranging from 1.3B to 33B parameters; this variant is the 1.3B model.
Supported Languages 
English (fluent), Chinese (fluent)
Training Details 
Data Sources:
2T tokens, of which 87% is code and 13% is natural-language data
Data Volume:
2T tokens
Methodology:
Pre-trained with a 16K context window and an additional fill-in-the-blank task to support code completion and insertion (see the prompt sketch after this block).
Context Length:
16384
Model Architecture:
Multi-Head Attention
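
Because the base model was trained with this fill-in-the-blank objective, code insertion is done by arranging the prompt around the missing region. A minimal sketch follows; the sentinel token strings are assumptions taken from the upstream DeepSeek Coder documentation and should be verified against this checkpoint's tokenizer before use.

```python
# Minimal fill-in-the-middle (code insertion) prompt sketch.
# ASSUMPTION: the sentinel strings below follow the upstream DeepSeek Coder docs;
# verify them against the tokenizer's special tokens before relying on them.
prefix = "def fib(n):\n    if n < 2:\n        return n\n"
suffix = "\nprint(fib(10))\n"

fim_prompt = f"<｜fim▁begin｜>{prefix}<｜fim▁hole｜>{suffix}<｜fim▁end｜>"

# Feed `fim_prompt` to the model like any raw completion prompt (see the loading
# sketch after the Input Output block); the generated text is the code that
# belongs in the hole between prefix and suffix.
```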
Input Output 
Input Format:
{prompt}
Accepted Modalities:
text
Output Format:
Generated code or text completion
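
Since this is a base (non-instruct) checkpoint, the input is simply a raw code prefix in place of {prompt}. A minimal loading and completion sketch, assuming transformers >= 4.35 with the autoawq package installed and a CUDA GPU available:

```python
# Minimal sketch: raw-prompt code completion with the AWQ checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/deepseek-coder-1.3b-base-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Base model: no chat template, just a raw code prefix as the prompt.
prompt = "# Python function that checks whether a number is prime\ndef is_prime(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```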
LLM Name: Deepseek Coder 1.3B Base AWQ
Repository: 🤗 https://huggingface.co/TheBloke/deepseek-coder-1.3b-base-AWQ
Model Name: Deepseek Coder 1.3B Base
Model Creator: DeepSeek
Base Model(s): deepseek-ai/deepseek-coder-1.3b-base
Model Size: 1.3b
Required VRAM: 0.9 GB
Updated: 2025-02-05
Maintainer: TheBloke
Model Type: deepseek
Model Files: 0.9 GB
AWQ Quantization: Yes
Quantization Type: awq
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.35.0
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: float16
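
The architecture, length, and tokenizer fields above can be checked directly against the checkpoint; a minimal verification sketch, assuming transformers is installed:

```python
# Minimal sketch: confirm the architecture, context length, tokenizer class and
# special tokens reported in the listing above directly from the checkpoint.
from transformers import AutoConfig, AutoTokenizer

repo = "TheBloke/deepseek-coder-1.3b-base-AWQ"
cfg = AutoConfig.from_pretrained(repo)
tok = AutoTokenizer.from_pretrained(repo)

print(cfg.architectures)            # expected: ['LlamaForCausalLM']
print(cfg.max_position_embeddings)  # expected: 16384
print(cfg.vocab_size)               # expected: 32256
print(type(tok).__name__)           # expected: LlamaTokenizerFast
print(tok.bos_token, tok.eos_token) # BOS/EOS tokens as listed above
print(tok.model_max_length)         # expected: 16384
```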

Best Alternatives to Deepseek Coder 1.3B Base AWQ

Best Alternatives                    Context / RAM    Downloads  Likes
...epseek Coder 1.3B Instruct AWQ    8K / 0.9 GB      44         2
Deepseek Coder 1.3B Instruct         16K / 2.7 GB     34556      107
...c Deepseek Coder 1.3B Instruct    16K / 5.4 GB     132        0
CursorCore DS 1.3B LC                16K / 2.7 GB     120        0
CursorCore DS 1.3B SR                16K / 2.7 GB     120        0
CursorCore DS 1.3B                   16K / 2.7 GB     118        0
Llm4decompile 1.3B V2                16K / 2.7 GB     438        6
Speechless Coder Ds 1.3B             16K / 2.7 GB     1243       0
Deepseek Coder 1.3B Base             16K / 2.7 GB     52425      77
Hpc Coder V2.1.3B                    16K / 2.7 GB     98         4
Note: a green score (e.g. "73.2") means that model performs better than TheBloke/deepseek-coder-1.3b-base-AWQ.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227