LLM Explorer: A Curated Large Language Model Directory and Analytics

Deepseek Coder 6.7B Instruct GPTQ by TheBloke


Tags: 4-bit, Autotrain compatible, Base model: deepseek-ai/deepsee..., Codegen, Conversational, GPTQ, Instruct, License: other, Llama, Quantized, Region: US, Safetensors

Deepseek Coder 6.7B Instruct GPTQ Benchmarks

Community ratings for Deepseek Coder 6.7B Instruct GPTQ (TheBloke/deepseek-coder-6.7B-instruct-GPTQ) cover the following capability areas:

Instruction Following and Task Automation
Factuality and Completeness of Knowledge
Censorship and Alignment
Data Analysis and Insight Generation
Text Generation
Text Summarization and Feature Extraction
Code Generation
Multi-Language Support and Translation

Best Alternatives to Deepseek Coder 6.7B Instruct GPTQ

Best Alternatives                      Context / RAM    Downloads  Likes
...k Coder 6.7 Evol Feedback 4bit      16K / 3.9 GB     7          0
...oder 6.7B Instruct Hf 4bit Mlx      16K / 4 GB       5          0
...rpreter DS 6.7B 6.0bpw H6 EXL2      16K / 5.2 GB     0          1
...rpreter DS 6.7B 8.0bpw H8 EXL2      16K / 6.9 GB     0          1
Test                                   16K / 0.3 GB     27         0
...epseek Coder 6.7B Instruct AWQ      16K / 3.9 GB     949        11
OpenCodeInterpreter DS 6.7B            16K / 13.5 GB    0          72
NaturalSQL 6.7B V0                     16K / 13.5 GB    10         35
Speechless Coder Ds 6.7B               16K / 13.5 GB    199        14
Code 290K 6.7B Instruct                16K / 13.5 GB    0          3
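The RAM figures in the table track the quantization level: 4-bit builds of this 6.7B model land near 4 GB, while the full-precision entries sit around 13.5 GB. As a rough rule of thumb (an assumption for illustration, not the site's methodology), quantized weight memory is roughly parameters × bits per weight / 8, with quantization metadata and any unquantized layers adding overhead:

```python
def approx_weight_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Rough footprint of quantized weights in decimal GB.

    Counts weights only; ignores KV cache, activations, and
    quantization metadata (scales/zero-points) overhead.
    """
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# 6.7B parameters at 4-bit GPTQ: ~3.35 GB of raw weights; the listed
# 3.9 GB file also carries quantization metadata and unquantized layers.
print(round(approx_weight_gb(6.7, 4), 2))   # ~3.35
# 16-bit baseline, consistent with the ~13.5 GB full-precision entries:
print(round(approx_weight_gb(6.7, 16), 1))  # ~13.4
```

This explains why the GPTQ and AWQ variants both land at 3.9 GB, and why the EXL2 builds scale with their bits-per-weight (6.0bpw → 5.2 GB, 8.0bpw → 6.9 GB).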

Deepseek Coder 6.7B Instruct GPTQ Parameters and Internals

LLM Name: Deepseek Coder 6.7B Instruct GPTQ
Repository: TheBloke/deepseek-coder-6.7B-instruct-GPTQ (on Hugging Face)
Model Name: Deepseek Coder 6.7B Instruct
Model Creator: DeepSeek
Base Model(s): deepseek-ai/deepseek-coder-6.7b-instruct
Model Size: 6.7B
Required VRAM: 3.9 GB
Updated: 2024-02-29
Maintainer: TheBloke
Model Type: deepseek
Instruction-Based: Yes
Model Files: 3.9 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.35.0
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|EOT|>
Vocabulary Size: 32256
Initializer Range: 0.02
Torch Data Type: bfloat16
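The parameters above describe an instruction-tuned code model whose end-of-sequence token is <|EOT|> and whose context window is 16384 tokens. A minimal sketch of building a Deepseek-Coder-style instruction prompt follows; the "### Instruction:" / "### Response:" layout and the system text are assumptions based on the upstream model card, so verify them against the repository's own chat template before relying on them:

```python
# Assumed system text; check the model card for the exact wording.
SYSTEM = (
    "You are an AI programming assistant, utilizing the Deepseek Coder model, "
    "developed by Deepseek Company, and you only answer questions related to "
    "computer science."
)

def build_prompt(instruction: str, system: str = SYSTEM) -> str:
    """Assemble a single-turn instruct prompt in the assumed
    '### Instruction:' / '### Response:' layout."""
    return f"{system}\n### Instruction:\n{instruction}\n### Response:\n"

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

When running the GPTQ files (for example via Transformers with a GPTQ-capable backend), pass a string like this to the tokenizer and stop generation at <|EOT|>, the end-of-sequence token listed above.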
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003