Deepseek Coder 6.7B Instruct by deepseek-ai

Autotrain compatible · Codegen · Conversational · Endpoints compatible · Instruct · License: other · Llama · PyTorch · Region: US · Safetensors · Sharded · TensorFlow

Deepseek Coder 6.7B Instruct Benchmarks

Rank the Deepseek Coder 6.7B Instruct Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Deepseek Coder 6.7B Instruct (deepseek-ai/deepseek-coder-6.7b-instruct)

Quantized Models of the Deepseek Coder 6.7B Instruct

Model                             | Likes | Downloads | VRAM
...pseek Coder 6.7B Instruct GGUF | 163   | 1136      | 12 GB
...pseek Coder 6.7B Instruct GPTQ | 24    | 410       | 3 GB
...epseek Coder 6.7B Instruct AWQ | 14    | 287       | 3 GB
...pseek Coder 6.7B Instruct GGUF | 5     | 687       | 2 GB
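
The quantized builds above trade some accuracy for a much smaller VRAM footprint. As a rough illustration (not an official example), the sketch below loads a GGUF quant with llama-cpp-python; the repository and file names in the table are truncated, so the filename here is a placeholder for whichever quant you actually download.

```python
# Hypothetical sketch: running a GGUF quant of Deepseek Coder 6.7B Instruct
# with llama-cpp-python. The filename below is a placeholder, not a repo path
# taken from the table above.
from llama_cpp import Llama

llm = Llama(
    model_path="deepseek-coder-6.7b-instruct.Q4_K_M.gguf",  # placeholder quant file
    n_ctx=16384,      # matches the model's 16K context window
    n_gpu_layers=-1,  # offload every layer to the GPU; set to 0 for CPU-only inference
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a quicksort function in Python."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```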

Best Alternatives to Deepseek Coder 6.7B Instruct

Best Alternatives                 | HF Rank | Context/RAM   | Downloads | Likes
...deLlama 7B Instruct Hf 8bits Q |         | 16K / 4.2 GB  | 33        | 1
...CodeLlama 7B Instruct Hf 4bits |         | 16K / 4.2 GB  | 20        | 0
DevPearl 7B Dare Ties             |         | 16K / 13.4 GB | 498       | 1
Codellama 7B Instruct Slerp       |         | 16K / 13.4 GB | 95        | 1
CodeLlama 7B Instruct Hf          |         | 16K / 13.5 GB | 74223     | 187
StructLM 7B                       |         | 16K / 13.5 GB | 32        | 20
...Japanese CodeLlama 7B Instruct |         | 16K / 13.5 GB | 1318      | 17
MathCoder CL 7B                   |         | 16K / 13.5 GB | 80        | 17
Lloro                             |         | 16K / 13.5 GB | 4         | 17
CodeLlama 7B Instruct Solidity    |         | 16K / 13.5 GB | 57        | 9

Deepseek Coder 6.7B Instruct Parameters and Internals

LLM Name: Deepseek Coder 6.7B Instruct
Repository: deepseek-ai/deepseek-coder-6.7b-instruct (open on 🤗)
Model Size: 7b (6.7B parameters)
Required VRAM: 13.5 GB
Updated: 2024-06-18
Maintainer: deepseek-ai
Model Type: llama
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|EOT|>
Vocabulary Size: 32256
Initializer Range: 0.02
Torch Data Type: bfloat16
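
Given the internals listed above (LlamaForCausalLM, 16,384-token context, LlamaTokenizerFast, <|begin▁of▁sentence|>/<|EOT|> special tokens, bfloat16 weights), a minimal sketch of loading the full-precision checkpoint with 🤗 Transformers might look like the following; the prompt and generation settings are illustrative, not the maintainer's reference code.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/deepseek-coder-6.7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizerFast
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # needs roughly the 13.5 GB Required VRAM listed above
)

# The instruct variant is conversational; the tokenizer's chat template
# (transformers >= 4.34) inserts the <|begin▁of▁sentence|> BOS and <|EOT|>
# end-of-turn tokens listed above.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

In bfloat16 the weights alone occupy about the 13.5 GB listed under Required VRAM, so the quantized builds above are the usual route on smaller GPUs.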


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801