Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2 by LoneStriker


  Autotrain compatible   Codegen   Conversational   Endpoints compatible   Exl2   Instruct   Llama   Pytorch   Quantized   Region:us   Safetensors

Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2 (LoneStriker/deepseek-coder-6.7b-instruct-3.0bpw-h6-exl2-2)

Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2 Parameters and Internals

Model Type: code generation, text generation
Training Details:
  Data Sources: project-level code corpus
  Data Volume: 2T tokens
  Methodology: fill-in-the-blank task with a 16K window
  Context Length: 16K (16,384 tokens)
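
The fill-in-the-blank objective means the model can complete code conditioned on both a prefix and a suffix. Below is a minimal infilling sketch; it assumes the unquantized upstream checkpoint deepseek-ai/deepseek-coder-6.7b-instruct and the FIM sentinel strings published with the DeepSeek Coder release, so both should be verified against the actual tokenizer before relying on them.

    # Minimal fill-in-the-middle sketch (assumptions: upstream checkpoint id and
    # FIM sentinel strings; verify both against the tokenizer's special tokens).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "deepseek-ai/deepseek-coder-6.7b-instruct"  # assumed base checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

    # The model fills the region marked by the "hole" sentinel, conditioning on
    # both the prefix and the suffix (up to the 16K-token training window).
    prompt = (
        "<｜fim▁begin｜>def read_config(path):\n"
        '    """Load a JSON config file and return it as a dict."""\n'
        "<｜fim▁hole｜>\n"
        "    return cfg<｜fim▁end｜>"
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
    print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))

In practice the instruct variant is more often driven through its chat template (tokenizer.apply_chat_template), with <|EOT|> marking the end of a turn.
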
LLM Name: Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2
Repository: https://huggingface.co/LoneStriker/deepseek-coder-6.7b-instruct-3.0bpw-h6-exl2-2
Model Size: 7b
Required VRAM: 2.8 GB
Updated: 2024-12-22
Maintainer: LoneStriker
Model Type: llama
Instruction-Based: Yes
Model Files: 2.8 GB
Quantization Type: exl2
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|EOT|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
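
Because the weights are stored in EXL2 format at 3.0 bits per weight (~2.8 GB), this repository is intended for ExLlamaV2-based loaders (exllamav2, TabbyAPI, text-generation-webui) rather than a plain transformers from_pretrained call. The sketch below uses the exllamav2 Python API; the local model path is a placeholder, the sequence length is taken from the table above, and the instruction prompt layout should be checked against the chat template shipped with the upstream model.

    # Sketch: loading the 3.0bpw EXL2 quant with the exllamav2 Python API.
    # Class and method names follow the exllamav2 examples; treat this as a
    # starting point, not a drop-in script.
    from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
    from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

    model_dir = "/models/deepseek-coder-6.7b-instruct-3.0bpw-h6-exl2-2"  # placeholder path

    config = ExLlamaV2Config()
    config.model_dir = model_dir
    config.prepare()
    config.max_seq_len = 16384  # matches Context Length / Model Max Length above

    model = ExLlamaV2(config)
    cache = ExLlamaV2Cache(model, lazy=True)
    model.load_autosplit(cache)  # ~2.8 GB of weights plus KV cache
    tokenizer = ExLlamaV2Tokenizer(config)

    generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
    settings = ExLlamaV2Sampler.Settings()
    settings.temperature = 0.2
    settings.top_p = 0.95

    # Instruction-style prompt; <|EOT|> is the model's end-of-turn token.
    prompt = (
        "### Instruction:\n"
        "Write a Python function that checks whether a string is a palindrome.\n"
        "### Response:\n"
    )
    print(generator.generate_simple(prompt, settings, 256))

Note that the 2.8 GB figure covers the weight files only; the KV cache for long contexts requires additional VRAM on top of that.
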

Best Alternatives to Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2

Best Alternatives                    Context / RAM    Downloads   Likes
...ruct Solidity Bnb 4bit Smashed    16K / 4.2 GB     14          0
...B Instruct Hf Bnb 4bit Smashed    16K / 4.2 GB     21          0
CodelLama7B Inst DPO 7K Mlx          16K / 4.2 GB     8           2
...eLlama 7B Instruct Hf 4bit MLX    16K / 4.2 GB     12          1
...6.7B Instruct 8.0bpw H8 EXL2 2    16K / 6.8 GB     9           2
... 7B Instruct Nf4 Fp16 Upscaled    16K / 13.5 GB    15          0
CodeLlama 7B Instruct Fp16           16K / 13.5 GB    33          8
...Llama 7B Instruct Bf16 Sharded    16K / 13.5 GB    16          1
...B Instruct V1.5 6.0bpw H6 EXL2    4K / 5.7 GB      7           2
...B Instruct V1.5 8.0bpw H8 EXL2    4K / 7.3 GB      11          1
Note: a green score (e.g. "73.2") indicates that the model outperforms LoneStriker/deepseek-coder-6.7b-instruct-3.0bpw-h6-exl2-2.

Rank the Deepseek Coder 6.7B Instruct 3.0bpw H6 EXL2 2 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you looking for? 40,066 models are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217