CodeFuse DeepSeek 33B 4bits by codefuse-ai


Tags: 4bit · Autotrain compatible · Codegen · Endpoints compatible · GPTQ · Llama · PyTorch · Quantized · Region: us

CodeFuse DeepSeek 33B 4bits Benchmarks

CodeFuse DeepSeek 33B 4bits (codefuse-ai/CodeFuse-DeepSeek-33B-4bits)

CodeFuse DeepSeek 33B 4bits Parameters and Internals

Model Type: text-generation
Training Details
Methodology: Fine-tuned with QLoRA on multiple code-related tasks.
Input / Output
Input Format: A single concatenated string built by joining the conversation turns in the training-data format (a minimal sketch of the concatenation follows below).
Accepted Modalities: text
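
The exact role markers used to stitch conversation turns together are defined in the repository's README and tokenizer configuration; the markers and helper below (HUMAN_MARK, BOT_MARK, build_prompt) are illustrative placeholders showing the concatenation idea, not the confirmed template.

```python
# Minimal sketch: flattening a conversation into the single concatenated
# string the model expects. HUMAN_MARK / BOT_MARK are placeholder role
# markers -- substitute the exact template from the model card.
HUMAN_MARK = "<s>human\n"  # assumed marker, verify against the repository
BOT_MARK = "<s>bot\n"      # assumed marker, verify against the repository

def build_prompt(turns):
    """turns: list of (role, text) pairs, role in {"human", "bot"}."""
    parts = []
    for role, text in turns:
        marker = HUMAN_MARK if role == "human" else BOT_MARK
        parts.append(f"{marker}{text}\n")
    parts.append(BOT_MARK)  # end with the bot marker so the model answers next
    return "".join(parts)

prompt = build_prompt([("human", "Write a Python function that reverses a string.")])
```
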
Release Notes
2024-01-12: CodeFuse-DeepSeek-33B-4bits released, achieving 78.05% pass@1 on HumanEval.
2024-01-12: CodeFuse-DeepSeek-33B released, achieving a 78.65% HumanEval score.
2023-11-10: CodeFuse-CodeGeeX2-6B released with a HumanEval score of 45.12%.
2023-10-20: CodeFuse-QWen-14B technical documentation released.
2023-10-16: CodeFuse-QWen-14B released with a HumanEval score of 48.78%.
2023-09-27: CodeFuse-StarCoder-15B released with a HumanEval score of 54.9%.
2023-09-26: 4-bit quantized CodeFuse-CodeLlama-34B released with a 73.8% HumanEval score.
2023-09-11: CodeFuse-CodeLlama-34B achieved 74.4% on HumanEval, an open-source SOTA at the time.
LLM Name: CodeFuse DeepSeek 33B 4bits
Repository: 🤗 https://huggingface.co/codefuse-ai/CodeFuse-DeepSeek-33B-4bits
Base Model(s): CodeFuse DeepSeek 33B (codefuse-ai/CodeFuse-DeepSeek-33B)
Model Size: 33B
Required VRAM: 18.7 GB
Updated: 2025-02-05
Maintainer: codefuse-ai
Model Type: llama
Model Files: 18.7 GB
GPTQ Quantization: Yes
Quantization Type: gptq|4bit
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.33.2
Tokenizer Class: LlamaTokenizerFast
Beginning of Sentence Token: <|begin▁of▁sentence|>
End of Sentence Token: <|end▁of▁sentence|>
Padding Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Torch Data Type: bfloat16
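
Given the metadata above (GPTQ 4-bit weights, LlamaForCausalLM architecture, 16K context, bfloat16), a plausible way to load and run the checkpoint is through Hugging Face transformers with the optimum and auto-gptq backends installed. This is a sketch under those assumptions, not the maintainer's reference code, and the prompt string is again a placeholder template.

```python
# Hedged sketch: loading the 4-bit GPTQ checkpoint listed above with
# Hugging Face transformers (GPTQ loading needs optimum + auto-gptq;
# transformers >= 4.32, consistent with the 4.33.2 version in the table).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codefuse-ai/CodeFuse-DeepSeek-33B-4bits"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",           # ~18.7 GB of weights; a 24 GB GPU is a realistic target
    torch_dtype=torch.bfloat16,  # matches the Torch data type reported above
)

# Placeholder prompt template -- replace with the format from the model card.
prompt = "<s>human\nWrite a quicksort function in Python.\n<s>bot\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=False,
    eos_token_id=tokenizer.eos_token_id,  # <|end▁of▁sentence|>
    pad_token_id=tokenizer.eos_token_id,  # padding token equals EOS per the table
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```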

Best Alternatives to CodeFuse DeepSeek 33B 4bits

Best Alternatives                     | Context / RAM  | Downloads | Likes
Everyone Coder 33B Base GPTQ          | 16K / 17.4 GB  | 18        | 3
...epseek Coder 33B Instruct GPTQ     | 16K / 17.4 GB  | 230       | 26
Deepseek Coder 33B Base GPTQ          | 16K / 17.4 GB  | 43        | 2
Vicuna 33B Coder GPTQ                 | 2K / 16.9 GB   | 18        | 1
...erpreter DS 33B 4.0bpw H6 EXL2     | 16K / 17.1 GB  | 5         | 4
...erpreter DS 33B 8.0bpw H8 EXL2     | 16K / 33.5 GB  | 5         | 2
...rpreter DS 33B 4.65bpw H6 EXL2     | 16K / 19.8 GB  | 4         | 2
...erpreter DS 33B 5.0bpw H6 EXL2     | 16K / 21.2 GB  | 5         | 1
...erpreter DS 33B 6.0bpw H6 EXL2     | 16K / 25.3 GB  | 5         | 1
...der 33B V2 Base 8.0bpw H8 EXL2     | 16K / 33.5 GB  | 3         | 1

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227