WizardCoder Python 34B V1.0 AWQ by TheBloke


Tags: Arxiv:2303.08774 · Arxiv:2304.12244 · Arxiv:2306.08568 · Arxiv:2308.09583 · 4-bit · Autotrain compatible · AWQ · Base model:quantized:wizardlmt... · Base model:wizardlmteam/wizard... · Code · Codegen · Llama · Model-index · Quantized · Region:us · Safetensors · Sharded · Tensorflow

WizardCoder Python 34B V1.0 AWQ Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
WizardCoder Python 34B V1.0 AWQ (TheBloke/WizardCoder-Python-34B-V1.0-AWQ)

WizardCoder Python 34B V1.0 AWQ Parameters and Internals

Model Type: text-generation
Use Cases
  Areas: research, commercial applications
  Applications: code generation, text generation
  Primary Use Cases: enhanced code generation for Python
Additional Notes: The model uses AWQ (Activation-aware Weight Quantization) for improved deployment efficiency.
Supported Languages: Python
Training Details
  Data Sources: Evol Instruct Code
  Data Volume: 80k
  Methodology: Low-bit weight quantization (AWQ)
  Context Length: 4096
  Hardware Used: 1 x 48GB GPU, 2 x 80GB GPU
  Model Architecture: Transformer-based
Input Output
  Input Format: Instruction-style prompt followed by the model's response (Alpaca-style template; a sketch follows the release notes below).
  Accepted Modalities: text
  Output Format: Textual code and task responses.
  Performance Tips: Load with an AWQ-aware runtime and matching quantization parameters for optimal performance.
Release Notes
  Version: 1.0
  Date: 2023/08/26
  Notes: Initial release of WizardCoder-Python-34B-V1.0 with AWQ quantization.
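The upstream model card documents an Alpaca-style prompt template for this model. The following is a minimal sketch, in Python, of wrapping an instruction in that template; the build_prompt helper is illustrative and not part of any library.

```python
# Alpaca-style prompt template documented in the upstream WizardCoder model card.
# `build_prompt` is an illustrative helper, not part of any library.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str) -> str:
    """Wrap a plain-text instruction in the Alpaca-style template."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    print(build_prompt("Write a Python function that checks whether a number is prime."))
```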
LLM Name: WizardCoder Python 34B V1.0 AWQ
Repository: https://huggingface.co/TheBloke/WizardCoder-Python-34B-V1.0-AWQ
Model Creator: WizardLM
Base Model(s): WizardLM/WizardCoder-Python-34B-V1.0
Model Size: 34b
Required VRAM: 18.3 GB
Updated: 2024-12-22
Maintainer: TheBloke
Model Type: llama
Model Files: 9.9 GB (1-of-2), 8.4 GB (2-of-2)
AWQ Quantization: Yes
Quantization Type: awq
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.31.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32001
Torch Data Type: float16
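Given the AWQ quantization and LlamaForCausalLM architecture listed above, a minimal loading sketch with Hugging Face transformers follows. It assumes a transformers release with built-in AWQ support (newer than the 4.31.0 recorded in the config) and the autoawq package installed; the prompt and generation settings are illustrative.

```python
# Minimal loading sketch for the AWQ checkpoint listed above.
# Assumptions: a recent transformers release with built-in AWQ support and the
# `autoawq` package installed; the listing reports ~18.3 GB of required VRAM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/WizardCoder-Python-34B-V1.0-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place the 4-bit AWQ weights on the available GPU(s)
    low_cpu_mem_usage=True,
)

# Alpaca-style prompt, as described in the Input Output section above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that reverses a string.\n\n### Response:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For high-throughput serving, vLLM can load the same repository by passing quantization="awq" when constructing its LLM engine.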

Best Alternatives to WizardCoder Python 34B V1.0 AWQ

Best Alternatives | Context / RAM | Downloads | Likes
Phind CodeLlama 34B V2 AWQ | 16K / 18.3 GB | 181 | 32
...echless Codellama 34B V2.0 AWQ | 16K / 18.3 GB | 24 | 6
...enbuddy Coder 34B V11 Bf16 AWQ | 16K / 18.5 GB | 32 | 1
CodeLlama 34B Instruct AWQ | 16K / 18.3 GB | 46 | 2
Synthia 34B V1.2 AWQ | 16K / 18.3 GB | 51 | 0
...nd CodeLlama 34B Python V1 AWQ | 16K / 18.3 GB | 27 | 2
MAmmoTH Coder 34B AWQ | 16K / 18.3 GB | 21 | 1
CodeFuse CodeLlama 34B AWQ | 16K / 18.3 GB | 24 | 2
CodeLlama 34B AWQ | 16K / 18.3 GB | 23 | 2
CodeLlama 34B Python AWQ | 16K / 18.3 GB | 22 | 2
Note: a green score (e.g. "73.2") indicates that the model performs better than TheBloke/WizardCoder-Python-34B-V1.0-AWQ.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217