WizardLM 1.0 Uncensored CodeLlama 34B AWQ by TheBloke


Tags: Merged Model, 4-bit, Autotrain compatible, AWQ, Base model: cognitivecomputatio..., Base model (quantized): cognitive..., Codegen, Dataset: ehartford/wizardlm evo..., En, Instruct, Llama, Quantized, Region: us, Safetensors, Sharded, Tensorflow, Uncensored

WizardLM 1.0 Uncensored CodeLlama 34B AWQ Benchmarks

Scores shown as nn.n% indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
WizardLM 1.0 Uncensored CodeLlama 34B AWQ (TheBloke/WizardLM-1.0-Uncensored-CodeLlama-34B-AWQ)

WizardLM 1.0 Uncensored CodeLlama 34B AWQ Parameters and Internals

Model Type: llama
Additional Notes: This model is retrained on a filtered dataset to reduce refusals, avoidance, and bias, and it uses Vicuna-1.1 style prompts. It retains the coding abilities of CodeLlama-34b.
Input Format:
You are a helpful AI assistant. USER: {prompt} ASSISTANT:
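
As a small illustration (not part of the original model card), the sketch below shows how a user message might be wrapped in this Vicuna-1.1 style template before generation; the helper name and example message are assumptions.

    # Sketch: wrap a user message in the Vicuna-1.1 style template above.
    # The helper name and example message are illustrative, not from the card.
    def build_prompt(user_message: str) -> str:
        system = "You are a helpful AI assistant."
        return f"{system} USER: {user_message} ASSISTANT:"

    print(build_prompt("Write a Python function that reverses a string."))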
LLM Name: WizardLM 1.0 Uncensored CodeLlama 34B AWQ
Repository (🤗): https://huggingface.co/TheBloke/WizardLM-1.0-Uncensored-CodeLlama-34B-AWQ
Model Name: WizardLM 1.0 Uncensored CodeLlama 34B
Model Creator: Eric Hartford
Base Model(s): ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
Merged Model: Yes
Model Size: 34b
Required VRAM: 18.3 GB
Updated: 2024-12-22
Maintainer: TheBloke
Model Type: llama
Instruction-Based: Yes
Model Files: 9.9 GB (1-of-2), 8.4 GB (2-of-2)
Supported Languages: en
AWQ Quantization: Yes
Quantization Type: awq
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.31.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
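
For orientation only, here is a hedged sketch of loading the 4-bit AWQ checkpoint listed above with Hugging Face transformers and generating from the prompt template; it assumes a recent transformers release with AWQ support, the autoawq package, and a GPU with roughly the listed 18.3 GB of free VRAM. None of this code comes from the original model card.

    # Sketch, not from the model card: load the AWQ checkpoint with
    # Hugging Face transformers (assumes a transformers version with AWQ
    # support and the autoawq package installed).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "TheBloke/WizardLM-1.0-Uncensored-CodeLlama-34B-AWQ"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,   # matches the listed torch dtype
        device_map="auto",           # place the 4-bit AWQ weights on GPU
    )

    # Vicuna-1.1 style prompt format from the model card.
    prompt = ("You are a helpful AI assistant. "
              "USER: Write a Python function that reverses a string. ASSISTANT:")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))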

Best Alternatives to WizardLM 1.0 Uncensored CodeLlama 34B AWQ

Best Alternatives                   | Context / RAM  | Downloads / Likes
...echless Codellama 34B V2.0 AWQ   | 16K / 18.3 GB  | 246
CodeLlama 34B Instruct AWQ          | 16K / 18.3 GB  | 462
CodeLlama 34B Instruct Fp16         | 16K / 67.5 GB  | 19847
...M 1.0 Uncensored CodeLlama 34B   | 16K / 67.5 GB  | 2127
... Uncensored CodeLlama 34B GPTQ   | 16K / 17.7 GB  | 317
...gpt 32K Codellama 34B Instruct   | 32K / 67.5 GB  | 4812
CodeLlama 34B Instruct Hf           | 16K / 67.5 GB  | 45712280
Speechless Codellama 34B V2.0       | 16K / 67.5 GB  | 116617
CodeLlama 34B Instruct Hf           | 16K / 67.5 GB  | 212214
Speechless Codellama 34B V1.9       | 16K / 67.5 GB  | 11660
Note: a green score (e.g. "73.2") means that the model is better than TheBloke/WizardLM-1.0-Uncensored-CodeLlama-34B-AWQ.

Rank the WizardLM 1.0 Uncensored CodeLlama 34B AWQ Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217