Codellama CodeLlama 13B Python Hf W4 G128 AWQ by abhinavkulkarni


Tags: Arxiv:2308.12950, Autotrain compatible, Awq, Code, Codegen, Llama, Llama2, Pytorch, Quantized, Region:us, Sharded

Codellama CodeLlama 13B Python Hf W4 G128 AWQ Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Codellama CodeLlama 13B Python Hf W4 G128 AWQ (abhinavkulkarni/codellama-CodeLlama-13b-Python-hf-w4-g128-awq)

Codellama CodeLlama 13B Python Hf W4 G128 AWQ Parameters and Internals

Model Type: generative, text models
Training Details (Methodology): AWQ quantization
Input/Output (Accepted Modalities): text
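The "W4 G128" in the model name denotes 4-bit weights quantized with one scale and zero-point per group of 128 values. The sketch below is purely illustrative (the function name is made up for this example) and shows plain grouped 4-bit quantization; AWQ's activation-aware per-channel scaling, which this checkpoint additionally applies before quantizing, is omitted for brevity.

```python
import torch

def quantize_w4_g128(weight: torch.Tensor, group_size: int = 128):
    """Asymmetric 4-bit quantization with one scale/zero-point per group of 128 weights."""
    rows, cols = weight.shape
    w = weight.reshape(rows, cols // group_size, group_size)
    w_min = w.amin(dim=-1, keepdim=True)
    w_max = w.amax(dim=-1, keepdim=True)
    scale = (w_max - w_min).clamp(min=1e-8) / 15   # 4 bits -> 16 levels (0..15)
    zero = (-w_min / scale).round()
    q = (w / scale + zero).round().clamp(0, 15)    # stored int4 codes
    dequant = (q - zero) * scale                   # what a dequantizing kernel reconstructs at runtime
    return q.to(torch.uint8), scale, zero, dequant.reshape(rows, cols)

w = torch.randn(32, 256)                           # toy weight matrix (columns divisible by 128)
codes, scale, zero, w_hat = quantize_w4_g128(w)
print("max abs reconstruction error:", (w - w_hat).abs().max().item())
```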
LLM Name: Codellama CodeLlama 13B Python Hf W4 G128 AWQ
Repository: 🤗 https://huggingface.co/abhinavkulkarni/codellama-CodeLlama-13b-Python-hf-w4-g128-awq
Model Size: 13b
Required VRAM: 7.2 GB
Updated: 2024-12-22
Maintainer: abhinavkulkarni
Model Type: llama
Model Files: 3.2 GB (1-of-3), 3.2 GB (2-of-3), 0.8 GB (3-of-3)
Supported Languages: code
AWQ Quantization: Yes
Quantization Type: awq
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.33.1
Tokenizer Class: CodeLlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
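A minimal usage sketch based on the metadata above, assuming the checkpoint loads through the AutoAWQ package alongside transformers and that roughly 7.2 GB of free GPU VRAM is available; the maintainer's own loading instructions (these checkpoints may expect a specific AWQ loader such as llm-awq) take precedence.

```python
# Hedged sketch: loading the 4-bit AWQ checkpoint and completing Python code.
# Assumes the `autoawq` package is installed; the maintainer's recommended
# loading path may differ from this.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_id = "abhinavkulkarni/codellama-CodeLlama-13b-Python-hf-w4-g128-awq"

tokenizer = AutoTokenizer.from_pretrained(model_id)   # CodeLlamaTokenizer, 32000-token vocab
model = AutoAWQForCausalLM.from_quantized(model_id, fuse_layers=True)

# Plain completion prompt; the Python base model is not instruction-tuned,
# so no chat template is applied. Context window is 16384 tokens.
prompt = "def quicksort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```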

Best Alternatives to Codellama CodeLlama 13B Python Hf W4 G128 AWQ

Best Alternatives                     Context / RAM   Downloads   Likes
...th CodeLlama 13B Python Hf AWQ     16K / 7.5 GB    7           0
Ramgpt 13B AWQ Gemm                   16K / 7.2 GB    0           1
NexusRaven 13B AWQ                    16K / 7.2 GB    35          4
CodeLlama 13B Instruct AWQ            16K / 7.2 GB    63          9
MAmmoTH Coder 13B AWQ                 16K / 7.2 GB    37          1
CodeLlama 13B AWQ                     16K / 7.2 GB    62          4
...odeLlama 13B Oasst Sft V10 AWQ     16K / 7.2 GB    25          1
CodeLlama 13B Python AWQ              16K / 7.2 GB    25          2
...ma 13B Instruct Hf W4 G128 AWQ     16K / 7.2 GB    33          0
WhiteRabbitNeo 13B V1                 16K / 26 GB     1767        404
Note: a green score (e.g. "73.2") means the model is better than abhinavkulkarni/codellama-CodeLlama-13b-Python-hf-w4-g128-awq.

Rank the Codellama CodeLlama 13B Python Hf W4 G128 AWQ Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217