CodeQwen1.5 7B EXL2 8.0bpw by Dracones


Tags: 8-bit · autotrain-compatible · base model: quantized: qwen/code... · base model: qwen/codeqwen1.5-7b · conversational · en · endpoints-compatible · exl2 · pretrained · quantized · qwen2 · region: us · safetensors

CodeQwen1.5 7B EXL2 8.0bpw Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
CodeQwen1.5 7B EXL2 8.0bpw (Dracones/CodeQwen1.5-7B_exl2_8.0bpw)

CodeQwen1.5 7B EXL2 8.0bpw Parameters and Internals

Additional Notes 
These quants were made with exllamav2 version 0.0.18. Quants made on this version of EXL2 may not work on older versions of the exllamav2 library. If you have problems loading these models, please update Text Generation WebUI to the latest version.
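The compatibility note above can be checked before loading: compare the installed exllamav2 version against the version the quants were made with. A minimal sketch (the helper names are illustrative, not part of the exllamav2 API; it assumes plain numeric dotted versions):

```python
def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like "0.0.18" into a comparable tuple.

    Handles plain numeric versions only (no pre-release suffixes).
    """
    return tuple(int(part) for part in v.split("."))

def is_compatible(installed: str, required: str = "0.0.18") -> bool:
    """True if the installed exllamav2 version is at least the quant's version."""
    return version_tuple(installed) >= version_tuple(required)

print(is_compatible("0.0.17"))  # False — older library may fail to load these quants
print(is_compatible("0.1.0"))   # True
```

In practice the installed version would come from `exllamav2.__version__` (or the package metadata); tuple comparison avoids the string-comparison pitfall where "0.0.9" > "0.0.18".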
LLM Name: CodeQwen1.5 7B EXL2 8.0bpw
Repository: https://huggingface.co/Dracones/CodeQwen1.5-7B_exl2_8.0bpw
Base Model(s): Qwen/CodeQwen1.5-7B
Model Size: 7B
Required VRAM: 7.5 GB
Updated: 2025-02-22
Maintainer: Dracones
Model Type: qwen2
Model Files: 7.5 GB
Supported Languages: en
Quantization Type: exl2
Model Architecture: Qwen2ForCausalLM
License: other
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.39.3
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <fim_pad>
Vocabulary Size: 92416
Torch Data Type: bfloat16

Best Alternatives to CodeQwen1.5 7B EXL2 8.0bpw

Best Alternatives                      Context / RAM     Downloads/Likes
Qwen2.5 7B Instruct 1M 4bit            986K / 4.3 GB     7306
...B Instruct 1M Unsloth Bnb 4bit      986K / 7.5 GB     3541
...still Qwen 7B Unsloth Bnb 4bit      128K / 8.5 GB     529809
...ek R1 Distill Qwen 7B Bnb 4bit      128K / 5.5 GB     26823
Mini Pathfinder                        128K / 15.2 GB    150
CogitoDistil                           128K / 15.2 GB    280
A1 V002                                128K / 15.2 GB    300
A1 V0.0.1                              128K / 15.2 GB    70
Qwen2.5 7B Bnb 4bit                    128K / 5.5 GB     191674
Atlas Flash 7B Preview                 128K / 15.2 GB    1173
Note: a green score (e.g. "73.2") means that model outperforms Dracones/CodeQwen1.5-7B_exl2_8.0bpw.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227