WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B by win10

Tags: Merged Model · Autotrain compatible · Base model:finetune:whiterabbi... · Base model:whiterabbitneo/whit... · Codegen · Conversational · Endpoints compatible · Qwen2 · Region:us · Safetensors · Sharded · Tensorflow
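
The checkpoint is distributed as sharded safetensors files, so it can be pre-fetched with the Hugging Face Hub client before loading. A minimal sketch, assuming the huggingface_hub package is installed; the repo id comes from the listing below, and the local target directory is an arbitrary example:

# Pre-download the sharded safetensors checkpoint (plus config/tokenizer files).
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="win10/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-12.3B",
    local_dir="./whiterabbitneo-12.3b",  # hypothetical example path
    allow_patterns=["*.safetensors", "*.json", "*.txt"],
)
print(f"Checkpoint downloaded to {local_path}")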

WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B Benchmarks

nn.n% — benchmark scores indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B (win10/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-12.3B)

WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B Parameters and Internals

LLM Name: WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B
Repository: https://huggingface.co/win10/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-12.3B
Base Model(s): WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
Merged Model: Yes
Model Size: 7b
Required VRAM: 21.5 GB
Updated: 2025-03-24
Maintainer: win10
Model Type: qwen2
Model Files: 1.1 GB: 1-of-51, 1.1 GB: 2-of-51, 0.5 GB: 3-of-51, 0.5 GB: 4-of-51, 0.4 GB: 5-of-51, 0.5 GB: 6-of-51, 0.4 GB: 7-of-51, 0.5 GB: 8-of-51, 0.4 GB: 9-of-51, 0.5 GB: 10-of-51, 0.4 GB: 11-of-51, 0.5 GB: 12-of-51, 0.4 GB: 13-of-51, 0.5 GB: 14-of-51, 0.4 GB: 15-of-51, 0.5 GB: 16-of-51, 0.4 GB: 17-of-51, 0.5 GB: 18-of-51, 0.4 GB: 19-of-51, 0.5 GB: 20-of-51, 0.4 GB: 21-of-51, 0.5 GB: 22-of-51, 0.4 GB: 23-of-51, 0.5 GB: 24-of-51, 0.5 GB: 25-of-51, 0.4 GB: 26-of-51, 0.5 GB: 27-of-51, 0.4 GB: 28-of-51, 0.5 GB: 29-of-51, 0.4 GB: 30-of-51, 0.5 GB: 31-of-51, 0.4 GB: 32-of-51, 0.5 GB: 33-of-51, 0.5 GB: 34-of-51, 0.5 GB: 35-of-51, 0.5 GB: 36-of-51, 0.5 GB: 37-of-51, 0.5 GB: 38-of-51, 0.4 GB: 39-of-51, 0.5 GB: 40-of-51, 0.4 GB: 41-of-51, 0.5 GB: 42-of-51, 0.4 GB: 43-of-51, 0.5 GB: 44-of-51
Generates Code: Yes
Model Architecture: Qwen2ForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.45.2
Vocabulary Size: 152064
Torch Data Type: bfloat16
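
With the Qwen2ForCausalLM architecture, bfloat16 weights, and 32768-token context listed above, the checkpoint can be loaded directly with the Transformers library (the listing reports version 4.45.2). A minimal sketch, assuming enough GPU memory for the roughly 21.5 GB of weights and that the tokenizer ships a chat template; the prompt is an arbitrary example:

# Load the merged coder model and run a short code-generation prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "win10/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-12.3B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # place the sharded weights on available devices
)

# Chat-style prompt; the model is tagged Conversational / Codegen.
messages = [{"role": "user", "content": "Write a Python function that reverses a linked list."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))

Note that device_map="auto" relies on the accelerate package; if the whole checkpoint fits on one GPU, you can drop it and call model.to("cuda") instead.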

Best Alternatives to WhiteRabbitNeo 2.5 Qwen 2.5 Coder 12.3B

Best Alternatives | Context / RAM | Downloads / Likes
Qwen2.5 7B Rebase | 986K / 15.2 GB | 112
SakalFusion 7B Beta | 986K / 15.2 GB | 90
Qwen2.5 7B Coder Codeio Pp | 128K / 15.2 GB | 565
...R1 Distill Qwen MFANN Slerp 7B | 128K / 15.2 GB | 130
Qwen2.5 7B CySecButler V0.1 | 128K / 15.2 GB | 1123
CoT 2.5 | 128K / 15.2 GB | 390
Mergekit Ties Uqhfast | 128K / 15.2 GB | 260
CoT 2.5 | 128K / 15.2 GB | 260
Mergekit Ties Uqhfast | 128K / 15.2 GB | 130
StockQwen 2.5 7B | 128K / 15.2 GB | 163
Note: a green score (e.g. "73.2") means that the model is better than win10/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-12.3B.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227