LLM Explorer: A Curated Large Language Model Directory and Analytics

Rose 20B 3bpw EXL2 by Kooten

Which open-source LLM or SLM are you looking for? 18,857 models in total.

» All LLMs » Kooten » Rose 20B 3bpw EXL2

Tags: Autotrain compatible · Endpoints compatible · Exl2 · License: cc-by-nc-4.0 · Llama · Quantized · Region: us · Safetensors

Rank the Rose 20B 3bpw EXL2 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Rose 20B 3bpw EXL2 (Kooten/Rose-20B-3bpw-exl2)

Best Alternatives to Rose 20B 3bpw EXL2

Best Alternatives               | HF Rank | Context / Size | Downloads | Likes
Deita 20B                       | 71.43   | 2K / 39.8 GB   | 1070      | 0
Internlm2 20B Llama             | 70.61   | 32K / 39.6 GB  | 2174      | 13
Internlm 20B Llama              | 65.09   | 4K / 40.3 GB   | 3167      | 0
Internlm2 Base 20B Llama        | 62.69   | 32K / 39.6 GB  | 1309      | 2
Internlm2 Base 20B Llama        | 62.69   | 32K / 39.6 GB  | 1190      | 0
Internlm2 Chat 20B Llama        | 62.56   | 32K / 39.6 GB  | 2097      | 3
Iambe 20B DARE V2               | 61.99   | 4K / 39.9 GB   | 2353      | 6
Deacon 20B                      | 61.28   | 4K / 40.2 GB   | 2677      | 0
Stellaris Internlm2 20B R512    | 60.46   | 32K / 39.8 GB  | 1824      | 2
MXLewd L2 20B                   | 57.43   | 4K / 40.7 GB   | 2394      | 14
Note: a green score (e.g. "73.2") means the model ranks higher than Kooten/Rose-20B-3bpw-exl2.

Rose 20B 3bpw EXL2 Parameters and Internals

LLM Name: Rose 20B 3bpw EXL2
Repository: Open on 🤗
Model Size: 20b
Required VRAM: 7.9 GB
Model Type: llama
Model Files: 7.9 GB
Quantization Type: exl2
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.36.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: float16
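As a sanity check on the figures above: a 3.0 bits-per-weight (bpw) EXL2 quantization of a nominal 20-billion-parameter model stores roughly 20e9 × 3 / 8 bytes of weights. A minimal sketch of that arithmetic (the 20e9 parameter count is a rounded assumption; real EXL2 files add overhead for embeddings, scales, and metadata, which is why the listed size is 7.9 GB rather than this lower bound):

```python
def exl2_weight_size_gb(n_params: float, bpw: float) -> float:
    """Lower-bound storage in GB for n_params weights at bpw bits each."""
    return n_params * bpw / 8 / 1e9

# Nominal 20B parameters at 3.0 bpw: ~7.5 GB of raw weight data,
# consistent with the 7.9 GB file size listed above once overhead is added.
print(round(exl2_weight_size_gb(20e9, 3.0), 2))
```

The same formula explains the ~39.6–40.7 GB sizes of the unquantized alternatives above: 20e9 parameters at 16-bit (float16) precision is about 40 GB.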
Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024022003