C4ai Command R Plus EXL2 5.5bpw by Dracones


Tags: Ar, Autotrain compatible, Cohere, De, En, Endpoints compatible, Es, Exl2, Fr, It, Ja, Ko, Pt, Quantized, Region:us, Safetensors, Sharded, Tensorflow, Zh

C4ai Command R Plus EXL2 5.5bpw Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
C4ai Command R Plus EXL2 5.5bpw (Dracones/c4ai-command-r-plus_exl2_5.5bpw)

C4ai Command R Plus EXL2 5.5bpw Parameters and Internals

Model Type: text generation
Additional Notes: Quants made on a specific version of EXL2 may not work on older versions of the exllamav2 library.
Supported Languages: en (English), fr (French), de (German), es (Spanish), it (Italian), pt (Portuguese), ja (Japanese), ko (Korean), zh (Chinese), ar (Arabic)
Input/Output
Accepted Modalities: text
Performance Tips: If you encounter issues loading the model, update Text Generation WebUI to the latest version.
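
As a rough illustration of the exllamav2 version note above, a minimal Python loading sketch might look like the following. The model_dir path is a placeholder, and the class and method names assume a reasonably recent exllamav2 release, so check the examples shipped with your installed version if anything differs.

    # Minimal sketch: loading an EXL2 quant with the exllamav2 Python API.
    # Assumes a recent exllamav2 release; the model_dir path is a placeholder.
    from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
    from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

    config = ExLlamaV2Config()
    config.model_dir = "/models/c4ai-command-r-plus_exl2_5.5bpw"  # placeholder path
    config.prepare()
    # config.max_seq_len defaults to the model's 8192-token context window.

    model = ExLlamaV2(config)
    cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so autosplit can place layers
    model.load_autosplit(cache)               # spreads the ~80 GB of weights across available GPUs

    tokenizer = ExLlamaV2Tokenizer(config)
    generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

    settings = ExLlamaV2Sampler.Settings()
    settings.temperature = 0.7

    print(generator.generate_simple("Write a haiku about quantization.", settings, 128))

Text Generation WebUI's ExLlamav2 loaders wrap this same library, which is why updating the WebUI typically pulls in an exllamav2 build that matches newer quants.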
LLM Name: C4ai Command R Plus EXL2 5.5bpw
Repository: https://huggingface.co/Dracones/c4ai-command-r-plus_exl2_5.5bpw
Required VRAM: 80.2 GB
Updated: 2025-02-22
Maintainer: Dracones
Model Type: cohere
Model Files: 8.6 GB (1-of-10), 8.6 GB (2-of-10), 8.5 GB (3-of-10), 8.4 GB (4-of-10), 8.4 GB (5-of-10), 8.4 GB (6-of-10), 8.4 GB (7-of-10), 8.3 GB (8-of-10), 8.6 GB (9-of-10), 4.0 GB (10-of-10)
Supported Languages: en, fr, de, es, it, pt, ja, ko, zh, ar
Quantization Type: exl2
Model Architecture: CohereForCausalLM
License: cc-by-nc-4.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.0.dev0
Tokenizer Class: CohereTokenizerFast
Padding Token: <PAD>
Vocabulary Size: 256000
Torch Data Type: float16
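
The ten shards listed above sum to the 80.2 GB Required VRAM figure (weights only; the KV cache needs additional memory on top of that). A minimal sketch for fetching the full sharded snapshot with huggingface_hub, with the local directory as a placeholder:

    # Minimal sketch: downloading the sharded repository with huggingface_hub.
    # local_dir is a placeholder; point it at the directory your loader will read.
    from huggingface_hub import snapshot_download

    snapshot_download(
        repo_id="Dracones/c4ai-command-r-plus_exl2_5.5bpw",
        local_dir="/models/c4ai-command-r-plus_exl2_5.5bpw",
    )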

Best Alternatives to C4ai Command R Plus EXL2 5.5bpw

Best Alternatives | Context / RAM | Downloads / Likes
...ai Command R Plus EXL2 2.75bpw | 8K / 44.6 GB | 113
...ai Command R Plus EXL2 2.25bpw | 8K / 38 GB | 121
...4ai Command R Plus EXL2 8.0bpw | 8K / 109.8 GB | 52
C4ai Command R V01 EXL2 6.0bpw | 8K / 32.1 GB | 143
...4ai Command R Plus EXL2 6.5bpw | 8K / 93.1 GB | 50
...ommand R V01 EXL2 3.5bpw Rpcal | 8K / 21.2 GB | 72
C4ai Command R V01 EXL2 5.0bpw | 8K / 27.8 GB | 101
C4ai Command R V01 EXL2 3.5bpw | 8K / 21.1 GB | 61
...4ai Command R Plus 8.0bpw EXL2 | 8K / 110 GB | 31
...4ai Command R Plus 6.0bpw EXL2 | 8K / 86.8 GB | 51
Note: a green score (e.g. "73.2") means the model is better than Dracones/c4ai-command-r-plus_exl2_5.5bpw.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227