CohereForAI Aya 23 35B 5 0bpw EXL2 by Zoyd


Tags: 5-bit · autotrain compatible · cohere · conversational · endpoints compatible · exl2 · quantized · region:us · safetensors · sharded · tensorflow
Language tags: ar, cs, de, el, en, es, fa, fr, he, hi, id, it, ja, ko, nl, pl, pt, ro, ru, tr, uk, vi, zh

CohereForAI Aya 23 35B 5 0bpw EXL2 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Model evaluated: CohereForAI Aya 23 35B 5 0bpw EXL2 (Zoyd/CohereForAI_aya-23-35B-5_0bpw_exl2)

CohereForAI Aya 23 35B 5 0bpw EXL2 Parameters and Internals

Model Type: auto-regressive, multilingual
Use Cases
  Areas: multilingual research, natural language processing
  Applications: text generation, instruction following
  Primary Use Cases: multilingual communication applications
  Considerations: use is restricted by the license conditions
Additional Notes: optimized for multilingual capabilities across 23 languages
Supported Languages: Arabic, Chinese, Czech, Dutch, English, French, German, Greek, Hebrew, Hindi, Indonesian, Italian, Japanese, Korean, Persian, Polish, Portuguese, Romanian, Russian, Spanish, Turkish, Ukrainian, Vietnamese
Training Details
  Data Sources: Command family models, Aya Collection
  Methodology: instruction fine-tuned
  Context Length: 8192
  Model Architecture: optimized transformer
Input / Output
  Input Format: text
  Accepted Modalities: text
  Output Format: text
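The card lists text-only input and output with an 8K context and the Cohere tokenizer. A minimal sketch (untested) of building an Aya 23 prompt with Hugging Face transformers; the repo id is taken from this card, and apply_chat_template inserts the Cohere turn tokens:

# Minimal sketch, assuming transformers >= 4.41 and access to the tokenizer in this repo
# (the base CohereForAI/aya-23-35B tokenizer should behave the same way).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Zoyd/CohereForAI_aya-23-35B-5_0bpw_exl2")

messages = [
    {"role": "user", "content": "Translate to French: The weather is nice today."},
]

# Wraps the turn in <|START_OF_TURN_TOKEN|><|USER_TOKEN|> ... <|END_OF_TURN_TOKEN|>
# and appends the <|CHATBOT_TOKEN|> generation prompt.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)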
LLM Name: CohereForAI Aya 23 35B 5 0bpw EXL2
Repository (🤗): https://huggingface.co/Zoyd/CohereForAI_aya-23-35B-5_0bpw_exl2
Model Size: 35B
Required VRAM: 27.7 GB
Updated: 2025-02-22
Maintainer: Zoyd
Model Type: cohere
Model Files: 8.6 GB (1-of-4), 8.5 GB (2-of-4), 8.6 GB (3-of-4), 2.0 GB (4-of-4)
Supported Languages: en fr de es it pt ja ko zh ar el fa pl id cs he hi nl ro ru tr uk vi
Quantization Type: exl2
Model Architecture: CohereForCausalLM
License: cc-by-nc-4.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.41.0.dev0
Tokenizer Class: CohereTokenizer
Padding Token: <PAD>
Vocabulary Size: 256000
Torch Data Type: float16
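
A minimal loading sketch for this EXL2 quant with the exllamav2 library (untested; the local directory, sampling settings, and prompt are placeholders, and the ~27.7 GB of weights plus KV cache must fit across the available GPUs):

# Minimal sketch, assuming exllamav2 is installed and the four safetensors shards
# from Zoyd/CohereForAI_aya-23-35B-5_0bpw_exl2 have been downloaded locally.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "./CohereForAI_aya-23-35B-5_0bpw_exl2"  # hypothetical local path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # 8192-token context by default
model.load_autosplit(cache)                # spread layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7

# Cohere-style chat prompt (same format the tokenizer's chat template produces)
prompt = (
    "<|START_OF_TURN_TOKEN|><|USER_TOKEN|>Hola, ¿cómo estás?"
    "<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>"
)
print(generator.generate_simple(prompt, settings, 200))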

Best Alternatives to CohereForAI Aya 23 35B 5 0bpw EXL2

Best Alternatives | Context / RAM | Downloads | Likes
Aya 23 35B 8bit | 8K / 36.9 GB | 67 | 2
Aya 23 35B 4bit | 8K / 19.6 GB | 11 | 1
Aya 23 35B 8.0bpw H8 EXL2 | 8K / 39.2 GB | 6 | 2
...reForAI Aya 23 35B 4 0bpw EXL2 | 8K / 23.3 GB | 5 | 1
Aya 23 35B 4.0bpw H6 EXL2 | 8K / 23.4 GB | 5 | 1
Aya 23 35B 5.0bpw H6 EXL2 | 8K / 27.8 GB | 5 | 1
Aya 23 35B 6.0bpw H6 EXL2 | 8K / 32.1 GB | 6 | 0
Aya 23 35B 3.0bpw H6 EXL2 | 8K / 19 GB | 5 | 0
Aya 23 35B 4.65bpw H6 EXL2 | 8K / 26.1 GB | 5 | 0
C4ai Command R V01 8bit | 8K / 36.8 GB | 6 | 1
Note: a green score (e.g. "73.2") means the listed alternative is better than Zoyd/CohereForAI_aya-23-35B-5_0bpw_exl2.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227