ChatAllInOne Yi 34B 200K V1 GGUF by second-state


Tags: Autotrain compatible · Base model:drnicefellow/chatal... · Base model:quantized:drnicefel... · Endpoints compatible · Gguf · Llama · Q2 · Quantized · Region:us

ChatAllInOne Yi 34B 200K V1 GGUF Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

ChatAllInOne Yi 34B 200K V1 GGUF Parameters and Internals

Model Type: text-generation
Additional Notes: The model is quantized by Second State Inc. using various GGUF quantization methods, each offering a different tradeoff between model size and quality loss.
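As a rough illustration of that tradeoff, the helper below (hypothetical, not part of the repository) picks the largest quant file that fits a given memory budget, using the quant file sizes listed for this repository:

```python
# Hypothetical helper: choose the largest GGUF quant file that fits a memory budget.
# Sizes (GB) are the quant file sizes listed for this repository.
FILE_SIZES_GB = [12.8, 18.1, 16.7, 15.0, 19.5, 20.7,
                 19.6, 23.7, 24.3, 23.7, 28.2, 36.5]

def best_fit(budget_gb, sizes=FILE_SIZES_GB):
    """Return the largest file size (GB) that fits the budget, or None."""
    fitting = [s for s in sizes if s <= budget_gb]
    return max(fitting) if fitting else None

# Larger files generally mean less quality loss, so pick the biggest that fits.
print(best_fit(16))   # 15.0 — the largest quant under a 16 GB budget
print(best_fit(10))   # None — even the smallest quant (12.8 GB) is too large
```

The 12.8 GB figure quoted as "Required VRAM" below corresponds to the smallest (Q2) quant; the larger files trade more memory for lower quality loss.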
LLM Name: ChatAllInOne Yi 34B 200K V1 GGUF
Repository: 🤗 https://huggingface.co/second-state/ChatAllInOne-Yi-34B-200K-V1-GGUF
Model Name: ChatAllInOne-Yi-34B-200K-V1
Model Creator: DrNicefellow
Base Model(s): ChatAllInOne Yi 34B 200K V1 (DrNicefellow/ChatAllInOne-Yi-34B-200K-V1)
Model Size: 34B
Required VRAM: 12.8 GB
Updated: 2024-11-21
Maintainer: second-state
Model Type: llama
Model Files: 12.8 GB, 18.1 GB, 16.7 GB, 15.0 GB, 19.5 GB, 20.7 GB, 19.6 GB, 23.7 GB, 24.3 GB, 23.7 GB, 28.2 GB, 36.5 GB
GGUF Quantization: Yes
Quantization Types: gguf | q2 | q4_k | q5_k
Model Architecture: LlamaForCausalLM
License: other
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.37.0
Vocabulary Size: 64000
Torch Data Type: bfloat16
ChatAllInOne Yi 34B 200K V1 GGUF (second-state/ChatAllInOne-Yi-34B-200K-V1-GGUF)
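At the 200K context length listed above, the KV cache rather than the weights can dominate memory use. The back-of-the-envelope sketch below shows how it grows linearly with context; the Yi-34B shape parameters used (60 layers, 8 KV heads via GQA, head dimension 128) are assumptions for illustration, not stated on this page:

```python
def kv_cache_gb(ctx_len, n_layers=60, n_kv_heads=8, head_dim=128, dtype_bytes=2):
    """Estimate KV-cache size: 2 tensors (K and V) per layer, fp16 by default.

    The Yi-34B shape parameters (60 layers, 8 KV heads via GQA, head dim 128)
    are assumptions for illustration.
    """
    return 2 * n_layers * n_kv_heads * head_dim * ctx_len * dtype_bytes / 1e9

print(f"{kv_cache_gb(200_000):.1f} GB")  # ~49.2 GB at the full 200K context
print(f"{kv_cache_gb(16_000):.1f} GB")   # ~3.9 GB at a 16K context
```

In other words, filling the full 200K window in fp16 would need far more memory than the quantized weights themselves; runtimes that quantize the KV cache reduce this proportionally.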

Best Alternatives to ChatAllInOne Yi 34B 200K V1 GGUF

Best Alternatives                      Context / RAM     Downloads  Likes
...Yi Ties 34B V1.0 MLX Q8 0.gguf      195K / 36.5 GB    7          0
Dolphin 2.2 Yi 34B GGUF                16K / 12.8 GB     101        1
Yi 1.5 34B Chat GGUF                   4K / 8.9 GB       320        5
...ionStar Yi 34B Chat Llama GGUF      4K / 12.8 GB      312        2
Yi 34B Chat GGUF                       4K / 12.8 GB      206        3
...mantha 1.11 CodeLlama 34B GGUF      2K / 12.5 GB      125        2
Yi 34B 200K RPMerge                    195K / 68.9 GB    5466       0
...34B 200K Aezakmi Raw 1902 EXL2      195K / 20.7 GB    17         1
Yi 34B 200K MAR2024 EXL2 4bpw          195K / 18 GB      5          1
...mi Raw Toxic 2702 4.65bpw EXL2      195K / 20.8 GB    15         2
Note: a green score (e.g. "73.2") means that the model outperforms second-state/ChatAllInOne-Yi-34B-200K-V1-GGUF.

Rank the ChatAllInOne Yi 34B 200K V1 GGUF Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 38,149 are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241110