Openbuddy Yi1.5 34B V21.6 32K Fp16 by OpenBuddy



Openbuddy Yi1.5 34B V21.6 32K Fp16 Benchmarks

Openbuddy Yi1.5 34B V21.6 32K Fp16 (OpenBuddy/openbuddy-yi1.5-34b-v21.6-32k-fp16)

Openbuddy Yi1.5 34B V21.6 32K Fp16 Parameters and Internals

Model Type: text-generation
Use Cases:
Areas: research, chatbot development
Applications: multilingual chat applications
Primary Use Cases: multilingual text generation, conversational agents
Limitations: not suitable for critical or high-risk situations; may produce undesirable outputs
Considerations: users should avoid this model in scenarios where errors could lead to significant harm.
Additional Notes: The prompt format is defined in tokenizer_config.json. The model can be deployed as an OpenAI-compatible API service using vLLM.
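A minimal launch sketch for the vLLM deployment mentioned above. This assumes vLLM is installed and that the machine has enough GPU memory for the fp16 weights (about 69 GB); port and flag values are illustrative:

```shell
# Sketch: serve an OpenAI-compatible API with vLLM (illustrative flags).
python -m vllm.entrypoints.openai.api_server \
  --model OpenBuddy/openbuddy-yi1.5-34b-v21.6-32k-fp16 \
  --dtype float16 \
  --port 8000
```

Once running, any OpenAI-compatible client can point at http://localhost:8000/v1.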
Supported Languages: zh, en, fr, de, ja, ko, it, ru, fi (full proficiency in each)
Input Output:
Input Format: prompt template built from special tokens such as <|role|>, <|says|>, and <|end|>
Accepted Modalities: text
Output Format: generated text
Performance Tips: use the fast tokenizer from transformers (AutoTokenizer with use_fast=True).
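To make the special-token prompt format concrete, here is a small sketch of a prompt builder. The exact turn layout is an assumption inferred from the token names; the authoritative template is the chat_template in the model's tokenizer_config.json, so prefer tokenizer.apply_chat_template in real use:

```python
def build_prompt(messages, add_generation_prompt=True):
    """Render chat messages with the <|role|>/<|says|>/<|end|> special tokens.

    Illustrative only: assumes each turn is rendered as
    <|role|>{role}<|says|>{content}<|end|>; consult the model's
    tokenizer_config.json (chat_template) for the authoritative format.
    """
    parts = [f"<|role|>{m['role']}<|says|>{m['content']}<|end|>" for m in messages]
    if add_generation_prompt:
        # Open an assistant turn for the model to complete.
        parts.append("<|role|>assistant<|says|>")
    return "\n".join(parts)

prompt = build_prompt([
    {"role": "system", "content": "You are a helpful multilingual assistant."},
    {"role": "user", "content": "Bonjour !"},
])
print(prompt)
```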
LLM Name: Openbuddy Yi1.5 34B V21.6 32K Fp16
Repository 🤗: https://huggingface.co/OpenBuddy/openbuddy-yi1.5-34b-v21.6-32k-fp16
Model Size: 34B
Required VRAM: 69.2 GB
Updated: 2025-06-01
Maintainer: OpenBuddy
Model Type: llama
Model Files: 4.8 GB (1-of-15), 4.8 GB (2-of-15), 5.0 GB (3-of-15), 4.8 GB (4-of-15), 4.8 GB (5-of-15), 5.0 GB (6-of-15), 4.8 GB (7-of-15), 4.8 GB (8-of-15), 5.0 GB (9-of-15), 4.8 GB (10-of-15), 4.8 GB (11-of-15), 5.0 GB (12-of-15), 4.8 GB (13-of-15), 4.8 GB (14-of-15), 1.2 GB (15-of-15)
Supported Languages: zh, en, fr, de, ja, ko, it, ru, fi
Quantization Type: fp16
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.42.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <|pad0|>
Vocabulary Size: 64256
Torch Data Type: float16
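As a sanity check, the Required VRAM figure equals the sum of the fifteen shard sizes listed under Model Files, which is also roughly what ~34B parameters at 2 bytes each (float16) implies:

```python
# Sum of the 15 safetensors shard sizes (GB) listed for this model.
shards = [4.8, 4.8, 5.0, 4.8, 4.8, 5.0, 4.8, 4.8, 5.0,
          4.8, 4.8, 5.0, 4.8, 4.8, 1.2]
total_gb = round(sum(shards), 1)
print(total_gb)  # 69.2, matching the Required VRAM entry
```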

Best Alternatives to Openbuddy Yi1.5 34B V21.6 32K Fp16

Best Alternatives | Context / RAM | Downloads | Likes
...34B 200K Aezakmi Raw 1902 EXL2 | 195K / 20.7 GB | 15 | 1
Yi 34B 200K MAR2024 EXL2 4bpw | 195K / 18 GB | 22 | 1
...mi Raw Toxic 2702 4.65bpw EXL2 | 195K / 20.8 GB | 21 | 2
...B 200K RPMerge 4.65bpw H6 EXL2 | 195K / 10.5 GB | 11 | 1
Yi 34B 200K RPMerge | 195K / 68.9 GB | 166 | 0
Opus V1 34B 8.0bpw H8 EXL2 | 195K / 34.8 GB | 21 | 1
Opus V1 34B 5.0bpw H6 EXL2 | 195K / 22.3 GB | 15 | 2
Opus V1 34B 4.0bpw H6 EXL2 | 195K / 18 GB | 14 | 1
34B Beta 5.0bpw H6 EXL2 | 195K / 22.3 GB | 19 | 1
34B Beta 4.0bpw H6 EXL2 | 195K / 18.1 GB | 12 | 1

Rank the Openbuddy Yi1.5 34B V21.6 32K Fp16 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227