Openbuddy Yi1.5 34B V21.3 32K by OpenBuddy


Tags: autotrain-compatible, conversational, llama, mixtral, safetensors, sharded, tensorflow, region:us. Languages: de, en, fi, fr, it, ja, ko, ru, zh.

Openbuddy Yi1.5 34B V21.3 32K Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Openbuddy Yi1.5 34B V21.3 32K Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
Research, Chatbot development
Applications:
Multilingual chat applications
Primary Use Cases:
Multilingual text generation, Conversational agents
Limitations:
Cannot be used in critical or high-risk situations, May produce undesirable outputs
Considerations:
Users should avoid using this model in scenarios where errors could lead to significant harm.
Additional Notes 
The prompt format is defined in tokenizer_config.json. The model can be deployed as an OpenAI-compatible API service using vLLM.
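The vLLM deployment note above can be sketched as follows. The launch command uses vLLM's OpenAI-compatible entrypoint; the port, endpoint path, and sampling parameters are assumptions for illustration, not values from this page.

```python
import json

# Assumed launch command (run in a shell, shown here as a comment):
#   python -m vllm.entrypoints.openai.api_server \
#       --model OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k

def build_chat_request(user_message: str) -> dict:
    """Build an OpenAI-style chat-completions payload for the served model.
    max_tokens and temperature are illustrative defaults, not documented values."""
    return {
        "model": "OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k",
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,
        "temperature": 0.7,
    }

payload = build_chat_request("Say hello in French, please.")
body = json.dumps(payload)
# POST `body` to http://localhost:8000/v1/chat/completions (assumed default port)
print(payload["model"])
```

Because the server speaks the OpenAI chat-completions schema, any OpenAI-compatible client library can also be pointed at the local endpoint instead of hand-building the payload.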
Supported Languages 
zh (Full proficiency), en (Full proficiency), fr (Full proficiency), de (Full proficiency), ja (Full proficiency), ko (Full proficiency), it (Full proficiency), ru (Full proficiency), fi (Full proficiency)
Input Output 
Input Format:
Prompt format using special tokens like <|role|>, <|says|>, <|end|>
Accepted Modalities:
text
Output Format:
Generated text
Performance Tips:
Use the fast tokenizer from the Transformers library (use_fast=True).
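The input format above can be sketched as a plain string template built from the special tokens named on this page. The authoritative template lives in the model's tokenizer_config.json (and is rendered by `AutoTokenizer.from_pretrained(...)` plus `tokenizer.apply_chat_template(...)`), so the exact role names and turn ordering here are an illustrative assumption.

```python
def format_prompt(turns):
    """Assemble an OpenBuddy-style prompt from (role, text) pairs using the
    <|role|>, <|says|>, <|end|> special tokens listed on this page."""
    parts = []
    for role, text in turns:
        parts.append(f"<|role|>{role}<|says|>{text}<|end|>")
    # Leave an open assistant turn for the model to complete (assumed convention).
    parts.append("<|role|>assistant<|says|>")
    return "".join(parts)

prompt = format_prompt([("user", "Bonjour !")])
print(prompt)
```

In practice, preferring `apply_chat_template` over hand-assembly avoids drift if the shipped template differs from this sketch.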
LLM Name: Openbuddy Yi1.5 34B V21.3 32K
Repository: https://huggingface.co/OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k
Model Size: 34b
Required VRAM: 69.2 GB
Updated: 2024-11-21
Maintainer: OpenBuddy
Model Type: llama
Model Files: 4.8 GB: 1-of-15, 4.8 GB: 2-of-15, 5.0 GB: 3-of-15, 4.8 GB: 4-of-15, 4.8 GB: 5-of-15, 5.0 GB: 6-of-15, 4.8 GB: 7-of-15, 4.8 GB: 8-of-15, 5.0 GB: 9-of-15, 4.8 GB: 10-of-15, 4.8 GB: 11-of-15, 5.0 GB: 12-of-15, 4.8 GB: 13-of-15, 4.8 GB: 14-of-15, 1.2 GB: 15-of-15
Supported Languages: zh en fr de ja ko it ru fi
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.42.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <|pad0|>
Vocabulary Size: 65280
Torch Data Type: bfloat16
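The Required VRAM figure is consistent with the other table entries: a 34B-class model stored in bfloat16 takes about 2 bytes per parameter, and the 15 listed shard files sum to the same total. A quick sanity check (the 34.4B parameter count is an approximation for this model class, not a value from this page):

```python
# bfloat16 weights take 2 bytes per parameter.
params = 34.4e9          # approximate parameter count for a "34b" model
bytes_per_param = 2      # bfloat16
weight_gb = params * bytes_per_param / 1e9
print(round(weight_gb, 1))  # about 68.8, close to the listed 69.2 GB

# Sum of the 15 shard sizes listed under Model Files:
shards = ([4.8, 4.8, 5.0] * 4) + [4.8, 4.8, 1.2]
print(round(sum(shards), 1))  # 69.2
```

Note that this only covers the weights; serving also needs memory for activations and the KV cache, so real deployments require headroom beyond 69.2 GB.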

Best Alternatives to Openbuddy Yi1.5 34B V21.3 32K

Best Alternatives | Context / RAM | Downloads | Likes
Bagel 34B V0.2 | 195K / 68.7 GB | 1553 | 139
34B Beta | 195K / 69.2 GB | 3730 | 62
Yi 34B 200K | 195K / 68.9 GB | 4698 | 316
Bagel Hermes 34B Slerp | 195K / 68.9 GB | 3977 | 1
Smaug 34B V0.1 | 195K / 69.2 GB | 3201 | 60
Yi 34B 200K AEZAKMI V2 | 195K / 69.2 GB | 1254 | 12
Mergekit Slerp Anaazls | 195K / 69.2 GB | 37 | 0
Smaug 34B V0.1 ExPO | 195K / 69.2 GB | 2844 | 0
Faro Yi 34B | 195K / 69.2 GB | 3637 | 6
Bagel DPO 34B V0.5 | 195K / 68.7 GB | 2873 | 17


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241110