Bilingual GPT Neox 4B Instruction Ppo by rinna


Tags: arXiv:1707.06347 · arXiv:2203.02155 · arXiv:2404.01657 · AutoTrain compatible · Base model (fine-tune): rinna/bilingual-gpt-neox-4b · Dataset: Anthropic/hh-rlhf · en · ja · GPT-NeoX · instruct · PyTorch · Safetensors · Region: us

Bilingual GPT Neox 4B Instruction Ppo Benchmarks

Bilingual GPT Neox 4B Instruction Ppo (rinna/bilingual-gpt-neox-4b-instruction-ppo)

Bilingual GPT Neox 4B Instruction Ppo Parameters and Internals

Model Type 
text generation
Use Cases 
Areas:
research, commercial applications
Applications:
instruction-following conversational agent
Primary Use Cases:
Bilingual text generation
Limitations:
Sensitive to decoding hyper-parameters.
Considerations:
Decoding hyper-parameters should be carefully chosen.
Additional Notes 
The model uses a sentencepiece-based tokenizer with a vocabulary size of 65,536.
Supported Languages 
ja (full proficiency), en (full proficiency)
Training Details 
Data Sources:
Anthropic/hh-rlhf
Methodology:
Supervised Fine-Tuning (SFT) and PPO-based Reinforcement Learning (RL)
Model Architecture:
36-layer, 2816-hidden-size transformer-based language model
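The "4B" label can be sanity-checked from the architecture above with the standard transformer parameter estimate (the 12·L·d² rule of thumb and the untied-embedding assumption are approximations, not figures published by rinna; the vocabulary size of 65,536 comes from this card):

```python
# Rough parameter count for a 36-layer, d=2816 GPT-NeoX-style model.
# 12 * L * d^2 approximates attention + feed-forward weights per layer;
# input and output embeddings are counted separately (assumed untied).
layers, d_model, vocab = 36, 2816, 65536

block_params = 12 * layers * d_model ** 2   # attention + feed-forward
embed_params = 2 * vocab * d_model          # input + output embeddings
total = block_params + embed_params

print(f"{total / 1e9:.2f}B parameters")     # roughly 3.8B, i.e. "4B"
# At 2 bytes/param in float16 this is ~7.6 GB, consistent with the
# 7.7 GB model file size listed below.
```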
Input Output 
Input Format:
A conversation formatted as alternating 'ユーザー' (user) and 'システム' (system) turns, ending with 'システム: ' so the model responds as the system.
Accepted Modalities:
text
Output Format:
Textual response in the set language (Japanese/English)
Performance Tips:
Adjust decoding hyper-parameters for optimal performance.
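The input format above can be sketched as a small helper. The speaker labels and the trailing 'システム: ' cue come from this card; joining turns with plain newlines is an assumption worth verifying against the official model card, since some rinna models use a special '<NL>' token instead:

```python
# Build a prompt in the conversational format described above.
# Speaker labels: "ユーザー" (user) and "システム" (system).
def build_prompt(turns):
    """turns: list of (speaker, text) pairs; returns a prompt ending
    with "システム: " so the model answers as the system."""
    lines = [f"{speaker}: {text}" for speaker, text in turns]
    return "\n".join(lines) + "\nシステム: "

prompt = build_prompt([
    ("ユーザー", "日本語と英語のどちらでも答えられますか?"),
])
print(prompt)
```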
LLM Name: Bilingual GPT Neox 4B Instruction Ppo
Repository: 🤗 https://huggingface.co/rinna/bilingual-gpt-neox-4b-instruction-ppo
Base Model(s): Bilingual GPT Neox 4B (rinna/bilingual-gpt-neox-4b)
Model Size: 4b
Required VRAM: 7.7 GB
Updated: 2025-02-22
Maintainer: rinna
Model Type: gpt_neox
Instruction-Based: Yes
Model Files: 7.7 GB, 7.8 GB
Supported Languages: ja, en
Model Architecture: GPTNeoXForCausalLM
License: mit
Context Length: 2048
Model Max Length: 2048
Tokenizer Class: T5Tokenizer
Padding Token: [PAD]
Vocabulary Size: 65536
Torch Data Type: float16
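Since the card flags sensitivity to decoding hyper-parameters, a starting point for `model.generate()` keyword arguments might look like the following (the specific values are illustrative assumptions, not settings published by rinna):

```python
# Illustrative sampling settings to pass to a Hugging Face
# transformers model.generate() call; tune these per the card's
# advice rather than treating them as recommended defaults.
generation_kwargs = {
    "do_sample": True,          # sample instead of greedy decoding
    "max_new_tokens": 128,
    "temperature": 1.0,
    "top_p": 0.85,
    "repetition_penalty": 1.05,
}
```

These would be passed as `model.generate(**inputs, **generation_kwargs)` after loading rinna/bilingual-gpt-neox-4b-instruction-ppo with `AutoModelForCausalLM`; since the tokenizer class is T5Tokenizer, loading it with `use_fast=False` is likely required.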

Best Alternatives to Bilingual GPT Neox 4B Instruction Ppo

Best Alternatives | Context / RAM | Downloads | Likes
Tora 4B | 2K / 7.6 GB | 76 | 2
...x 4B Instruction Sft En Ja 84K | 2K / 7.6 GB | 89 | 1
...al GPT Neox 4B Instruction Sft | 2K / 7.6 GB | 365 | 18

Rank the Bilingual GPT Neox 4B Instruction Ppo Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227