StableBeluga2 by petals-team


Tags: arXiv:2306.02707 · arXiv:2307.09288 · Autotrain compatible · Datasets: conceptofmind/cot_submix_original, conceptofmind/flan2021_submix_original, conceptofmind/niv2_submix_original, conceptofmind/t0_submix_original · Language: en · Endpoints compatible · Llama · Region: us · Safetensors

StableBeluga2 Benchmarks

Benchmark scores (nn.n%) express how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

StableBeluga2 Parameters and Internals

Model Type 
Auto-regressive language model for text generation
Additional Notes 
The model uses `bfloat16` precision.
Supported Languages 
en (high)
Training Details 
Data Sources:
conceptofmind/cot_submix_original, conceptofmind/flan2021_submix_original, conceptofmind/t0_submix_original, conceptofmind/niv2_submix_original
Methodology:
Supervised fine-tuning (SFT) of Llama 2 70B on an Orca-style dataset
Input Output 
Input Format:
### System:
{system prompt}

### User:
{your prompt here}

### Assistant:
Accepted Modalities:
text
Output Format:
Generated text, continuing after the `### Assistant:` tag.
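
For concreteness, here is a minimal local-inference sketch using Hugging Face `transformers`, assuming the `transformers`, `torch`, and `accelerate` packages and enough accelerator memory for the bfloat16 weights. The prompt string follows the template above; the sampling parameters are illustrative, not recommended values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "petals-team/StableBeluga2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # the checkpoint is stored in bfloat16
    device_map="auto",           # spread the shards across available devices
)

# Assemble the prompt in the ### System / ### User / ### Assistant template.
system_prompt = "You are Stable Beluga, a helpful and harmless assistant."
user_prompt = "Summarize what an Orca-style dataset is in two sentences."
prompt = f"### System:\n{system_prompt}\n\n### User:\n{user_prompt}\n\n### Assistant:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, top_p=0.95)

# Print only the newly generated tokens after the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Loading all shards locally needs roughly 138 GB of memory in bfloat16, which is why the Petals route sketched further below can be more practical.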
LLM Name: StableBeluga2
Repository: 🤗 https://huggingface.co/petals-team/StableBeluga2
Model Size: 69b
Required VRAM: ≈138 GB in bfloat16 (81 shards × 1.7 GB each)
Updated: 2024-12-14
Maintainer: petals-team
Model Type: llama
Model Files: 81 safetensors shards of about 1.7 GB each (1-of-81 through 81-of-81), ≈138 GB total
Supported Languages: en
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.32.0.dev0
Vocabulary Size: 32000
Torch Data Type: bfloat16
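
The maintainer name and the uniform 1.7 GB shard layout suggest this repository is packaged for Petals, which runs large models over a swarm of machines so that no single host needs the full ≈138 GB. A minimal sketch, assuming the `petals` package is installed and a public swarm is serving this model:

```python
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

MODEL_ID = "petals-team/StableBeluga2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# Embeddings and the LM head load locally; the transformer blocks
# are executed by remote peers in the swarm.
model = AutoDistributedModelForCausalLM.from_pretrained(MODEL_ID)

prompt = "### User:\nWhat is distributed inference?\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```

With this setup the client itself needs only a few gigabytes of memory, since the bulk of the weights stays on the swarm.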

Best Alternatives to StableBeluga2

Best Alternatives                  Context / RAM    Downloads  Likes
...ine Tu Ophtho 3 1e 05 Stanford  4K / 138.7 GB    5          0
V Alpha Tross                      4K / 138.7 GB    96         18
Uni TianYan V1                     4K / 138 GB      1318       0
Airoboros L2 C70b 3.1.2            4K / 138 GB      1278       0
SaiLy Experiment V1                4K / 276 GB      27         2



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124