Archangel Sft Pythia6 9B by ContextualAI


Tags: Alignment, Autotrain compatible, Dataset:anthropic/hh-rlhf, Dataset:openassistant/oasst1, Dataset:stanfordnlp/shp, DPO, En, Endpoints compatible, GPT-NeoX, HALO, HALOs, Human feedback, Preferences, Region:us, RL, RLHF, Safetensors, Sharded, TensorFlow


Archangel Sft Pythia6 9B Parameters and Internals

Model Type 
text generation
Additional Notes 
To prompt Archangel models, format inputs consistently with TuluV2. The tokenizer includes '<|good|>' and '<|bad|>' as contextual control tokens.
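As a rough illustration of the contextual control tokens mentioned above, a steering token could be prepended to a TuluV2-style prompt. The exact placement of the token, and whether this SFT checkpoint was actually trained to respond to it, are assumptions not stated here:

```python
# Hypothetical sketch: prepend a '<|good|>' or '<|bad|>' control token
# to a TuluV2-style role-tagged prompt. Placement is an assumption.
def steered_prompt(user_message: str, steer: str = "<|good|>") -> str:
    """Build a role-tagged prompt with an optional control token."""
    return f"{steer}<|user|>\n{user_message}\n<|assistant|>\n"

print(steered_prompt("Summarize RLHF in one line."))
```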
Supported Languages 
en (fluent)
Training Details 
Data Sources:
stanfordnlp/SHP, Anthropic/hh-rlhf, OpenAssistant/oasst1
Methodology:
SFT (Supervised Fine-Tuning), RLHF (Reinforcement Learning from Human Feedback)
Input Output 
Input Format:
Formatted input with roles <|user|> and <|assistant|>
Accepted Modalities:
text
Output Format:
Text-based response
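The input/output spec above can be sketched as a minimal generation helper using Hugging Face transformers. This is an unverified usage sketch: the role-tag template follows the "Input Format" notes, and running `generate` would download roughly 13.8 GB of weights, so the heavy imports are kept inside the function:

```python
MODEL_ID = "ContextualAI/archangel_sft_pythia6-9b"

def format_prompt(user_message: str) -> str:
    # TuluV2-style role tags, per the "Input Format" notes above.
    return f"<|user|>\n{user_message}\n<|assistant|>\n"

def generate(user_message: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so defining this sketch needs no heavy dependencies;
    # actually calling it downloads the full checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(format_prompt(user_message), return_tensors="pt")
    inputs = inputs.to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```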
LLM Name: Archangel Sft Pythia6 9B
Repository 🤗: https://huggingface.co/ContextualAI/archangel_sft_pythia6-9b
Model Size: 9b
Required VRAM: 13.8 GB
Updated: 2025-02-22
Maintainer: ContextualAI
Model Type: gpt_neox
Model Files: 5.0 GB (1-of-3), 5.0 GB (2-of-3), 3.8 GB (3-of-3)
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.35.2
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 50432
Torch Data Type: bfloat16
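The listed VRAM requirement is consistent with the parameter count and data type above: a ~6.9-billion-parameter model (per the "pythia6-9b" name) stored in bfloat16 takes 2 bytes per parameter. A quick sanity check:

```python
# Back-of-the-envelope check: weights-only memory for a 6.9B-parameter
# model stored in bfloat16 (2 bytes per parameter).
params = 6.9e9          # "pythia6-9b" ≈ 6.9 billion parameters
bytes_per_param = 2     # bfloat16
weights_gb = params * bytes_per_param / 1e9
print(f"{weights_gb:.1f} GB")  # matches the 13.8 GB listed above
```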

Best Alternatives to Archangel Sft Pythia6 9B

Best Alternatives | Context / RAM | Downloads | Likes
Modello Italia 9B | 4K / 34 GB | 2819 | 19
Modello Italia 9B Bf16 | 4K / 17.1 GB | 2801 | 10
...Italia 9B Autoround W4g128 Gpu | 4K / 5.2 GB | 78 | 0
...Italia 9B Autoround W4g128 Cpu | 4K / 5.2 GB | 84 | 0
Modello Italia 9B Hf | 4K / 17.1 GB | 42 | 2
H2ogpt Oig Oasst1 512 6 9b | 2K / 13.8 GB | 2231 | 18
H2ogpt Oig Oasst1 256 6 9b | 2K / 13.8 GB | 1994 | 5
Hi NOLIN 9B | 2K / 19.5 GB | 73 | 6
Note: a green score (e.g., "73.2") means the model is better than ContextualAI/archangel_sft_pythia6-9b.

Rank the Archangel Sft Pythia6 9B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227