Pythia 12B by EleutherAI


Tags: arxiv:2101.00027, arxiv:2201.07311, arxiv:2304.01373, autotrain-compatible, dataset:eleutherai/pile, en, endpoints-compatible, gpt_neox, pythia, pytorch, region:us, safetensors, sharded, tensorflow
Model Card on HF 🤗: https://huggingface.co/EleutherAI/pythia-12b

Pythia 12B Benchmarks

Pythia 12B (EleutherAI/pythia-12b)

Pythia 12B Parameters and Internals

Model Type
Transformer-based Language Model
Use Cases
Areas: Research
Applications: Scientific experiments
Primary Use Cases: Interpretability research
Limitations: Not intended for deployment; English-language only; not suitable for translation or for generating text in other languages
Considerations: Potential for generating harmful or offensive text
Additional Notes
Pythia models are intended to facilitate interpretability research. They are not suitable as standalone products for human interaction.
Supported Languages
English (fluent)
Training Details
Data Sources: EleutherAI/pile
Data Volume: 299,892,736,000 tokens
Methodology: Trained for 143,000 steps at a batch size of 2M (2,097,152) tokens per step; 143,000 × 2,097,152 = 299,892,736,000 tokens, which matches the data volume above. A sketch of loading an intermediate training checkpoint follows this list.
Model Architecture: Transformer-based
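
Because the primary use case is interpretability research, intermediate training checkpoints are often as useful as the final weights. A minimal sketch, assuming the "stepN" revision branches that EleutherAI publishes for the Pythia suite (step143000 corresponds to the final step count above):

```python
# Minimal sketch: load a Pythia training checkpoint by git revision.
# Pythia checkpoints are published as repository branches named "stepN".
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "step143000"  # substitute any released step branch
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/pythia-12b", revision=checkpoint
)
tokenizer = AutoTokenizer.from_pretrained(
    "EleutherAI/pythia-12b", revision=checkpoint
)
```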
Input Output
Input Format: String of text
Accepted Modalities: Text
Output Format: String of text (see the generation sketch below)
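
A minimal sketch of this string-in/string-out interface using the transformers library (the prompt text and max_new_tokens value are illustrative, not from the model card):

```python
# Minimal sketch: text in, text out.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-12b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Pythia is a suite of language models designed for"  # input: string of text
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0]))  # output: string of text
```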
Release Notes
Version: Current release
Date: January 2023
Notes: The Pythia models were renamed and retrained to address hyperparameter discrepancies; all models now use a uniform batch size.
LLM Name: Pythia 12B
Repository 🤗: https://huggingface.co/EleutherAI/pythia-12b
Model Size: 12b
Required VRAM: 23.8 GB
Updated: 2024-12-21
Maintainer: EleutherAI
Model Type: gpt_neox
Model Files: 9.8 GB (1-of-3), 9.9 GB (2-of-3), 4.1 GB (3-of-3)
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.24.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50688
Torch Data Type: float16
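
A hedged sketch of loading the sharded float16 weights listed above. The device_map="auto" setting assumes the accelerate package is installed, and the ~23.8 GB of weights must fit across available GPU memory:

```python
# Minimal sketch: load the sharded float16 checkpoint.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/pythia-12b",
    torch_dtype=torch.float16,  # matches the Torch Data Type above (~23.8 GB)
    device_map="auto",          # requires accelerate; spreads the three shards over devices
)
print(model.config.max_position_embeddings)  # 2048, the context length above
```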

Best Alternatives to Pythia 12B

Best Alternatives                  | Context / RAM | Downloads | Likes
-----------------------------------|---------------|-----------|------
...sst Sft 4 Pythia 12B Epoch 3.5  | 2K / 23.8 GB  | 498162    | 360
Dolly V2 12B                       | 2K / 23.8 GB  | 2546      | 1950
Oasst Sft 1 Pythia 12B             | 2K / 23.8 GB  | 36673     | 278
Pythia 12B Deduped                 | 2K / 23.8 GB  | 7767      | 51
Pythia 12B Sft V8 7K Steps         | 2K / 23.8 GB  | 1512      | 21
H2ogpt Oasst1 512 12B              | 2K / 23.9 GB  | 1615      | 27
...ythia 12B Sft V8 Rlhf 2K Steps  | 2K / 23.8 GB  | 1240      | 0
Pythia 12B Pre V8.12.5K Steps      | 2K / 23.8 GB  | 1251      | 6
H2ogpt Gm Oasst1 En 1024 12B       | 2K / 23.8 GB  | 1258      | 5
Pythia 12B Sft V8.2.5K Steps       | 2K / 23.8 GB  | 1249      | 0

Original data from Hugging Face, OpenCompass, and various public git repos.
Release v20241217