Pythia 410M by EleutherAI

Tags: arXiv:2101.00027 · arXiv:2201.07311 · arXiv:2304.01373 · autotrain-compatible · dataset:EleutherAI/pile · en · endpoints-compatible · gpt-neox · pythia · pytorch · region:us · safetensors
Model Card on HF 🤗: https://huggingface.co/EleutherAI/pythia-410m

Pythia 410M Parameters and Internals

Model Type: Transformer-based language model

Use Cases:
Areas: Research
Primary Use Cases: Research on the behavior and functionality of large language models
Limitations: Not suitable for human-facing deployment, for translation, or for generating text in languages other than English
Considerations: Conduct risk and bias assessments when using the model in downstream applications.

Additional Notes: Pythia-410M is not tuned for downstream applications such as commercial chatbots.

Supported Languages: en (English, primary)
Training Details:
Data Sources: The Pile
Data Volume: 299,892,736,000 tokens
Methodology: Trained with a uniform batch size of 2M tokens, using Flash Attention; the learning-rate schedule decayed to a minimum of 0.1× the maximum LR.
Training Time: 143,000 steps at a batch size of 2M tokens
Model Architecture: Transformer-based
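
The listed data volume works out exactly if the "2M" batch means 2^21 = 2,097,152 tokens: 143,000 × 2,097,152 = 299,892,736,000. A quick sanity check:

```python
# Sanity-check the "Data Volume" figure from the training schedule above.
steps = 143_000          # "143,000 steps"
batch_tokens = 2 ** 21   # a "2M-token" batch is 2,097,152 tokens

total_tokens = steps * batch_tokens
print(f"{total_tokens:,}")  # 299,892,736,000 -- matches the listed data volume
```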
Responsible AI Considerations:
Fairness: Biases regarding gender, religion, and race are documented in Section 6 of the Pile paper.
Transparency: Model outputs should not be relied upon for factual accuracy.
Accountability: Users are responsible for evaluating generated outputs and for informing their audiences about them.
Mitigation Strategies: Implement risk and bias assessments when using the model in downstream applications.
Input/Output:
Input Format: Text string
Accepted Modalities: Text
Output Format: Text string
Performance Tips: Always evaluate outputs for factual accuracy and potential biases.
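
As a minimal sketch of this text-in/text-out interface, using the standard Hugging Face transformers API (the prompt and generation settings below are illustrative, not recommendations from the model card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-410m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Input: a plain text string, tokenized to tensors.
inputs = tokenizer("The Pile is an 825 GiB dataset of", return_tensors="pt")

# Output: a continuation, decoded back to a text string.
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Per the performance tips above, any such output should still be checked for factual accuracy and potential biases before use.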
Release Notes:
January 2023: Pythia models were renamed and parameter counts adjusted for clarity.
Pythia v0: Early version with hyperparameter discrepancies.
LLM Name: Pythia 410M
Repository 🤗: https://huggingface.co/EleutherAI/pythia-410m
Model Size: 410M
Required VRAM: 0.9 GB
Updated: 2025-02-05
Maintainer: EleutherAI
Model Type: gpt_neox
Model Files: 0.9 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.24.0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: float16
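
A short sketch of checking these fields against the published config and loading the weights in the listed float16 dtype (standard transformers calls; nothing model-specific is assumed):

```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM

repo = "EleutherAI/pythia-410m"

config = AutoConfig.from_pretrained(repo)
print(config.model_type)               # gpt_neox
print(config.max_position_embeddings)  # 2048 (context length / model max length)
print(config.vocab_size)               # 50304

# float16 weights keep the footprint near the 0.9 GB listed above.
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.float16)
```

Per the upstream model card, the Pythia repos also expose intermediate training checkpoints as repo branches (e.g., revision="step143000"), which supports the training-dynamics research use case described above.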

Best Alternatives to Pythia 410M

Best Alternatives                    Context / RAM    Downloads   Likes
Pythia410m Sft Tldr                  2K / 1.6 GB      4549        0
Pythia 410M Sft Full                 2K / 0.8 GB      137         0
Healix 410M                          2K / 1.6 GB      1274        0
Pythia 410M Ludii Sft                2K / 1.6 GB      142         0
Pythia 410M Deduped SimPOW 0         2K / 0.8 GB      6           0
Pythia 410M Orpo                     2K / 1.6 GB      5           0
... Llm Pythia 410M Pm Gen Ian Nd    2K / 1.6 GB      134         0
...7 Kl 01 Steps 12000 Rlhf Model    2K / 1.6 GB      14          0
Outputs3                             2K / 0.8 GB      127         0
Pythia 410m Adpater Lora Mrpc        2K / 1.6 GB      9           0
Note: a green score (e.g., "73.2") means that the model is better than EleutherAI/pythia-410m.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227