GPT Neo 125M by EleutherAI

Tags: arxiv:2101.00027, autotrain-compatible, dataset:EleutherAI/pile, en, endpoints-compatible, gpt_neo, jax, pytorch, region:us, rust, safetensors
Model Card on HF 🤗: https://huggingface.co/EleutherAI/gpt-neo-125m

GPT Neo 125M Benchmarks

GPT Neo 125M (EleutherAI/gpt-neo-125m)

GPT Neo 125M Parameters and Internals

Model Type: text generation, causal-lm

Use Cases
Areas: research, commercial applications
Primary Use Cases: text generation from a prompt
Limitations: may produce socially unacceptable text; the training dataset contains profanity and abrasive language
Considerations: human curation of outputs is recommended.
Training Details
Data Sources: EleutherAI/pile
Data Volume: 300 billion tokens
Methodology: autoregressive (causally masked) language modeling, trained with cross-entropy loss
Model Architecture: Transformer model
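
To make the training objective above concrete, here is a minimal sketch of computing the causal language-modeling cross-entropy loss with the released checkpoint. It uses the standard transformers API; the prompt string is an arbitrary placeholder, not from the model card.

```python
# Minimal sketch of the causal LM objective: the model predicts each next
# token, and cross-entropy loss is obtained by passing the inputs as labels
# (transformers shifts the labels internally by one position).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125m")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125m")

inputs = tokenizer("EleutherAI trained GPT-Neo on the Pile.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])
print(f"cross-entropy loss: {outputs.loss.item():.3f}")
```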
Safety Evaluation
Ethical Considerations: potential to produce socially unacceptable text.
Responsible AI Considerations
Mitigation Strategies: human curation or filtering of outputs is recommended.
Input Output
Input Format: a string of text used as a prompt
Accepted Modalities: text
Output Format: generated text sequence
Performance Tips: use the text-generation pipeline with options such as do_sample=True and min_length=20 (see the example below).
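
In line with the performance tip above, a minimal generation example using the Hugging Face pipeline API might look like this (the prompt text is an illustrative placeholder):

```python
from transformers import pipeline

# Text-generation pipeline with sampling enabled and a minimum output length,
# as suggested in the performance tips above.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125m")
result = generator("EleutherAI has", do_sample=True, min_length=20)
print(result[0]["generated_text"])
```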
LLM Name: GPT Neo 125M
Repository 🤗: https://huggingface.co/EleutherAI/gpt-neo-125m
Model Size: 125M
Required VRAM: 0.5 GB
Updated: 2025-02-05
Maintainer: EleutherAI
Model Type: gpt_neo
Model Files: 0.5 GB
Supported Languages: en
Model Architecture: GPTNeoForCausalLM
License: mit
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.9.0.dev0
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50257
Activation Function: gelu_new
Errors: replace
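
As a quick sanity check, most of the configuration values listed above can be read back from the checkpoint. The sketch below assumes the standard transformers config fields used for gpt_neo models.

```python
from transformers import AutoConfig, AutoTokenizer

config = AutoConfig.from_pretrained("EleutherAI/gpt-neo-125m")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125m")

print(config.model_type)               # gpt_neo
print(config.max_position_embeddings)  # 2048 (context length)
print(config.vocab_size)               # 50257
print(config.activation_function)      # gelu_new
print(tokenizer.eos_token)             # <|endoftext|>
```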

Best Alternatives to GPT Neo 125M

Best Alternatives            Context / RAM    Downloads   Likes
Neox 125m Storytelling       2K / 0.5 GB      105         0
Epfl Cs 522 Istari Mcqa      2K / 0.5 GB      105         0
GPT Neo Small                2K / 0 GB        162         0
GPT Neo 125M Lama            2K / 0.5 GB      108         0
GPT Neo Plantuml             2K / 0.5 GB      115         0
GPT Neo 125M Sft             2K / 0 GB        5           0
GPT Neo Plantuml Sol1        2K / 0.5 GB      105         0
GPT Neo 125M Code Alpaca     2K / 0 GB        111         0
Aitextgen                    2K / 0.5 GB      5           0
Model                        2K / 0.2 GB      107         0
Note: green Score (e.g. "73.2") means that the model is better than EleutherAI/gpt-neo-125m.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227