DistilGPT2 by distilbert


Tags: arxiv:1503.02531, arxiv:1910.01108, arxiv:1910.09700, arxiv:2201.08542, arxiv:2203.12574, autotrain-compatible, co2-eq-emissions, coreml, dataset:openwebtext, en, endpoints-compatible, exbert, gpt2, jax, model-index, pytorch, region:us, rust, safetensors, tf, tflite
Model Card on HF 🤗: https://huggingface.co/distilbert/distilgpt2


DistilGPT2 Parameters and Internals

Model Type 
Transformer-based Language Model, Text Generation
Use Cases 
Areas:
Research applications, Commercial applications
Applications:
Writing assistance, Creative writing and art, Entertainment
Primary Use Cases:
Text generation
Limitations:
Exhibits bias related to race and gender; not suitable for applications where factual correctness is required
Considerations:
Carefully consider bias and the context of use. Not recommended for direct human interaction without a thorough bias assessment.
Additional Notes 
DistilGPT2 is designed to facilitate faster, resource-efficient text generation applications.
Supported Languages 
English (native)
Training Details 
Data Sources:
OpenWebTextCorpus (an open-source reproduction of OpenAI's WebText dataset)
Methodology:
Knowledge distillation (a minimal loss sketch follows this block)
Hardware Used:
8 × 16 GB V100 GPUs
Model Architecture:
Transformer
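Knowledge distillation trains the compact student to imitate the teacher's (here, GPT-2's) output distribution rather than only the hard next-token labels. Below is a minimal PyTorch sketch of the classic soft-target loss from Hinton et al. (arXiv:1503.02531); the temperature T and mixing weight alpha are illustrative defaults, not the hyperparameters actually used to train DistilGPT2.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KL loss (Hinton et al. 2015) mixed with hard-label cross-entropy.

    T and alpha are illustrative; the actual DistilGPT2 recipe follows the
    DistilBERT paper (arXiv:1910.01108). `labels` are assumed already shifted
    for next-token prediction.
    """
    # KL divergence between temperature-softened student and teacher distributions.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    # Standard cross-entropy on the hard labels.
    hard = F.cross_entropy(
        student_logits.reshape(-1, student_logits.size(-1)),
        labels.reshape(-1),
    )
    return alpha * soft + (1 - alpha) * hard
```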
Responsible AI Considerations 
Fairness:
DistilGPT2 exhibits persistent bias issues similar to those documented for GPT-2. Distilled models have shown reduced toxicity and bias relative to their teacher models, yet still display statistically significant bias (a probing sketch follows this block).
Mitigation Strategies:
Ongoing research into additional bias-mitigation techniques for distilled models.
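The gender bias noted above can be observed directly by comparing completions of matched prompts, as the upstream model card does. A small sketch using the 🤗 Transformers pipeline API; the seed and prompts follow the model-card style of probe, while the generation settings are illustrative.

```python
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="distilbert/distilgpt2")
set_seed(48)  # fixed seed so the comparison is repeatable

# Matched prompts differing only in gender; compare the sampled occupations.
for prompt in ("The man worked as a", "The woman worked as a"):
    for out in generator(prompt, max_length=20, num_return_sequences=3):
        print(out["generated_text"])
```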
Input Output 
Input Format:
Plain-text prompt
Accepted Modalities:
text
Output Format:
Generated text
Performance Tips:
Set a random seed for reproducible generations (see the sketch below)
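A minimal sketch of seeded text generation with the pipeline API, per the performance tip above; the prompt and generation parameters are illustrative.

```python
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="distilbert/distilgpt2")

set_seed(42)  # same seed + same inputs => identical sampled outputs
outputs = generator(
    "Hello, I'm a language model,",
    max_length=30,           # total tokens incl. prompt; model max is 1024
    num_return_sequences=2,  # sample two continuations
)
for out in outputs:
    print(out["generated_text"])
```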
LLM Name: Distilgpt2
Repository 🤗: https://huggingface.co/distilbert/distilgpt2
Model Size: 88.2M
Required VRAM: 0.4 GB
Updated: 2024-12-22
Maintainer: distilbert
Model Type: gpt2
Model Files: 0.4 GB, 0.4 GB
Supported Languages: en
Model Architecture: GPT2LMHeadModel
License: apache-2.0
Model Max Length: 1024
Vocabulary Size: 50257
Activation Function: gelu_new
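Most of the figures above can be read straight off the checkpoint's configuration; a small sketch, assuming the standard GPT2Config field names in 🤗 Transformers:

```python
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("distilbert/distilgpt2")
print(config.vocab_size)           # 50257  (Vocabulary Size)
print(config.n_positions)          # 1024   (Model Max Length)
print(config.activation_function)  # gelu_new

# Parameter count: should land near the 88.2M reported above.
model = AutoModelForCausalLM.from_pretrained("distilbert/distilgpt2")
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.1f}M")
```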

Best Alternatives to DistilGPT2

Best Alternatives         Context / RAM    Downloads    Likes
Leo DistilGPT Medical     0K / 0.4 GB      27           0
Distilgpt2 HC3            0K / 0.3 GB      1299         1
Distilgpt2 Emailgen V2    0K / 0.3 GB      1317         5
Distilgpt2 Emailgen       0K / 0.4 GB      595          4
Distilgpt2 Nepali         0K / 0.3 GB      258          7



Original data from HuggingFace, OpenCompass, and various public Git repos.
Release v20241217