Distilgpt2 Emailgen by postbot


Tags: autotrain compatible · base model: distilbert/distilgp… · base model (finetune): distilbert… · dataset: aeslc · dataset: postbot/multi_emails · distilgpt2 · email · email generation · endpoints compatible · generated from trainer · gpt2 · pytorch · region: us · safetensors

Distilgpt2 Emailgen Benchmarks

Distilgpt2 Emailgen (postbot/distilgpt2-emailgen)

Distilgpt2 Emailgen Parameters and Internals

Model Type 
text generation
Use Cases 
Applications:
email generation
Primary Use Cases:
email autocomplete suggestions
Limitations:
The model should not be used to write entire emails without user input. Verify suggestions for false claims and negated statements.
Additional Notes 
The model is fine-tuned on a dataset of 50k emails.
Training Details 
Data Sources:
aeslc, postbot/multi_emails
Methodology:
Fine-tuned
Input Output 
Performance Tips:
Model results may vary significantly with different formatting.
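Because results may vary significantly with formatting, it can help to build prompts consistently. The sketch below is a hypothetical example of an email-autocomplete prompt builder; the greeting-plus-body layout and the helper name `build_email_prompt` are assumptions, not part of the model's documentation.

```python
# Hypothetical prompt-formatting sketch for email autocomplete with
# postbot/distilgpt2-emailgen. The exact layout (greeting, blank line,
# partial body) is an assumption; per the tip above, different
# formatting can change results significantly.

def build_email_prompt(greeting: str, partial_body: str) -> str:
    """Assemble a partial email for the model to continue."""
    return f"{greeting},\n\n{partial_body}"

prompt = build_email_prompt("Hi Jane", "Just following up on the report")
print(prompt)

# With the transformers library installed, the prompt could then be
# completed like this (not run here, since it downloads model weights):
#
#   from transformers import pipeline
#   generator = pipeline("text-generation", model="postbot/distilgpt2-emailgen")
#   suggestion = generator(prompt, max_new_tokens=32)[0]["generated_text"]
```

Keeping prompt construction in one place makes it easy to experiment with alternative layouts and compare suggestion quality.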
LLM Name: Distilgpt2 Emailgen
Repository: 🤗 https://huggingface.co/postbot/distilgpt2-emailgen
Base Model(s): distilgpt2 (distilbert/distilgpt2)
Model Size: 88.2M parameters
Required VRAM: 0.4 GB
Updated: 2024-12-22
Maintainer: postbot
Model Type: gpt2
Model Files: 0.4 GB, 0.4 GB, 0.0 GB
Model Architecture: GPT2LMHeadModel
License: apache-2.0
Model Max Length: 1024
Transformers Version: 4.21.1
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: <|endoftext|>
End of Sentence Token: <|endoftext|>
Unk Token: <|endoftext|>
Vocabulary Size: 50257
Torch Data Type: float32
Activation Function: gelu_new
Errors: replace
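The listed 0.4 GB VRAM requirement is consistent with the parameter count and data type above; a quick back-of-the-envelope check (assuming only the float32 weights are counted, with no activation or KV-cache overhead):

```python
# Back-of-the-envelope check: 88.2M parameters stored as float32
# (4 bytes each) should roughly match the listed 0.4 GB requirement.
params = 88.2e6          # model size from the listing
bytes_per_param = 4      # torch float32
weight_gb = params * bytes_per_param / 1e9
print(f"{weight_gb:.2f} GB")   # ≈ 0.35 GB, in line with the 0.4 GB figure
```

The small gap between 0.35 GB and 0.4 GB plausibly covers buffers and runtime overhead, though the listing does not break this down.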

Best Alternatives to Distilgpt2 Emailgen

Best Alternatives           Context / RAM    Downloads    Likes
Leo DistilGPT Medical       0K / 0.4 GB      27           0
Distilgpt2 HC3              0K / 0.3 GB      1299         1
Distilgpt2                  0K / 0.4 GB      1416731      457
Distilgpt2 Emailgen V2      0K / 0.3 GB      1317         5
Distilgpt2 Nepali           0K / 0.3 GB      258          7
Note: a green score (e.g. "73.2") means that the model is better than postbot/distilgpt2-emailgen.

Rank the Distilgpt2 Emailgen Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you looking for? 40,066 models in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217