LLM Name | Pythia 410M Roberta Lr 8e7 Kl 01 Steps 12000 Rlhf Model
---|---
Repository 🤗 | https://huggingface.co/jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model
Model Size | 410m
Required VRAM | 1.6 GB
Updated | 2025-02-05
Maintainer | jaredjoss
Model Type | gpt_neox
Model Files | 
Supported Languages | en
Model Architecture | GPTNeoXForCausalLM
License | mit
Context Length | 2048
Model Max Length | 2048
Transformers Version | 4.37.0
Tokenizer Class | GPTNeoXTokenizer
Padding Token | <\|padding\|>
Vocabulary Size | 50304
Torch Data Type | float32
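
The card above describes a GPTNeoXForCausalLM checkpoint with float32 weights (about 1.6 GB) and a 2048-token context. Below is a minimal loading sketch using the 🤗 Transformers API, assuming Transformers (4.37.0 or newer) and PyTorch are installed; the prompt and generation settings are placeholders, not part of the model card.

```python
# Minimal sketch: load this checkpoint with the Transformers library.
# Repo id and dtype come from the card above; the prompt and generation
# settings are illustrative assumptions only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float32,  # card lists float32 weights; float16 roughly halves memory
)

prompt = "Summarize the following post:"  # placeholder input
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)  # keep within the 2048-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```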
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Pythia410m Sft Tldr | 2K / 1.6 GB | 4549 | 0 |
Pythia 410M Sft Full | 2K / 0.8 GB | 137 | 0 |
Pythia 410M | 2K / 0.9 GB | 74690 | 22 |
Healix 410M | 2K / 1.6 GB | 1274 | 0 |
Pythia 410M Ludii Sft | 2K / 1.6 GB | 142 | 0 |
Pythia 410M Deduped SimPOW 0 | 2K / 0.8 GB | 6 | 0 |
Pythia 410M Orpo | 2K / 1.6 GB | 5 | 0 |
... Llm Pythia 410M Pm Gen Ian Nd | 2K / 1.6 GB | 134 | 0 |
Outputs3 | 2K / 0.8 GB | 127 | 0 |
Pythia 410m Adpater Lora Mrpc | 2K / 1.6 GB | 9 | 0 |