LLM Name | Pythia 12B Pre V8 12.5K Steps |
Repository 🤗 | https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps |
Model Size | 12B |
Required VRAM | 23.8 GB |
Updated | 2024-10-07 |
Maintainer | OpenAssistant |
Model Type | gpt_neox |
Model Files | |
Model Architecture | GPTNeoXForCausalLM |
License | apache-2.0 |
Context Length | 2048 |
Model Max Length | 2048 |
Transformers Version | 4.28.0.dev0 |
Tokenizer Class | GPTNeoXTokenizer |
Vocabulary Size | 50288 |
Torch Data Type | float16 |
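The specs above map directly onto a standard `transformers` loading call. A minimal sketch, assuming `torch` and `transformers` are installed and a CUDA GPU with roughly 24 GB of VRAM is available; the repository ID comes from the table above, and the prompt string is purely illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "OpenAssistant/pythia-12b-pre-v8-12.5k-steps"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the "Torch Data Type" row above
)
model.to("cuda")  # fp16 weights need ~23.8 GB VRAM per the table above

prompt = "The Pythia model suite was trained on"  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
# Keep prompt + new tokens within the 2048-token context length listed above.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```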
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---|
Dolly V2 12B | 2K / 23.8 GB | 3568 | 1946 |
...sst Sft 4 Pythia 12B Epoch 3.5 | 2K / 23.8 GB | 258238 | 360 |
Pythia 12B | 2K / 23.8 GB | 6244 | 131 |
Oasst Sft 1 Pythia 12B | 2K / 23.8 GB | 3385 | 279 |
Pythia 12B Deduped | 2K / 23.8 GB | 12840 | 51 |
Pythia 12B Sft V8 7K Steps | 2K / 23.8 GB | 1578 | 21 |
...ythia 12B Sft V8 Rlhf 2K Steps | 2K / 23.8 GB | 1008 | 0 |
H2ogpt Gm Oasst1 En 1024 12B | 2K / 23.8 GB | 932 | 5 |
Pythia 12B Sft V8.2.5K Steps | 2K / 23.8 GB | 977 | 0 |
H2ogpt Oasst1 512 12B | 2K / 23.9 GB | 1066 | 27 |