LLM Name | Pythia 12B Pre 2000
---|---
Repository | Open on 🤗
Model Size | 12b
Required VRAM | 23.8 GB
Updated | 2024-07-26
Maintainer | andreaskoepf
Model Type | gpt_neox
Model Files |
Model Architecture | GPTNeoXForCausalLM
License | apache-2.0
Context Length | 2048
Model Max Length | 2048
Transformers Version | 4.26.1
Tokenizer Class | GPTNeoXTokenizer
Vocabulary Size | 50288
Torch Data Type | float16
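
The spec above maps directly onto a standard `transformers` load call. The snippet below is a minimal sketch, not an official example: the repository id `andreaskoepf/pythia-12b-pre-2000` is an assumption based on the maintainer and model name (check the "Open on 🤗" link for the exact id). It loads the checkpoint in float16, matching the listed torch dtype and the ~23.8 GB VRAM figure, and keeps generation within the 2048-token context window.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id -- verify against the "Open on 🤗" link above.
model_id = "andreaskoepf/pythia-12b-pre-2000"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed Torch Data Type; needs ~23.8 GB VRAM
    device_map="auto",          # requires `accelerate`; spreads layers over available GPUs
)

prompt = "The Pythia suite of language models was trained to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Keep prompt + completion within the 2048-token context length.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```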
Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
---|---|---|---|---|
Dolly V2 12B | 0.3 | 2K / 23.8 GB | 4155 | 1945 |
...sst Sft 4 Pythia 12B Epoch 3.5 | 0.3 | 2K / 23.8 GB | 38842 | 357 |
Pythia 12B | 0.2 | 2K / 23.8 GB | 6976 | 129 |
Oasst Sft 1 Pythia 12B | 0.2 | 2K / 23.8 GB | 2237 | 279 |
H2ogpt Oasst1 512 12B | 0.2 | 2K / 23.9 GB | 3418 | 27 |
Pythia 12B Deduped | 0.2 | 2K / 23.8 GB | 7425 | 52 |
Pythia 12B Sft V8 7K Steps | 0.2 | 2K / 23.8 GB | 1417 | 21 |
...ythia 12B Sft V8 Rlhf 2K Steps | 0.2 | 2K / 23.8 GB | 954 | 0 |
Pythia 12B Pre V8.12.5K Steps | 0.2 | 2K / 23.8 GB | 954 | 6 |
H2ogpt Gm Oasst1 En 1024 12B | 0.2 | 2K / 23.8 GB | 944 | 5 |