| LLM Name | Korean GPT Neox 125M |
|---|---|
| Repository 🤗 | https://huggingface.co/cateto/korean-gpt-neox-125M |
| Model Size | 125M |
| Required VRAM | 0.4 GB |
| Updated | 2025-02-22 |
| Maintainer | cateto |
| Model Type | gpt_neox |
| Model Files | |
| Supported Languages | ko |
| Model Architecture | GPTNeoXForCausalLM |
| License | cc-by-3.0 |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.28.1 |
| Vocabulary Size | 52096 |
| Torch Data Type | float16 |
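A minimal sketch of loading this checkpoint with the `transformers` library, based on the specs above (GPTNeoXForCausalLM architecture, float16 weights, 2048-token context). The Korean prompt and the sampling settings are illustrative assumptions, not recommendations from the model card.

```python
# Minimal sketch: load cateto/korean-gpt-neox-125M and generate Korean text.
# Assumes transformers >= 4.28.1 and torch are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "cateto/korean-gpt-neox-125M"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the card's Torch Data Type
)

prompt = "안녕하세요"  # "Hello" in Korean; ko is the only supported language
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,  # stays well inside the 2048-token context window
    do_sample=True,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At float16 precision the 125M-parameter weights fit in roughly the 0.4 GB of VRAM listed above, so the model can also run comfortably on CPU.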
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Pythia 125M Storywriter | 2K / 0.6 GB | 122 | 0 |
| ... 125M Response Full Static Sft | 2K / 0.7 GB | 152 | 1 |
| Pythia 125M Static Sft | 2K / 0.7 GB | 7 | 1 |
| Openchatgpt Neox 125M | 2K / 0.7 GB | 154 | 4 |
| Taco | 2K / 0.7 GB | 9 | 1 |