| Field | Value |
|---|---|
| LLM Name | Finetuned Gptneo Base Tinystories Ta V3 |
| Repository 🤗 | https://huggingface.co/tniranjan/finetuned_gptneo-base-tinystories-ta_v3 |
| Base Model(s) | |
| Model Size | 153M |
| Required VRAM | 0.6 GB |
| Updated | 2025-03-21 |
| Maintainer | tniranjan |
| Model Type | gpt_neo |
| Model Files | |
| Model Architecture | GPTNeoForCausalLM |
| Context Length | 1024 |
| Model Max Length | 1024 |
| Transformers Version | 4.49.0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | </s> |
| Vocabulary Size | 49992 |
| Torch Data Type | float32 |
| Activation Function | gelu_new |
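
Below is a minimal loading sketch based on the details listed above (GPTNeoForCausalLM architecture, float32 weights, 1024-token context, LlamaTokenizer). It assumes the repository is public and loads with the standard `transformers` Auto classes; the Tamil prompt is a placeholder assumption based on the `-ta` suffix in the repository name, not something stated on this card.

```python
# Minimal loading sketch; assumes a public repo compatible with transformers >= 4.49.0.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "tniranjan/finetuned_gptneo-base-tinystories-ta_v3"

# The card lists LlamaTokenizer; AutoTokenizer should resolve it from the repo's
# tokenizer_config.json (sentencepiece may be required as an extra dependency).
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# float32 per the card; at ~153M parameters, roughly 0.6 GB of VRAM is listed as sufficient.
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

# Hypothetical Tamil prompt ("One day ..."), assuming a Tamil TinyStories finetune.
prompt = "ஒரு நாள்"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```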