Field | Value
---|---
LLM Name | Mia Astral 1.32B Conv
Repository 🤗 | https://huggingface.co/S1mp1eXXX/Mia-astral-1.32B-Conv
Model Size | 1b
Required VRAM | 5.3 GB
Updated | 2025-02-22
Maintainer | S1mp1eXXX
Model Type | gpt_neo
Model Architecture | GPTNeoForCausalLM
Context Length | 2048
Model Max Length | 2048
Transformers Version | 4.41.2
Tokenizer Class | GPT2Tokenizer
Padding Token | <|endoftext|>
Vocabulary Size | 50257
Torch Data Type | float32
Activation Function | gelu_new
Tokenizer Errors Handling | replace
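Given the gpt_neo model type, GPTNeoForCausalLM architecture, and GPT2Tokenizer listed above, the checkpoint should load through the standard transformers auto classes. Below is a minimal, untested sketch: the repository ID, dtype, and context length come from the table above, while the prompt and generation settings are purely illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository ID from the table above
repo_id = "S1mp1eXXX/Mia-astral-1.32B-Conv"

# GPT2Tokenizer with <|endoftext|> as the padding token, per the metadata
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# GPTNeoForCausalLM in float32: ~1.32B params x 4 bytes/param ≈ 5.3 GB,
# consistent with the Required VRAM figure above
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

# Context window is 2048 tokens (Context Length / Model Max Length above)
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```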
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
TinyStories 1M | 2K / 0 GB | 41420 | 44 |
Complexity 1B | 2K / 2.6 GB | 12 | 1 |
TinyStories 1M Ds | 2K / GB | 13 | 0 |
TinyStories 1M ONNX | 2K / GB | 37 | 1 |
TinyStories Instruct 1M | 2K / 0 GB | 841 | 3 |
...n Prompt Generator 1M Examples | 2K / 0.6 GB | 68 | 4 |
Skript 1M GPT Neo350m | 2K / 0 GB | 20 | 1 |
Skript 1M GPT Neo125m | 2K / 0.6 GB | 24 | 0 |
Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!