LLM Name | Moist Theia 21B |
Repository 🤗 | https://huggingface.co/mergekit-community/Moist_Theia_21B |
Base Model(s) | |
Merged Model | Yes |
Model Size | 21B |
Required VRAM | 40.8 GB |
Updated | 2024-10-04 |
Maintainer | mergekit-community |
Model Type | mistral |
Model Files | |
Model Architecture | MistralForCausalLM |
Context Length | 1024000 |
Model Max Length | 1024000 |
Transformers Version | 4.44.1 |
Tokenizer Class | PreTrainedTokenizerFast |
Padding Token | <pad> |
Vocabulary Size | 131072 |
Torch Data Type | bfloat16 |
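
For reference, a minimal loading sketch based on the card above (repository ID, MistralForCausalLM architecture, bfloat16 dtype). The `device_map="auto"` setting and the test prompt are illustrative assumptions, not part of the card; note that at bfloat16 the weights alone take roughly 2 bytes per parameter, i.e. about 42 GB for 21B parameters, consistent with the 40.8 GB VRAM requirement listed.

```python
# Minimal loading sketch for Moist_Theia_21B, assuming a GPU setup with
# ~40.8 GB of VRAM available (or CPU offloading via device_map="auto").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mergekit-community/Moist_Theia_21B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # PreTrainedTokenizerFast per the card
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    device_map="auto",           # illustrative: shard/offload across available devices
)

# Illustrative generation call to verify the model loads and runs.
prompt = "Hello"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
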
Best Alternatives | Context / VRAM | Downloads | Likes
---|---|---|---|
Theia 21B V2 | 1000K / 40.8 GB | 248 | 25 |
Theia 21B V1 | 1000K / 40.8 GB | 112 | 27 |
NeMoist 21B V1a | 1000K / 40.8 GB | 73 | 2 |
NeMoria 21B | 1000K / 40.9 GB | 23 | 12 |
Blendy001 | 1000K / 40.8 GB | 12 | 0 |
Theia 21B V1 Pretrained | 1000K / 40.8 GB | 6 | 0 |
NeMoist 21B V0.5 | 1000K / 40.8 GB | 5 | 0 |
Codestral 21B Pruned | 32K / 43.1 GB | 8 | 2 |