| LLM Name | Pantheon RP Pure 1.6.2 22B Small |
|---|---|
| Repository 🤗 | https://huggingface.co/Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small |
| Base Model(s) | |
| Model Size | 22b |
| Required VRAM | 44.7 GB |
| Updated | 2025-02-05 |
| Maintainer | Gryphe |
| Model Type | mistral |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | en |
| Model Architecture | MistralForCausalLM |
| License | other |
| Context Length | 131072 |
| Model Max Length | 131072 |
| Transformers Version | 4.45.0.dev0 |
| Tokenizer Class | LlamaTokenizer |
| Padding Token | [control_748] |
| Vocabulary Size | 32768 |
| Torch Data Type | bfloat16 |
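Based on the configuration above, a minimal loading sketch with the Hugging Face `transformers` library might look like the following. The repository ID, dtype, and architecture come from the table; the prompt and generation settings are illustrative assumptions, not instructions from the model's maintainer.

```python
# Minimal sketch, assuming the standard transformers AutoModel/AutoTokenizer API.
# Repository ID, bfloat16 dtype, and instruction tuning are taken from the table above;
# the prompt and generation parameters are placeholder assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small"

# Tokenizer Class is LlamaTokenizer per the table; AutoTokenizer resolves it from the repo config.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# bfloat16 weights (~44.7 GB VRAM required per the table), MistralForCausalLM architecture.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Instruction-based model: a simple chat-template prompt (template details are an assumption).
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```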
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| MS Schisandra 22B V0.2 | 128K / 44.7 GB | 18 | 8 |
| MS Meadowlark 22B | 128K / 44.7 GB | 119 | 12 |
| ... V4x1.6.2RP Cydonia VXXX 22B 6 | 128K / 44.7 GB | 95 | 2 |
| MS Fujin 2409 22B | 128K / 44.7 GB | 57 | 0 |
| MS Dampf 2409 22B | 128K / 44.7 GB | 56 | 0 |
| MS Moingooistral 2409 22B | 128K / 44.7 GB | 56 | 0 |
| MS A Coolyte 2409 22B | 128K / 44.7 GB | 54 | 0 |
| Beeper King 22B | 128K / 44.7 GB | 31 | 5 |
| MS Quadrosiac 2409 22B | 128K / 44.7 GB | 22 | 0 |
| MS Physician 2409 22B | 128K / 44.7 GB | 15 | 0 |