LLM Name | NemoDori V0.2 12B MN BT |
Repository 🤗 | https://huggingface.co/RozGrov/NemoDori-v0.2-12B-MN-BT |
Base Model(s) | |
Merged Model | Yes |
Model Size | 12B |
Required VRAM | 24.5 GB |
Updated | 2024-10-12 |
Maintainer | RozGrov |
Model Type | mistral |
Instruction-Based | Yes |
Model Files | |
Model Architecture | MistralForCausalLM |
Context Length | 1024000 |
Model Max Length | 1024000 |
Transformers Version | 4.44.0 |
Tokenizer Class | GPT2Tokenizer |
Padding Token | <pad> |
Vocabulary Size | 131072 |
Torch Data Type | float16 |
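For readers who want to try the configuration above, here is a minimal sketch of loading and querying the model with Hugging Face Transformers. Only the repository ID comes from this page; the dtype, device placement, and prompt are illustrative assumptions, not instructions from the maintainer.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo ID taken from the table above; everything else is an assumption.
model_id = "RozGrov/NemoDori-v0.2-12B-MN-BT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed Torch Data Type (~24.5 GB of VRAM)
    device_map="auto",          # requires `accelerate`; places layers on available devices
)

prompt = "Summarize the plot of a short adventure story in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At float16 the full weights need roughly the 24.5 GB of VRAM listed above; smaller cards would require offloading or a quantized variant.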
Best Alternatives | Context / RAM | Downloads | Likes |
---|---|---|---|
...r Nemo 12B Instruct R 21 09 24 | 1000K / 24.5 GB | 23457 | 59 |
SauerkrautLM Nemo 12B Instruct | 1000K / 24.5 GB | 5503 | 20 |
MN 12B Celeste V1.9 | 1000K / 24.5 GB | 846 | 107 |
...ral Nemo 12B ArliAI RPMax V1.1 | 1000K / 24.5 GB | 899 | 38 |
Mistral Nemo Wissenschaft 12B | 1000K / 24.5 GB | 2243 | 3 |
Magnum Instruct DPO 12B | 1000K / 24.5 GB | 446 | 11 |
...tral Nemo Gutenberg Doppel 12B | 1000K / 24.5 GB | 58 | 2 |
ChatWaifu V1.4 | 1000K / 24.5 GB | 75 | 10 |
OmniLing V1 12B Experimental | 1000K / 24.5 GB | 76 | 5 |
Nemo 12B Marlin V8 | 1000K / 24.5 GB | 373 | 2 |