NemoDori V0.2 12B MN BT by RozGrov


Tags: Merged Model, Autotrain compatible, Base model: crestf411/nemo-sunf..., Base model: rozgrov/nemodori-v0..., Base model: unsloth/mistral-nem..., Conversational, Crestf411/nemo-sunfall-v0.6.1, Endpoints compatible, Instruct, Mistral, Region: us, Rozgrov/nemodori-v0.1-12b-ms, Safetensors, Sharded, Tensorflow, Unsloth/mistral-nemo-instruct-...

NemoDori V0.2 12B MN BT Benchmarks

nn.n% shows how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
NemoDori V0.2 12B MN BT (RozGrov/NemoDori-v0.2-12B-MN-BT)

NemoDori V0.2 12B MN BT Parameters and Internals

Model Type: text-generation

Use Cases
Areas: text generation, role-play, ERP modeling
Applications: Talemate interactions
Primary Use Cases: conversations, character attribute generation, world-state narration
Limitations: narration may include unintended script; more precise instructions can improve generation quality
Considerations: works best on conversation-based tasks within Talemate.

Additional Notes: experimental build that has integrated successfully with platforms such as Talemate.

Training Details
Methodology: merged using LazyMergekit
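The card states only that the merge was produced with LazyMergekit, which generates a mergekit YAML recipe. As an illustration, such a recipe might look like the sketch below. The model list comes from the card's base models, but the merge method ("ties") and all density/weight values are placeholder assumptions, not RozGrov's published recipe.

```yaml
# Hypothetical mergekit configuration in the style LazyMergekit emits.
# Merge method and parameters are assumed for illustration only.
models:
  - model: unsloth/Mistral-Nemo-Instruct-2407
    # serves as the base model; no parameters needed here
  - model: crestf411/nemo-sunfall-v0.6.1
    parameters:
      density: 0.5
      weight: 0.5
  - model: RozGrov/NemoDori-v0.1-12B-MS
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: unsloth/Mistral-Nemo-Instruct-2407
dtype: float16   # matches the card's Torch Data Type
```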
Input Output
Accepted Modalities: text
Output Format: text
Performance Tips: adjusting Talemate presets may improve results.
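Since the model descends from Mistral Nemo Instruct, it presumably expects the Mistral-style [INST] chat format for text-in/text-out use. A minimal sketch of building a single-turn prompt by hand is below; the exact spacing and BOS handling vary between tokenizer versions, so in practice the model tokenizer's apply_chat_template is the authoritative source, and this helper is only illustrative.

```python
# Sketch of a Mistral-style [INST] prompt builder (assumed format,
# inferred from the Mistral Nemo Instruct base model; verify against
# the tokenizer's chat template before relying on it).
def build_prompt(user_message: str, system: str = "") -> str:
    # System text is conventionally prepended inside the first [INST] block.
    content = f"{system}\n\n{user_message}" if system else user_message
    return f"<s>[INST]{content}[/INST]"

prompt = build_prompt("Describe the current world state.",
                      system="You are the narrator.")
print(prompt)
```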
Release Notes
Version: 0.2
Notes: merged model with enhanced capability for conversation and instruction following.
LLM Name: NemoDori V0.2 12B MN BT
Repository: 🤗 https://huggingface.co/RozGrov/NemoDori-v0.2-12B-MN-BT
Base Model(s): crestf411/nemo-sunfall-v0.6.1, unsloth/Mistral-Nemo-Instruct-2407, RozGrov/NemoDori-v0.1-12B-MS
Merged Model: Yes
Model Size: 12b
Required VRAM: 24.5 GB
Updated: 2024-12-21
Maintainer: RozGrov
Model Type: mistral
Instruction-Based: Yes
Model Files: 4.9 GB: 1-of-5, 4.9 GB: 2-of-5, 4.9 GB: 3-of-5, 4.9 GB: 4-of-5, 4.9 GB: 5-of-5
Model Architecture: MistralForCausalLM
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.0
Tokenizer Class: GPT2Tokenizer
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: float16
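The listed 24.5 GB requirement is consistent with holding float16 weights alone: a rough back-of-the-envelope check, assuming a parameter count of about 12.25B inferred from the five 4.9 GB shards (the exact count is not stated on the card). KV cache and activations add further memory on top, especially at long context.

```python
# Weight-only VRAM estimate: n_params * bytes_per_param, in decimal GB.
# The ~12.25B parameter count is an assumption inferred from the shard
# sizes (5 x 4.9 GB at 2 bytes/param); runtime usage will be higher.
def weight_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights, in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9

print(f"{weight_vram_gb(12.25e9):.1f} GB")  # matches the listed 24.5 GB
```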

Best Alternatives to NemoDori V0.2 12B MN BT

Best Alternatives                        Context / RAM      Downloads / Likes
...r Nemo 12B Instruct R 21 09 24        1000K / 24.5 GB    2128796
SauerkrautLM Nemo 12B Instruct           1000K / 24.5 GB    10688822
Magnum V4 12B                            1000K / 24.5 GB    53437
Mistral Nemo Wissenschaft 12B            1000K / 24.5 GB    59816
ChatWaifu V1.4                           1000K / 24.5 GB    6514
SAINEMO ReMIX                            1000K / 24.5 GB    24412
Mistral Nemo Bophades 12B                1000K / 24.5 GB    1058
...tral Nemo Gutenberg Doppel 12B        1000K / 24.5 GB    364
NekoMix 12B                              1000K / 24.5 GB    1126
...stral Nemo Gutenberg2 12B Test        1000K / 24.5 GB    321


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217