MN 12B Siskin V0.1 by Nohobby

Tags: Merged Model · Autotrain compatible · Conversational · Endpoints compatible · Mistral · Region: us · Safetensors · Sharded · Tensorflow
Base models: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.1 · nbeerbower/Lyra-Gutenberg-mistral-nemo-12B · NeverSleep/Lumimaid-v0.2-12B · v000000/NM-12B-Lyris-dev-3

MN 12B Siskin V0.1 Parameters and Internals

Additional Notes 
A somewhat experimental merge of several Mistral Nemo models trained mostly on human-written data. Usable context is roughly 28k tokens and may stretch further.
Input / Output
Input Format:
[INST] {input} [/INST] {output}
Performance Tips:
Specify the narrative style somewhere in the prompt to avoid unwanted first-person writing.
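A minimal prompt-formatting sketch for the template above, assuming the repository ships a Mistral-style chat template; the third-person style instruction is an illustrative example of the tip, not text from this card:

```python
# Sketch: render the "[INST] {input} [/INST]" template via the tokenizer's
# chat template (assumed to match the format documented above).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Nohobby/MN-12B-Siskin-v0.1")

messages = [
    # Per the performance tip: state the narrative style in the prompt
    # to steer the model away from first-person writing.
    {"role": "user",
     "content": "Write in third person, past tense. Continue the scene: ..."},
]

prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)  # e.g. "<s>[INST] Write in third person, past tense. ... [/INST]"
```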
LLM Name: MN 12B Siskin V0.1
Repository: https://huggingface.co/Nohobby/MN-12B-Siskin-v0.1
Base Model(s): nbeerbower/Lyra-Gutenberg-mistral-nemo-12B, NeverSleep/Lumimaid-v0.2-12B, ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.1, v000000/NM-12B-Lyris-dev-3
Merged Model: Yes
Model Size: 12B
Required VRAM: 24.5 GB
Updated: 2025-02-22
Maintainer: Nohobby
Model Type: mistral
Model Files: 5 safetensors shards of 4.9 GB each (1-of-5 through 5-of-5)
Model Architecture: MistralForCausalLM
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: bfloat16
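
Below is a minimal loading sketch based on the specs above (sharded bfloat16 safetensors, MistralForCausalLM). The device placement and sampling settings are assumptions for illustration, not part of this card:

```python
# Sketch: load the sharded bfloat16 checkpoint listed above and generate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nohobby/MN-12B-Siskin-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # PreTrainedTokenizerFast
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch dtype (~24.5 GB of weights)
    device_map="auto",           # assumption: spread/offload across available devices
)

prompt = "[INST] Write in third person. Describe a quiet harbor at dawn. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Note that while the declared context length is 1024000, the maintainer's notes above only vouch for quality up to roughly 28k tokens.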

Best Alternatives to MN 12B Siskin V0.1

Best Alternatives                   Context / RAM      Downloads   Likes
Captain Eris Violet V0.420 12B      1000K / 24.5 GB    368722      29
...r Nemo 12B Instruct R 21 09 24   1000K / 24.5 GB    87871       11
...s PersonalityEngine V1.1.0 12B   1000K / 24.5 GB    557         32
Mistral Nemo Kartoffel 12B          1000K / 24.5 GB    311         4
MN 12B Mag Mell R1                  1000K / 24.5 GB    6336        115
MN 12B Mimicore GreenSnake          1000K / 24.5 GB    168         2
SauerkrautLM Nemo 12B Instruct      1000K / 24.5 GB    22721       22
Saiga Nemo 12b                      1000K / 24.5 GB    253322      39
MN 12B Mimicore WhiteSnake          1000K / 24.5 GB    88          3
Dolphin 2.9.3 Mistral Nemo 12B      1000K / 24.5 GB    7257        99

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227