MN Maghin 12B by rityak


Model card on Hugging Face 🤗: https://huggingface.co/rityak/MN-Maghin-12B

MN Maghin 12B Benchmarks

[Benchmark chart for MN Maghin 12B (rityak/MN-Maghin-12B) not reproduced here: the source page reports the model's score as a percentage relative to the reference models Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").]

MN Maghin 12B Parameters and Internals

Additional Notes 
This model is a merge of the pre-trained language models listed below, produced with the DARE TIES merge method (TIES: arXiv:2306.01708; DARE: arXiv:2311.03099) using per-model density and weight parameters. An illustrative recipe is sketched after this note.
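For readers unfamiliar with the method, a mergekit-style DARE TIES recipe over the listed base models looks roughly like the sketch below. This is purely illustrative: the choice of elinas/Chronos-Gold-12B-1.0 as the merge base and every density/weight value are placeholders, not the parameters actually used for MN-Maghin-12B.

```python
# Hypothetical mergekit recipe for a DARE TIES merge of the listed base models.
# All density/weight values and the choice of base model are placeholders,
# NOT the settings used to build rityak/MN-Maghin-12B.
from pathlib import Path

config = """\
merge_method: dare_ties
base_model: elinas/Chronos-Gold-12B-1.0          # assumed base, for illustration only
models:
  - model: nbeerbower/Lyra-Gutenberg-mistral-nemo-12B
    parameters:
      density: 0.5   # placeholder
      weight: 0.3    # placeholder
  - model: cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b
    parameters:
      density: 0.5   # placeholder
      weight: 0.3    # placeholder
  - model: anthracite-org/magnum-v2.5-12b-kto
    parameters:
      density: 0.5   # placeholder
      weight: 0.4    # placeholder
dtype: float16
"""

# Write the recipe to disk; it can then be run with mergekit's CLI, e.g.:
#   mergekit-yaml dare_ties.yaml ./merged-model
Path("dare_ties.yaml").write_text(config)
```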
LLM Name: MN Maghin 12B
Repository 🤗: https://huggingface.co/rityak/MN-Maghin-12B
Base Model(s): elinas/Chronos-Gold-12B-1.0, nbeerbower/Lyra-Gutenberg-mistral-nemo-12B, cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b, anthracite-org/magnum-v2.5-12b-kto
Merged Model: Yes
Model Size: 12B
Required VRAM: 24.5 GB
Updated: 2025-02-05
Maintainer: rityak
Model Type: mistral
Model Files: 5 safetensors shards of 4.9 GB each (1-of-5 through 5-of-5)
Model Architecture: MistralForCausalLM
Context Length: 1,024,000
Model Max Length: 1,024,000
Transformers Version: 4.44.1
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131,072
Torch Data Type: float16
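Given the settings above (MistralForCausalLM architecture, float16 weights split across five safetensors shards, roughly 24.5 GB of VRAM required), a minimal loading sketch with Hugging Face transformers is shown below. The prompt and generation settings are arbitrary examples, and the advertised 1,024,000-token context is the config value rather than a practical limit on most hardware.

```python
# Minimal sketch: loading rityak/MN-Maghin-12B with transformers.
# Assumes a GPU (or multi-GPU setup) with enough memory for ~24.5 GB of fp16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rityak/MN-Maghin-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the listed torch data type
    device_map="auto",          # spread shards across available devices
)

prompt = "Write a short scene set in a rain-soaked harbor town."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```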

Best Alternatives to MN Maghin 12B

Best Alternatives                     Context / RAM      Downloads   Likes
...r Nemo 12B Instruct R 21 09 24     1000K / 24.5 GB    8449        106
...s PersonalityEngine V1.1.0 12B     1000K / 24.5 GB    492         29
Captain Eris Violet V0.420 12B        1000K / 24.5 GB    1069        23
Mistral Nemo Kartoffel 12B            1000K / 24.5 GB    183         3
Saiga Nemo 12b                        1000K / 24.5 GB    364813      37
MN 12B Mimicore GreenSnake            1000K / 24.5 GB    83          2
MN 12B Mimicore WhiteSnake            1000K / 24.5 GB    61          3
MN 12B Mag Mell R1                    1000K / 24.5 GB    4246        99
SauerkrautLM Nemo 12B Instruct        1000K / 24.5 GB    19527       22
MN 12B Mimicore Orochi                1000K / 24.5 GB    31          2

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227