Arcanum 12B by Xclbr7

Tags: Merged Model, Autotrain compatible, Endpoints compatible, Mistral, Model-index, Region: us, Safetensors, Sharded, Tensorflow
Model Card on HF 🤗: https://huggingface.co/Xclbr7/Arcanum-12b

Arcanum 12B Benchmarks

[Benchmark chart: Arcanum 12B (Xclbr7/Arcanum-12b) scored as percentages (nn.n%) against the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").]

Arcanum 12B Parameters and Internals

Model Type: Causal Language Model

Use Cases
Primary Use Cases: Conversation with different personas

Additional Notes
Arcanum-12b is a merged large language model combining TheDrummer/Rocinante-12B-v1.1 and MarinaraSpaghetti/NemoMix-Unleashed-12B.

Supported Languages: English (primarily)

Training Details
Methodology: The model was created by merging two existing 12B models using a novel merging technique (a generic merging illustration follows below).
Model Architecture: Transformer-based language model

Responsible AI Considerations
Mitigation Strategies: The model may inherit biases and limitations from its parent models; users should be aware of these potential biases.
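The model card does not spell out the merging technique, so the snippet below is only a generic illustration: a plain linear (weighted-average) merge of the two parent checkpoints in PyTorch. The 50/50 blend ratio, the linear method, and the output directory name are all assumptions, not the author's actual recipe.

```python
import torch
from transformers import AutoModelForCausalLM

# Load the two published parent models in fp16 (needs roughly 50 GB of CPU RAM).
model_a = AutoModelForCausalLM.from_pretrained(
    "TheDrummer/Rocinante-12B-v1.1", torch_dtype=torch.float16)
model_b = AutoModelForCausalLM.from_pretrained(
    "MarinaraSpaghetti/NemoMix-Unleashed-12B", torch_dtype=torch.float16)

alpha = 0.5  # assumed 50/50 blend; the actual Arcanum recipe is unpublished
merged = model_a.state_dict()
for name, tensor_b in model_b.state_dict().items():
    # Both parents share the Mistral-Nemo 12B architecture, so the state
    # dicts align key for key. Average each tensor in float32 for stability.
    blended = (1 - alpha) * merged[name].float() + alpha * tensor_b.float()
    merged[name] = blended.to(torch.float16)

model_a.load_state_dict(merged)
model_a.save_pretrained("arcanum-12b-linear-sketch")  # hypothetical output dir
```

Real-world merges are usually produced with dedicated tooling and more sophisticated schemes (e.g. spherical interpolation), but the averaging loop above captures the core idea: combining two same-architecture checkpoints weight by weight.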
LLM Name: Arcanum 12B
Repository 🤗: https://huggingface.co/Xclbr7/Arcanum-12b
Merged Model: Yes
Model Size: 12B
Required VRAM: 24.5 GB
Updated: 2025-02-05
Maintainer: Xclbr7
Model Type: mistral
Model Files: 5 safetensors shards of 4.9 GB each (1-of-5 through 5-of-5)
Model Architecture: MistralForCausalLM
License: MIT
Context Length: 1,024,000 tokens
Model Max Length: 1,024,000
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 131,072
Torch Data Type: float16
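The VRAM figure lines up with the storage math: roughly 12.2B parameters at 2 bytes each in float16 comes to about 24.5 GB, matching the five 4.9 GB shards. Below is a minimal loading sketch for the stated persona-chat use case using the standard transformers API; the persona and prompt are illustrative, device_map="auto" assumes the accelerate package is installed, and if the model's chat template rejects a system role, fold the persona into the user turn.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Xclbr7/Arcanum-12b"
tokenizer = AutoTokenizer.from_pretrained(repo)  # PreTrainedTokenizerFast, 131,072-token vocab
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,  # matches the published fp16 weights (~24.5 GB)
    device_map="auto",
)

# Persona-style conversation, the model's stated primary use case.
# The persona and prompt below are illustrative, not from the model card.
messages = [
    {"role": "system", "content": "You are a sardonic 19th-century lighthouse keeper."},
    {"role": "user", "content": "Any ships on the horizon tonight?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0, inputs.shape[-1]:], skip_special_tokens=True))
```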

Best Alternatives to Arcanum 12B

Best Alternatives                    Context / RAM      Downloads  Likes
...r Nemo 12B Instruct R 21 09 24    1000K / 24.5 GB    8449       106
...s PersonalityEngine V1.1.0 12B    1000K / 24.5 GB    492        29
Captain Eris Violet V0.420 12B       1000K / 24.5 GB    1069       23
Mistral Nemo Kartoffel 12B           1000K / 24.5 GB    183        3
Saiga Nemo 12b                       1000K / 24.5 GB    364813     37
MN 12B Mimicore GreenSnake           1000K / 24.5 GB    83         2
MN 12B Mimicore WhiteSnake           1000K / 24.5 GB    61         3
MN 12B Mag Mell R1                   1000K / 24.5 GB    4246       99
SauerkrautLM Nemo 12B Instruct       1000K / 24.5 GB    19527      22
MN 12B Mimicore Orochi               1000K / 24.5 GB    31         2
Note: a green score (e.g. "73.2") means that model is better than Xclbr7/Arcanum-12b.

Rank the Arcanum 12B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227