Aurora SCE 12B V2 by yamatazen


Merged Model · Arxiv:2408.07990 · Autotrain compatible · ChatML · Conversational · En · Ja · Endpoints compatible · Instruct · Mistral · Region: us · Safetensors · Sharded · Tensorflow

Aurora SCE 12B V2 Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Aurora SCE 12B V2 (yamatazen/Aurora-SCE-12B-v2)

Aurora SCE 12B V2 Parameters and Internals

LLM Name: Aurora SCE 12B V2
Repository: https://huggingface.co/yamatazen/Aurora-SCE-12B-v2
Base Model(s): inflatebot/MN-12B-Mag-Mell-R1, NeverSleep/Lumimaid-v0.2-12B, Elizezen/Himeyuri-v0.1-12B, PocketDoc/Dans-PersonalityEngine-V1.1.0-12b, nbeerbower/Mistral-Nemo-12B-abliterated-LORA, cyberagent/Mistral-Nemo-Japanese-Instruct-2408, nbeerbower/mistral-nemo-gutenberg-12B-v4
Merged Model: Yes
Model Size: 12b
Required VRAM: 24.5 GB
Updated: 2025-03-14
Maintainer: yamatazen
Model Type: mistral
Instruction-Based: Yes
Model Files: 5 safetensors shards of 4.9 GB each (1-of-5 through 5-of-5)
Supported Languages: en, ja
Model Architecture: MistralForCausalLM
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.48.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: bfloat16
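
Given the metadata above (MistralForCausalLM architecture, bfloat16 weights sharded across five safetensors files, ChatML tag, en/ja support), the model should load with a recent Hugging Face Transformers release. The snippet below is a minimal, illustrative sketch rather than an official usage example from the model card: the system prompt, sampling settings, and device_map="auto" hardware strategy are assumptions.

```python
# Minimal loading/inference sketch for yamatazen/Aurora-SCE-12B-v2.
# Assumes transformers >= 4.48, a bf16-capable setup, and enough memory for
# the ~24.5 GB of bfloat16 weights (device_map="auto" shards or offloads as
# needed). Prompt content and generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yamatazen/Aurora-SCE-12B-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the published bfloat16 checkpoint
    device_map="auto",
)

# The "Chatml" tag suggests the tokenizer's chat template emits
# <|im_start|>/<|im_end|> markers; apply_chat_template handles that.
messages = [
    {"role": "system", "content": "You are a helpful bilingual (English/Japanese) assistant."},
    {"role": "user", "content": "Summarize the folktale of Momotaro in two sentences."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output = model.generate(
        input_ids,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
    )
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that although the config advertises a 1024000-token maximum context, usable context in practice is bounded by available memory; very long prompts typically require quantization or other memory optimizations.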

Best Alternatives to Aurora SCE 12B V2

Best Alternatives | Context / RAM | Downloads | Likes
...r Nemo 12B Instruct R 21 09 24 | 1000K / 24.5 GB | 69071 | 14
SauerkrautLM Nemo 12B Instruct | 1000K / 24.5 GB | 5027 | 22
MN Slush | 1000K / 24.5 GB | 281 | 22
Mistral Nemo Wissenschaft 12B | 1000K / 24.5 GB | 2212 | 8
Magnum V4 12B | 1000K / 24.5 GB | 490 | 39
ChatWaifu V1.4 | 1000K / 24.5 GB | 68 | 19
Mistral Nemo Bophades 12B | 1000K / 24.5 GB | 245 | 9
ChatWaifu 12B V2.0 | 1000K / 24.5 GB | 60 | 20
EtherealAurora 12B V2 | 1000K / 24.5 GB | 24 | 1
MN Anathema 12B | 1000K / 24.5 GB | 20 | 1

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227