Crimson Dawn V0.2 by Epiculous



Crimson Dawn V0.2 Parameters and Internals

Model Type 
text generation
Additional Notes 
Training was run in two phases, each for 2 epochs, on 2x NVIDIA A6000 GPUs using LoRA.
Supported Languages 
en (proficient), fr (proficient), de (proficient), es (proficient), it (proficient), pt (proficient), ru (proficient), zh (proficient), ja (proficient)
Training Details 
Data Sources:
Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned, anthracite-org/stheno-filtered-v1.1, PJMixers/hieunguyenminh_roleplay-deduped-ShareGPT, Gryphe/Sonnet3.5-Charcard-Roleplay, Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned, anthracite-org/kalo-opus-instruct-22k-no-refusal, anthracite-org/nopm_claude_writing_fixed, anthracite-org/kalo_opus_misc_240827
Methodology:
RSLoRA
Hardware Used:
2x NVIDIA A6000 GPUs
Model Architecture:
LoRA
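The RSLoRA methodology listed above differs from plain LoRA only in how the low-rank update is scaled: alpha/sqrt(r) instead of alpha/r, which keeps the update magnitude stable as the rank grows. A minimal sketch of the two scaling rules (the function name is illustrative, not part of any library):

```python
import math

def lora_scale(alpha: float, r: int, rslora: bool = False) -> float:
    """Scaling factor applied to the low-rank update B @ A.

    Plain LoRA scales by alpha / r; rank-stabilized LoRA (RSLoRA)
    scales by alpha / sqrt(r), which keeps gradient magnitudes
    stable when training at high rank.
    """
    return alpha / math.sqrt(r) if rslora else alpha / r
```

At alpha=16 and r=64, for example, the plain-LoRA factor is 0.25 while the RSLoRA factor is 2.0, illustrating why plain LoRA's effective learning rate collapses at high rank.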
Input Output 
Input Format:
ChatML
Accepted Modalities:
text
Output Format:
text
Performance Tips:
Use the ChatML context and instruct templates.
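ChatML wraps each conversation turn in `<|im_start|>{role}` / `<|im_end|>` markers and ends the prompt with an open assistant turn for the model to complete. A minimal sketch of formatting a conversation this way (the helper name is illustrative; in practice the tokenizer's built-in chat template does this for you):

```python
# Sketch of the ChatML prompt format this model expects.
# Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers,
# and the prompt ends with an open assistant turn to complete.
def to_chatml(messages):
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    return prompt + "<|im_start|>assistant\n"

example = to_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```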
LLM Name: Crimson Dawn V0.2
Repository: 🤗 https://huggingface.co/Epiculous/Crimson_Dawn-v0.2
Model Size: 12.2b
Required VRAM: 24.5 GB
Updated: 2025-02-05
Maintainer: Epiculous
Model Type: mistral
Instruction-Based: Yes
Model Files: 4.9 GB: 1-of-5, 4.9 GB: 2-of-5, 4.9 GB: 3-of-5, 4.9 GB: 4-of-5, 4.9 GB: 5-of-5
Supported Languages: en, fr, de, es, it, pt, ru, zh, ja
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: bfloat16
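The 24.5 GB VRAM figure lines up with what the weights alone occupy: 12.2B parameters at 2 bytes each in bfloat16. A quick back-of-the-envelope check:

```python
params = 12.2e9          # model size: 12.2B parameters
bytes_per_param = 2      # bfloat16 stores each parameter in 2 bytes
weight_gb = params * bytes_per_param / 1e9
# ≈ 24.4 GB for the weights alone; the KV cache and activations add
# more on top, especially at long context lengths.
```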

Best Alternatives to Crimson Dawn V0.2

Best Alternatives                    Context / RAM      Downloads/Likes
Violet Twilight V0.2                 1000K / 24.5 GB    35868323
...ish Mistral Nemo Instruct 2407    1000K / 24.5 GB    772
Educa Ai Nemo Sft                    1000K / 49.3 GB    7693
Mistral Nemo Kurdish                 1000K / 24.5 GB    273
...al Nemo Japanese Instruct 2408    1000K / 24.5 GB    428733
Azure Dusk V0.2                      1000K / 24.5 GB    367
10 14dpo                             1000K / 49.3 GB    7601
...ike Mistral Nemo Instruct 2407    1000K / 24.5 GB    2439
ChatML Nemo Pro                      1000K / 24.5 GB    112
...l Nemo Abliterated Nemo Pro V2    1000K / 24.5 GB    150

Rank the Crimson Dawn V0.2 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227