Sakura 13B Galgame by sakuraumi


Tags: autotrain-compatible · baichuan · custom code · endpoints-compatible · ja · zh · pytorch · region: us · sharded

Sakura 13B Galgame Benchmarks

Sakura 13B Galgame (sakuraumi/Sakura-13B-Galgame)

Sakura 13B Galgame Parameters and Internals

Model Type 
text generation, translation
Use Cases 
Areas:
Research, Educational
Applications:
Translation of Galgames and light novels
Primary Use Cases:
Galgame and light novel translations
Limitations:
Not suitable for commercial use as stated in the CC BY-NC-SA 4.0 license
Considerations:
Ensure proper attribution when using translations publicly.
Additional Notes 
Model supports API backend compatible with OpenAI format. Ongoing updates to improve translation quality. Issues, feedback, and improvements are actively monitored.
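The card notes that the model can be served behind an API backend compatible with the OpenAI format. A minimal sketch of what a chat-completion request payload for such a backend might look like; the endpoint URL, generation parameters, and instruction wording are illustrative assumptions, not values taken from this card:

```python
import json

# Hypothetical URL of a locally hosted, OpenAI-format backend (assumption).
API_URL = "http://localhost:8000/v1/chat/completions"

def build_translation_request(japanese_text: str) -> dict:
    """Build an OpenAI-format chat-completion payload asking the model
    to translate Japanese text into simplified Chinese."""
    return {
        "model": "sakuraumi/Sakura-13B-Galgame",
        "messages": [
            {
                "role": "user",
                # Instruction wording is an assumption for illustration:
                # "Translate the following Japanese text into Chinese:"
                "content": f"将下面的日文文本翻译成中文：{japanese_text}",
            }
        ],
        "temperature": 0.1,
        "max_tokens": 512,
    }

payload = build_translation_request("こんにちは、先輩。")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The payload could then be POSTed to `API_URL` with any HTTP client; a low temperature is a common choice for translation, where deterministic output is usually preferred.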
Supported Languages 
Chinese (Advanced), Japanese (Advanced)
Training Details 
Data Sources:
Qwen model series, Qwen1.5 model series
Methodology:
Finetuning
Input Output 
Input Format:
Prompts formatted for text translation
Accepted Modalities:
text
Output Format:
Translated text in simplified Chinese
Performance Tips:
Pin a specific model version to obtain consistent translation quality across runs.
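Since the model accepts text prompts and has a 4096-token context limit, long Galgame scripts or novel chapters need to be split into translation-sized pieces before prompting. A sketch of one way to do this, splitting on line boundaries (the character cap is an illustrative assumption, not a documented limit):

```python
def chunk_lines(text: str, max_chars: int = 512) -> list[str]:
    """Split source text into translation-sized chunks on line
    boundaries, so each prompt stays well inside the context limit."""
    chunks, current = [], ""
    for line in text.splitlines():
        # Start a new chunk if appending this line would exceed the cap.
        if current and len(current) + len(line) + 1 > max_chars:
            chunks.append(current)
            current = line
        else:
            current = f"{current}\n{line}" if current else line
    if current:
        chunks.append(current)
    return chunks

script = "line one\nline two\nline three"
print(chunk_lines(script, max_chars=10))  # → ['line one', 'line two', 'line three']
```

Splitting on line boundaries rather than a fixed character offset keeps dialogue lines intact, which matters for Galgame scripts where each line is typically a self-contained utterance.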
Release Notes 
Version:
v0.9
Date:
2024-02-13
Notes:
More stable output and higher translation quality than earlier beta versions; pre-trained on larger datasets.
LLM Name: Sakura 13B Galgame
Repository: https://huggingface.co/sakuraumi/Sakura-13B-Galgame
Model Size: 13b
Required VRAM: 27.8 GB
Updated: 2024-12-08
Maintainer: sakuraumi
Model Type: baichuan
Model Files: 10.0 GB (1-of-3), 9.9 GB (2-of-3), 7.9 GB (3-of-3)
Supported Languages: zh, ja
Model Architecture: BaichuanForCausalLM
License: apache-2.0
Model Max Length: 4096
Transformers Version: 4.33.2
Tokenizer Class: BaichuanTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 125696
Torch Data Type: bfloat16
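The Required VRAM figure above is consistent with the sum of the three sharded checkpoint files, and with bfloat16 storage (2 bytes per parameter) it implies a parameter count of roughly 13.9B. A quick sanity check (the parameter count is derived here, not stated on the card):

```python
# Sizes of the three sharded checkpoint files (GB), from the card above.
shards_gb = [10.0, 9.9, 7.9]
total_gb = sum(shards_gb)
print(f"total checkpoint size: {total_gb:.1f} GB")  # → 27.8 GB, matching Required VRAM

# bfloat16 stores each parameter in 2 bytes, so the implied parameter
# count is roughly the total size divided by 2.
params_billion = total_gb / 2
print(f"implied parameters: ~{params_billion:.1f}B")  # → ~13.9B
```

Note that the checkpoint size is only a lower bound on memory at inference time; activations and the KV cache add overhead on top of the weights.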

Best Alternatives to Sakura 13B Galgame

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Tiny Random Baichuan2 13B | 0K / 0.1 GB | 181773 | 0 |
| Baichuan2 13B Chat | 0K / 27.8 GB | 101259 | 425 |
| Baichuan 13B Chat | 0K / 26.5 GB | 3927 | 630 |
| ShieldLM 13B Baichuan2 | 0K / 27.8 GB | 59 | 3 |
| Blossom V3.1 Baichuan2 13B | 0K / 27.8 GB | 13 | 1 |
| HuatuoGPT2 13B | 0K / 29.1 GB | 49 | 5 |
| Baichuan2 13B Base | 0K / 27.8 GB | 990 | 77 |
| Buffer Baichuan2 13B Rag 4bits | 0K / 9.9 GB | 14 | 0 |
| Buffer Baichuan2 13B Rag | 0K / 27.8 GB | 12 | 1 |
| ... Efficient Training Of LLMs V1 | 0K / 29.1 GB | 10 | 1 |
Note: a green Score (e.g. "73.2") means the model outperforms sakuraumi/Sakura-13B-Galgame.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124