Linkbricks Horizon AI Korean Superb 27B by Saxo


Linkbricks Horizon AI Korean Superb 27B Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Linkbricks Horizon AI Korean Superb 27B (Saxo/Linkbricks-Horizon-AI-Korean-Superb-27B)

Linkbricks Horizon AI Korean Superb 27B Parameters and Internals

LLM Name: Linkbricks Horizon AI Korean Superb 27B
Repository: https://huggingface.co/Saxo/Linkbricks-Horizon-AI-Korean-Superb-27B
Base Model(s): Gemma 2 27B It (google/gemma-2-27b-it)
Model Size: 27B
Required VRAM: 54.8 GB
Updated: 2024-12-22
Maintainer: Saxo
Model Type: gemma2
Model Files: 5.0 GB (1-of-12), 4.9 GB (2-of-12), 5.0 GB (3-of-12), 4.9 GB (4-of-12), 4.9 GB (5-of-12), 5.0 GB (6-of-12), 4.9 GB (7-of-12), 4.9 GB (8-of-12), 5.0 GB (9-of-12), 4.9 GB (10-of-12), 4.9 GB (11-of-12), 0.5 GB (12-of-12)
Supported Languages: ko, en, jp, cn
Model Architecture: Gemma2ForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.46.1
Tokenizer Class: GemmaTokenizer
Padding Token: <pad>
Vocabulary Size: 256000
Torch Data Type: bfloat16
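The 54.8 GB VRAM figure is consistent with the shard sizes listed above and with the bfloat16 storage cost of two bytes per parameter. A minimal sanity-check sketch (the ~27.2B parameter count for Gemma 2 27B is an approximate figure, not taken from this card):

```python
# Sanity-check the listed VRAM requirement against the shard sizes
# and the bfloat16 storage cost (2 bytes per parameter).

# Shard sizes in GB, copied from the model-files list above.
shards = [5.0, 4.9, 5.0, 4.9, 4.9, 5.0, 4.9, 4.9, 5.0, 4.9, 4.9, 0.5]

total_gb = sum(shards)
print(f"total shard size: {total_gb:.1f} GB")  # → 54.8 GB

# Gemma 2 27B has roughly 27.2e9 parameters (approximate figure).
params = 27.2e9
bytes_per_param = 2  # bfloat16
weights_gb = params * bytes_per_param / 1e9
print(f"approx weight size: {weights_gb:.1f} GB")  # → 54.4 GB
```

Quantized variants (8-bit or 4-bit) would shrink this footprint roughly proportionally, at some cost in quality.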

Best Alternatives to Linkbricks Horizon AI Korean Superb 27B

Best Alternatives        Context / RAM     Downloads   Likes
SystemGemma2 27B It      32K / 54.7 GB     108         2
SystemGemma2 27B It      32K / 54.7 GB     96          1
Gemma 2 27B It           8K / 54.7 GB      194,695     471
Gemma 2 27B              8K / 108.3 GB     641,304     191
Gemma2Crono 27B          8K / 54.8 GB      80          0
GemmaStock1 27B          8K / 54.8 GB      31          0
Gemma2Slerp3 27B         8K / 54.8 GB      31          0
Gemma2Slerp4 27B         8K / 54.8 GB      27          0
Magnum V3 27B Kto        8K / 108.2 GB     2,626       11
Gemma2atlas 27B          8K / 54.8 GB      33          0
Note: green Score (e.g. "73.2") means that the model is better than Saxo/Linkbricks-Horizon-AI-Korean-Superb-27B.

Rank the Linkbricks Horizon AI Korean Superb 27B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217