Polyglot Ko 12.8B Instruct by etri-xainlp

Tags: autotrain-compatible, endpoints-compatible, gpt_neox, instruct, ko, pytorch, region:us

Polyglot Ko 12.8B Instruct Benchmarks

Polyglot Ko 12.8B Instruct (etri-xainlp/polyglot-ko-12.8b-instruct)

Polyglot Ko 12.8B Instruct Parameters and Internals

Model Type: instruction-based, text generation
Additional Notes: trained with the Adam optimizer and a linear learning-rate scheduler.
Supported Languages: Korean (high proficiency)
Training Details:
  Data Sources: instruction-following dataset (260k examples)
  Methodology: fine-tuning
  Hardware Used: multi-GPU (A100 80 GB)
Input Output:
  Input Format: text prompts
  Accepted Modalities: text
  Output Format: generated text
  Performance Tips: use a distributed multi-GPU setup for optimal performance.
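The text-prompt-in / generated-text-out flow above can be sketched with the Hugging Face transformers API. The repository id comes from this page; the Korean prompt, generation settings, and `device_map="auto"` multi-GPU placement are illustrative assumptions, not an official example from the maintainer.

```python
# Hypothetical usage sketch for etri-xainlp/polyglot-ko-12.8b-instruct.
# Assumes GPU memory sufficient for ~26 GB of float16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "etri-xainlp/polyglot-ko-12.8b-instruct"

def main() -> None:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,  # weights are stored in float16 (see table below)
        device_map="auto",          # spread layers across available GPUs
    )
    # Illustrative Korean prompt: "What is the capital of Korea?"
    prompt = "한국의 수도는 어디인가요?"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

if __name__ == "__main__":
    main()
```

Note that prompts longer than the model's 2048-token context length must be truncated before generation.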
LLM Name: Polyglot Ko 12.8B Instruct
Repository: 🤗 https://huggingface.co/etri-xainlp/polyglot-ko-12.8b-instruct
Model Size: 12.8b
Required VRAM: 0.2 GB
Updated: 2025-02-05
Maintainer: etri-xainlp
Model Type: gpt_neox
Instruction-Based: Yes
Model Files: 26.0 GB, 0.0 GB, then alternating shards of 0.2 GB and 19.3 GB (× 8)
Supported Languages: ko
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.30.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 30003
Torch Data Type: float16
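The float16 data type is consistent with the shard sizes listed above: at 2 bytes per parameter, 12.8B parameters come to roughly 25.6 GB of weights, in line with the ~26 GB of model files. A quick back-of-envelope check (the helper function name is ours, for illustration):

```python
# Back-of-envelope weight-memory estimate for a 12.8B-parameter model
# stored in float16 (2 bytes per parameter). Illustrative only; excludes
# activations, KV cache, and optimizer state.

def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate weight size in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

fp16_gb = weight_memory_gb(12.8e9, 2)   # float16
print(f"fp16 weights: ~{fp16_gb:.1f} GB")  # ~25.6 GB, matching the shard total
```

The same arithmetic explains why the page recommends a multi-GPU setup: a single A100 80G fits the weights, but smaller cards do not.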

Best Alternatives to Polyglot Ko 12.8B Instruct

Best Alternatives                    Context / RAM    Downloads  Likes
Gollm 12.8B Instruct V2.3            2K / 25.9 GB     6273       0
Polyglot Ko 12.8B Instruct           2K / 25.9 GB     3187       3
Gollm 12.8B Instruct V2.0            2K / 25.9 GB     2286       0
...lm 12.8B Instruct Tendency T45    2K / 25.9 GB     2297       0
Gollm 12.8B Instruct V2.1            2K / 25.9 GB     69         0
...t Ko 12.8B Chang Instruct Chat    2K / 25.9 GB     2310       14
Note: green Score (e.g. "73.2") means that the model is better than etri-xainlp/polyglot-ko-12.8b-instruct.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227