Gollm 12.8B Instruct V2.0 by tlphams

» All LLMs » tlphams » Gollm 12.8B Instruct V2.0

Tags: Autotrain compatible · Base model: eleutherai/polyglot... · Base model (finetune): eleutherai... · Endpoints compatible · Generated from trainer · GPT-NeoX · Instruct · PyTorch · Region: us · Sharded

Gollm 12.8B Instruct V2.0 Benchmarks

Scores shown as nn.n% indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Gollm 12.8B Instruct V2.0 (tlphams/gollm-12.8b-instruct-v2.0)

Gollm 12.8B Instruct V2.0 Parameters and Internals

Training Details 
Data Sources:
- self-introduction (20 samples)
- Combined KoAlpaca and KULLM, no-context samples only (145.8k samples)
- KoAlpaca v1.0
- KoAlpaca v1.1
- KULLM (Dolly and Vicuna subsets only)
- Naver news summarization (22.2k samples)
- KLUE MRC (17.5k samples)
- KLUE STS (5.6k samples)
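Taken together, the entries with stated counts sum to roughly 191k instruction examples. A small tally sketch, assuming the KoAlpaca v1.0/v1.1 and KULLM subsets without explicit counts are already covered by the 145.8k combined figure:

```python
# Rough tally of the instruction-tuning mix above. Only entries with an
# explicit sample count are included; KoAlpaca v1.0/v1.1 and the KULLM
# subsets are assumed to be inside the 145.8k combined figure.
sources = {
    "self-introduction": 20,
    "KoAlpaca + KULLM (no-context only)": 145_800,
    "Naver news summarization": 22_200,
    "KLUE MRC": 17_500,
    "KLUE STS": 5_600,
}
total = sum(sources.values())
print(f"{total:,} samples")  # 191,120 samples
```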
LLM Name: Gollm 12.8B Instruct V2.0
Repository: 🤗 https://huggingface.co/tlphams/gollm-12.8b-instruct-v2.0
Base Model(s): EleutherAI/polyglot-ko-12.8b
Model Size: 12.8b
Required VRAM: 25.9 GB
Updated: 2025-02-05
Maintainer: tlphams
Model Type: gpt_neox
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-3), 9.9 GB (2-of-3), 6.0 GB (3-of-3)
Model Architecture: GPTNeoXForCausalLM
License: cc-by-nc-sa-4.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.32.0.dev0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 30080
Torch Data Type: float16
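The listed shard sizes and VRAM figure are mutually consistent: 12.8B parameters at 2 bytes each (float16) come to about 25.6 GB of weights, close to the 25.9 GB across the three shards. A back-of-the-envelope sketch, counting weights only and ignoring activation and KV-cache overhead:

```python
# Sanity-check the "Required VRAM" figure from the parameter count and dtype.
n_params = 12.8e9        # from the model size field (12.8b)
bytes_per_param = 2      # float16, per the Torch Data Type field
weights_gb = n_params * bytes_per_param / 1e9
shards_gb = 10.0 + 9.9 + 6.0   # the three sharded model files
print(f"weights ~{weights_gb:.1f} GB, shards {shards_gb:.1f} GB")
```

In practice, inference needs somewhat more than the weight footprint, which is why the required VRAM (25.9 GB) slightly exceeds the raw float16 estimate.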

Best Alternatives to Gollm 12.8B Instruct V2.0

Best Alternatives                         Context / RAM    Downloads / Likes
Gollm 12.8B Instruct V2.3                 2K / 25.9 GB     62730
Polyglot Ko 12.8B Instruct                2K / 25.9 GB     31873
Polyglot Ko 12.8B Instruct                2K / 0.2 GB      23012
...lm 12.8B Instruct Tendency T45         2K / 25.9 GB     22970
Gollm 12.8B Instruct V2.1                 2K / 25.9 GB     690
...t Ko 12.8B Chang Instruct Chat         2K / 25.9 GB     231014
Note: a green score (e.g. "73.2") means that the model is better than tlphams/gollm-12.8b-instruct-v2.0.

Rank the Gollm 12.8B Instruct V2.0 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227