EEVE Korean Instruct 2.8B V1.0 by yanolja

arXiv: 2306.02707, 2310.01377, 2402.14714
Tags: Autotrain compatible, Base model (finetune): yanolja/ee..., Base model: yanolja/eeve-korean..., Conversational, Custom code, Endpoints compatible, Generated from trainer, Instruct, Phi, Region: us, Safetensors, Sharded, Tensorflow

EEVE Korean Instruct 2.8B V1.0 (yanolja/EEVE-Korean-Instruct-2.8B-v1.0)

EEVE Korean Instruct 2.8B V1.0 Parameters and Internals

Model Type: text generation
Supported Languages: Korean (high proficiency)
Training Details:
Data Sources: Korean-translated version of Open-Orca/SlimOrca-Dedup; Korean-translated version of argilla/ultrafeedback-binarized-preferences-cleaned
Methodology: Direct Preference Optimization (DPO)
Training Framework: Axolotl
Model Architecture: a fine-tuned version of yanolja/EEVE-Korean-2.8B-v1.0 with an extended Korean vocabulary
Input / Output:
Input Format: chat prompt that opens with "A chat between a curious user and an AI assistant", followed by alternating Human/Assistant turns (see the sketch below)
Accepted Modalities: text
Output Format: text (assistant response)
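For reference, a minimal inference sketch in Python with Hugging Face transformers that follows the chat format described above. The exact system sentence, the example question, and the generation settings are illustrative assumptions, not values taken from this listing.

# Minimal inference sketch (assumes transformers >= 4.38, accelerate, and ~6 GB of GPU memory).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yanolja/EEVE-Korean-Instruct-2.8B-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the bfloat16 weights listed below
    device_map="auto",            # requires accelerate
    trust_remote_code=True,       # the repository is tagged "Custom code"
)

# Chat prompt; the system sentence below is an assumption based on the format described above.
prompt = (
    "A chat between a curious user and an AI assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions.\n"
    "Human: 한국의 수도는 어디인가요?\n"   # "What is the capital of Korea?"
    "Assistant:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))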
LLM Name: EEVE Korean Instruct 2.8B V1.0
Repository: https://huggingface.co/yanolja/EEVE-Korean-Instruct-2.8B-v1.0
Base Model(s): yanolja/EEVE-Korean-2.8B-v1.0
Model Size: 2.8b
Required VRAM: 5.7 GB
Updated: 2024-12-21
Maintainer: yanolja
Model Type: phi
Instruction-Based: Yes
Model Files: 5.0 GB (shard 1 of 2), 0.7 GB (shard 2 of 2)
Model Architecture: PhiForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.38.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 58944
Torch Data Type: bfloat16
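The listed tokenizer and configuration values can be cross-checked without downloading the weights. A short Python sketch, assuming the repository metadata on the Hub still matches this listing:

# Fetches only config.json and the tokenizer files, not the ~5.7 GB of weights.
from transformers import AutoConfig, AutoTokenizer

model_id = "yanolja/EEVE-Korean-Instruct-2.8B-v1.0"

config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)            # expected per the listing: ['PhiForCausalLM']
print(config.max_position_embeddings)  # expected: 2048
print(config.vocab_size)               # expected: 58944
print(config.torch_dtype)              # expected: torch.bfloat16
print(type(tokenizer).__name__)        # expected: LlamaTokenizer (or its fast variant)
print(tokenizer.pad_token)             # expected: </s>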

Best Alternatives to EEVE Korean Instruct 2.8B V1.0

Best Alternatives | Context / RAM | Downloads | Likes
Phi 2 Instruct V0.1 | 2K / 5.6 GB | 350 | 2
Phi 2 Instruct Apo | 2K / 5.6 GB | 33 | 0
Att Model | 2K / 5.7 GB | 9 | 0
Eeve2.8 Base | 2K / 5.7 GB | 5 | 0
Eeve2.8 Ko | 2K / 5.7 GB | 18 | 0
... Instruct 2.8B V1.0 20240430 2 | 2K / 2.9 GB | 12 | 0
Phi 2 Code Instruct | 2K / 5.6 GB | 20 | 4
Dolphin 2 6 Phi 2 | 0K / 5.6 GB | 658 | 192
Phi 2 Evol Instruct Chinese | 0K / 5.6 GB | 0 | 4
Phi 2 Dolly Instruction Polish | 0K / 5.6 GB | 0 | 2

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217