Polyglot Ko Empathy Chat 5.8B by j5ng


Tags: Autotrain compatible, Chat, Endpoints compatible, GPT-NeoX, Korean (ko), Region: US, Safetensors, Sharded, TensorFlow


Polyglot Ko Empathy Chat 5.8B Parameters and Internals

Model Type 
chatbot, empathy, text generation
Use Cases 
Areas:
research, chat applications
Applications:
empathy-driven interactions, relationship counseling
Primary Use Cases:
providing virtual emotional support, enhanced understanding in relationships
Limitations:
not suitable for professional psychological counseling; may not handle relationship scenarios outside its training data
Considerations:
usage primarily intended for entertainment and casual conversation
Additional Notes 
The model is trained specifically on dialogue data tailored to communication in romantic relationships, strengthening its empathy-driven responses.
Supported Languages 
Korean (high proficiency)
Training Details 
Data Sources:
AI Hub empathetic dialogue dataset, romantic partner (couple) conversation data
Methodology:
QLoRA fine-tuning (see the sketch after this section)
Hardware Used:
GPU with at least 8 GB of VRAM
Model Architecture:
EleutherAI/polyglot-ko-5.8b
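
The maintainer's training script is not published on this page; the following is a minimal QLoRA fine-tuning sketch, assuming the Hugging Face peft + bitsandbytes stack and the listed EleutherAI/polyglot-ko-5.8b base. The LoRA rank, target modules, and dataset handling are illustrative assumptions, not the actual configuration.

```python
# Minimal QLoRA fine-tuning sketch (assumed setup; the actual training script is not published).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

BASE = "EleutherAI/polyglot-ko-5.8b"  # base model listed under Model Architecture

# 4-bit NF4 quantization is what keeps a 5.8B base trainable on roughly 8 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(
    BASE, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters on the fused QKV projection used by GPT-NeoX; rank/alpha values are assumptions.
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# From here, train with transformers.Trainer (or trl's SFTTrainer) on the
# AI Hub empathetic-dialogue / couple-conversation data referenced above.
```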
Input Output 
Input Format:
free-form Korean text from users in a casual conversational style
Accepted Modalities:
text
Output Format:
text-based empathic responses (see the generation sketch below)
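
The exact prompt template is not documented here, so the sketch below simply feeds a single casual Korean turn and decodes the continuation; treat the prompt formatting and sampling parameters as assumptions.

```python
# Minimal generation sketch; prompt formatting and sampling settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "j5ng/polyglot-ko-empathy-chat-5.8b"

tokenizer = AutoTokenizer.from_pretrained(REPO)
model = AutoModelForCausalLM.from_pretrained(
    REPO, torch_dtype=torch.float16, device_map="auto"
)

# A single casual user turn: "I had a fight with my partner today and I'm really upset."
prompt = "였늘 μ—°μΈμ΄λž‘ λ‹€νˆ¬μ–΄μ„œ λ„ˆλ¬΄ μ†μƒν•΄."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # the card lists <|endoftext|> as the padding token
)
# Strip the prompt tokens and print only the model's empathic reply.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```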
Release Notes 
Version:
1.0
Date:
unknown
Notes:
Initial release of the chatbot model for empathic communication in relationships.
LLM Name: Polyglot Ko Empathy Chat 5.8B
Repository (πŸ€—): https://huggingface.co/j5ng/polyglot-ko-empathy-chat-5.8b
Model Size: 5.8b
Required VRAM: 11.9 GB
Updated: 2025-03-20
Maintainer: j5ng
Model Type: gpt_neox
Model Files: 5.0 GB (1-of-3), 5.0 GB (2-of-3), 1.9 GB (3-of-3)
Supported Languages: ko
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.36.0.dev0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 30080
Torch Data Type: float16
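
The configuration values listed above can be checked against the repository metadata without downloading the sharded 11.9 GB of weights; a small sketch using the standard transformers config/tokenizer loaders:

```python
# Check the card's configuration values against the repo metadata (no weight download needed).
from transformers import AutoConfig, AutoTokenizer

REPO = "j5ng/polyglot-ko-empathy-chat-5.8b"

config = AutoConfig.from_pretrained(REPO)
tokenizer = AutoTokenizer.from_pretrained(REPO)

print(config.model_type)               # expected: gpt_neox
print(config.max_position_embeddings)  # expected: 2048 (context length / model max length)
print(config.vocab_size)               # expected: 30080
print(config.torch_dtype)              # expected: torch.float16
print(type(tokenizer).__name__)        # expected: PreTrainedTokenizerFast
print(tokenizer.pad_token)             # expected: <|endoftext|>
```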

Best Alternatives to Polyglot Ko Empathy Chat 5.8B

Best Alternatives | Context / RAM | Downloads | Likes
KoAlpaca Polyglot 5.8B | 2K / 11.7 GB | 46262 | 61
Ai Script2 | 2K / 3.8 GB | 6 | 0
Ai Script | 2K / 3.8 GB | 6 | 0
KIT 5.8B | 2K / 23.4 GB | 6 | 3
KIT 5.8B | 2K / 23.5 GB | 2070 | 0
Polyglot Ko 5.8B Inst | 2K / 23.6 GB | 2055 | 0
ChatSKKU5.8B | 2K / 11.3 GB | 2093 | 0
KoQuality Polyglot 5.8B | 2K / 23.6 GB | 2088 | 3
...olyglot 5.8B V2 Koalpaca V1.1B | 2K / 23.6 GB | 2060 | 0
...olyglot 5.8B V2 Koalpaca V1.1B | 2K / 23.6 GB | 2058 | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227