KoLLaVA KoVicuna 7B by tabtoyou


Tags: autotrain-compatible, clip, dataset:tabtoyou/kollava-cc3m-..., dataset:tabtoyou/kollava-instr..., endpoints-compatible, instruct, ko, koalpaca, kollava, kovicuna, llava, pytorch, region:us, sharded

KoLLaVA KoVicuna 7B Benchmarks

nn.n% — how the model compares to the reference models: Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
KoLLaVA KoVicuna 7B (tabtoyou/KoLLaVA-KoVicuna-7b)

KoLLaVA KoVicuna 7B Parameters and Internals

Model Type: multimodal
Additional Notes: The model combines KoVicuna with a CLIP (ViT-L/14) visual encoder and is trained on Korean visual-instruction datasets.
Supported Languages: ko (Korean)
Training Details:
  Data Sources: tabtoyou/KoLLaVA-Instruct-150k, tabtoyou/KoLLaVA-CC3M-Pretrain-595K
  Hardware Used: Multi-GPU (A100 80G)
Input Output:
  Accepted Modalities: text, image
LLM Name: KoLLaVA KoVicuna 7B
Repository 🤗: https://huggingface.co/tabtoyou/KoLLaVA-KoVicuna-7b
Model Size: 7b
Required VRAM: 27 GB
Updated: 2025-02-05
Maintainer: tabtoyou
Model Type: llava
Instruction-Based: Yes
Model Files: 9.9 GB (1-of-3), 9.9 GB (2-of-3), 7.2 GB (3-of-3)
Supported Languages: ko
Model Architecture: LlavaLlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32003
Torch Data Type: float32
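The listed VRAM requirement follows from the shard sizes and the float32 data type above: a ~7B-parameter model at 4 bytes per parameter needs roughly 28 GB for weights alone, and the three checkpoint shards sum to the same order. A quick sanity check (shard sizes and dtype taken from the table; the 7e9 parameter count is a nominal assumption for a "7b" model):

```python
# Sanity-check the "Required VRAM: 27 GB" figure using the shard sizes
# and the float32 dtype listed in the table above.

shard_sizes_gb = [9.9, 9.9, 7.2]  # 1-of-3, 2-of-3, 3-of-3
total_gb = sum(shard_sizes_gb)
print(f"total checkpoint size: {total_gb:.1f} GB")  # 27.0 GB

# Independent estimate: ~7e9 parameters * 4 bytes per float32 weight.
params = 7e9  # nominal count for a "7b" model (assumption)
bytes_per_param = 4  # torch.float32
est_gb = params * bytes_per_param / 1e9
print(f"float32 weight estimate: {est_gb:.0f} GB")  # 28 GB
```

Loading in float16 or an 8-bit quantization would roughly halve or quarter this footprint, which is why the quantized alternatives below list much smaller sizes.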

Best Alternatives to KoLLaVA KoVicuna 7B

Best Alternatives                 | Context / RAM  | Downloads | Likes
Llava V1.6 Mistral 7B PATCHED     | 32K / 15.1 GB  | 19        | 8
Table Llava V1.5 7B               | 4K / 14.2 GB   | 147       | 11
Quilt Llava V1.5 7B               | 4K / 14.2 GB   | 298       | 15
Co Instruct Llava V1.5 7B         | 4K / 14.1 GB   | 10        | 1
...ct4V LLaVA Instruct Mix880k 7B | 4K / 14.2 GB   | 17        | 3
...V1.5 7b Qinstruct Preview V0.1 | 4K / 14.2 GB   | 117       | 4
Chinese LLaVA Baichuan            | 2K / 28 GB     | 16        | 8
Llava 1.6 Gptq 8bit               | 32K / 9.6 GB   | 5         | 0
Note: a green score (e.g. "73.2") means that the model is better than tabtoyou/KoLLaVA-KoVicuna-7b.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227