Llava Jp 1.3B V1.0 by toshi456

Tags: Autotrain compatible · Datasets: toshi456/llava-cc3m-pr..., turing-motors/llava-in... · Endpoints compatible · Image-captioning · Image-to-text · Instruct · Ja · Llava · Region: us · Safetensors · Sharded · Tensorflow · Vision · Vqa

Llava Jp 1.3B V1.0 Benchmarks

nn.n%: how the model scores relative to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Llava Jp 1.3B V1.0 (toshi456/llava-jp-1.3b-v1.0)

Llava Jp 1.3B V1.0 Parameters and Internals

Model Type 
vision-language, image-captioning, VQA
Additional Notes 
The model uses a vision-language architecture designed to converse about input images.
Supported Languages 
Japanese (High)
Training Details 
Data Sources:
toshi456/LLaVA-CC3M-Pretrain-595K-JA, Japanese STAIR Captions, LLaVA-Instruct-150K-JA, Japanese Visual Genome VQA dataset
Methodology:
Fine-tuned following the LLaVA (visual instruction tuning) approach
Input Output 
Accepted Modalities:
image
Output Format:
Text description
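
To make the input/output description above concrete, here is a minimal loading sketch in Python. It assumes the Hugging Face repository can be loaded through transformers with trust_remote_code=True; if the repository does not ship remote code, the author's llava-jp code base is needed instead, and the image preprocessing and prompt template come from that code, not from this sketch.

```python
# Minimal loading sketch for toshi456/llava-jp-1.3b-v1.0.
# Assumption: the repo exposes its custom LlavaGpt2ForCausalLM class via
# trust_remote_code; otherwise load it with the author's llava-jp code base.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "toshi456/llava-jp-1.3b-v1.0"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float32,  # weights are published in float32 (~6.3 GB VRAM)
    trust_remote_code=True,     # assumption, see note above
)
model.eval()

# Inference takes an input image plus a Japanese instruction and generates a
# Japanese description or answer; image preprocessing and the exact prompt
# template are defined by the llava-jp code base and are not reproduced here.
```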
LLM Name: Llava Jp 1.3B V1.0
Repository: 🤗 https://huggingface.co/toshi456/llava-jp-1.3b-v1.0
Model Size: 1.3b
Required VRAM: 6.3 GB
Updated: 2025-02-22
Maintainer: toshi456
Model Type: llava
Instruction-Based: Yes
Model Files: 5.0 GB (shard 1 of 2), 1.3 GB (shard 2 of 2)
Supported Languages: ja
Model Architecture: LlavaGpt2ForCausalLM
License: cc-by-nc-4.0
Model Max Length: 1532
Transformers Version: 4.35.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <unk|LLM-jp>
Vocabulary Size: 50688
Torch Data Type: float32
Activation Function: gelu
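
As a rough cross-check of the tokenizer and memory figures listed above, the sketch below loads only the tokenizer and recomputes the float32 weight footprint; the expected values are taken from this table and may differ from the live repository.

```python
# Sanity-check sketch for the values listed above (expected values are
# assumptions taken from this page, not guaranteed by the repository).
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("toshi456/llava-jp-1.3b-v1.0")
print(type(tok).__name__)  # expected: PreTrainedTokenizerFast
print(tok.vocab_size)      # expected: 50688
print(tok.pad_token)       # expected: <unk|LLM-jp>

# Back-of-envelope VRAM: ~1.3B language-model parameters stored in float32
# take about 1.3e9 * 4 bytes ≈ 5.2 GB; the vision tower and activation memory
# account for most of the remaining ~6.3 GB of required VRAM.
n_params = 1.3e9
print(f"{n_params * 4 / 1e9:.1f} GB for float32 language-model weights")
```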

Best Alternatives to Llava Jp 1.3B V1.0

Best Alternatives | Context | RAM | Downloads | Likes
Llava Jp 1.3B V1.1 | 0K | 6.6 GB | 705 | 11
ConvLLaVA JP 1.3B 1280 | 0K | 7.1 GB | 19 | 1
ConvLLaVA JP 1.3B 768 | 0K | 7.1 GB | 18 | 2
...3B V1.1 Llava Jp Instruct 108K | 0K | 6.6 GB | 9 | 3
...V1.0 Siglip So400m Patch14 384 | 0K | 6.6 GB | 65 | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227