Llava V1.5 7B M3 by mucai


  Arxiv:2405.17430   Autotrain compatible   Llava llama   Region:us   Safetensors   Sharded   Tensorflow
Model Card on HF 🤗: https://huggingface.co/mucai/llava-v1.5-7b-m3

Llava V1.5 7B M3 Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Llava V1.5 7B M3 Parameters and Internals

Model Type 
multimodal, chatbot, auto-regressive, transformer
Use Cases 
Areas:
research
Applications:
large multimodal models, chatbots
Primary Use Cases:
explicit control over the granularity of visual tokens at inference time
Additional Notes 
The model can serve as a metric for image/dataset complexity (how many visual tokens a sample actually needs) and gives explicit control over visual granularity at inference time; a schematic sketch of the idea follows the training details below.
Training Details 
Data Sources:
LAION, CC, SBU, LLaVA-1.5
Data Volume:
558K filtered image-text pairs; 665K image-level instruction samples
Methodology:
Fine-tuning on visual conversation data
Model Architecture:
Transformer
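
The visual granularities mentioned under Additional Notes come from the Matryoshka-style nesting of visual tokens described in the linked paper (arXiv:2405.17430): the CLIP feature grid is average-pooled into progressively coarser sets of tokens, and the language model can consume any of these scales. The snippet below is only an illustrative PyTorch sketch of that pooling idea; the function name and the exact granularity schedule are assumptions, not the project's actual code.

```python
# Illustrative sketch (not the M3 codebase): nested "Matryoshka" pooling of
# visual tokens. A 24x24 grid of CLIP features (576 tokens) is repeatedly
# 2x2 average-pooled to coarser granularities (576 -> 144 -> 36 -> 9 -> 1),
# and the language model can be fed any single scale at inference time.
import torch
import torch.nn.functional as F


def matryoshka_scales(vision_tokens: torch.Tensor) -> dict:
    """vision_tokens: (batch, 576, dim) features from a 24x24 CLIP grid."""
    b, n, d = vision_tokens.shape
    side = int(n ** 0.5)                                              # 24
    grid = vision_tokens.view(b, side, side, d).permute(0, 3, 1, 2)   # (b, d, 24, 24)

    scales = {n: vision_tokens}
    while side > 1:
        grid = F.avg_pool2d(grid, kernel_size=2)                      # 2x2 average pooling
        side = grid.shape[-1]
        scales[side * side] = grid.permute(0, 2, 3, 1).reshape(b, side * side, d)
    return scales


# Example: one image's 576 visual tokens pooled down to 144, 36, 9 and 1 tokens.
tokens = torch.randn(1, 576, 1024)
print({k: tuple(v.shape) for k, v in matryoshka_scales(tokens).items()})
```
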
LLM Name: Llava V1.5 7B M3
Repository 🤗: https://huggingface.co/mucai/llava-v1.5-7b-m3
Model Size: 7b
Required VRAM: 14.1 GB
Updated: 2024-12-26
Maintainer: mucai
Model Type: llava_llama
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.2 GB (3-of-3)
Model Architecture: LlavaLlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: bfloat16
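
Because the architecture is LlavaLlamaForCausalLM (model type llava_llama) rather than the Transformers-native llava format, the checkpoint is normally loaded through an LLaVA-style codebase. Below is a minimal, hedged loading sketch; it assumes a LLaVA/M3 fork exposing the usual llava.model.builder entry points is installed, and the exact function names may differ between releases.

```python
# Minimal loading sketch, assuming an installed LLaVA-style codebase with
# M3 support; the package layout and entry points are assumptions and may
# differ from the project's official instructions.
from llava.model.builder import load_pretrained_model
from llava.mm_utils import get_model_name_from_path

model_path = "mucai/llava-v1.5-7b-m3"

# Downloads/loads the sharded safetensors (~14.1 GB of weights) together
# with the vision tower and the LlamaTokenizer listed in the table above.
tokenizer, model, image_processor, context_len = load_pretrained_model(
    model_path,
    None,                                  # model_base: standalone checkpoint
    get_model_name_from_path(model_path),  # yields "llava-v1.5-7b-m3"
)
print(model.config.max_position_embeddings)  # 4096 per the model card
```
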

Best Alternatives to Llava V1.5 7B M3

Best Alternatives                    Context / RAM    Downloads   Likes
Llava Merged Finetuned 7b            32K / 15.1 GB    9           1
Llava V1.6 Mistral 7B PATCHED        32K / 15.1 GB    23          8
KoLLaVA V1.5 Synatra 7B              32K / 15.1 GB    363         8
Llava V1.5 7B                        4K / 13.5 GB     1206660     393
Llava V1.6 Vicuna 7B                 4K / 14.1 GB     38472       104
LLaVA NeXT Video 7B                  4K / 14.1 GB     773         41
LLaVA NeXT Video 7B DPO              4K / 14.2 GB     1036        24
Table Llava V1.5 7B                  4K / 14.2 GB     168         11
Video LLaVA 7B                       4K / 15 GB       11591       81
...va 1.5 Image Classification V3    4K / 23.3 GB     215         0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217