FuseChat 7B VaRM by FuseAI


Tags: arXiv:2402.16107 · autotrain-compatible · base model (finetune): openchat/openchat_3.5 · conversational · dataset: FuseAI/FuseChat-Mixture · en · endpoints-compatible · fusechat · mistral · mixtral · model-fusion · model-index · pytorch · region:us · safetensors · solar
Model Card on HF 🤗: https://huggingface.co/FuseAI/FuseChat-7B-VaRM

FuseChat 7B VaRM Benchmarks

[Benchmark chart: FuseChat 7B VaRM (FuseAI/FuseChat-7B-VaRM)]

FuseChat 7B VaRM Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
chat applications, text generation
Applications:
commercial applications, research
Primary Use Cases:
multi-turn chat, roleplay, reasoning, math, coding, writing
Additional Notes 
Fuses knowledge from Mixtral- and SOLAR-based source chat models (alongside OpenChat-3.5) to enhance performance.
Supported Languages 
en (fluent)
Training Details 
Data Sources:
FuseAI/FuseChat-Mixture, human-written, model-generated
Methodology:
Pairwise knowledge fusion of the source LLMs into target models, followed by parameter-space model merging (VaRM); a hedged sketch of the merging step follows this section.
Context Length:
2048
Model Architecture:
Fuse and merge strategy with multiple source LLMs
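Per the FuseChat paper (arXiv:2402.16107), the pairwise-fused target models are combined in parameter space using VaRM, which derives merging weights from the variation ratio of parameter matrices before and after fine-tuning. Below is a minimal sketch of that idea over plain PyTorch state dicts; the function name and the exact per-tensor weighting are illustrative assumptions, not the authors' reference implementation.

```python
# Hedged sketch of variation-ratio-weighted merging (names and weighting
# formula are illustrative, not FuseAI's reference code).
import torch

def variation_ratio_merge(base_state, fused_states, eps=1e-8):
    """Merge several fine-tuned state dicts that share one base model,
    weighting each per tensor by how much it diverged from the base."""
    merged = {}
    for name, base_param in base_state.items():
        base = base_param.float()
        deltas = [state[name].float() - base for state in fused_states]
        # Variation score per model: mean squared change of this tensor.
        scores = torch.stack([delta.pow(2).mean() for delta in deltas])
        weights = scores / (scores.sum() + eps)  # normalize to sum to 1
        merged_tensor = base.clone()
        for w, delta in zip(weights, deltas):
            merged_tensor += w * delta           # weighted update on top of base
        merged[name] = merged_tensor.to(base_param.dtype)
    return merged
```

The merged state dict would then be loaded back into the shared base architecture, e.g. with `model.load_state_dict(merged)`.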
Input Output 
Input Format:
Text prompts formatted with the model's chat template
Accepted Modalities:
text
Output Format:
Generated text
Performance Tips:
Run on a GPU (≈14.5 GB VRAM in bfloat16) for reasonable generation speed; see the usage sketch below.
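Since input and output are plain text, a minimal generation example with 🤗 Transformers looks roughly like the sketch below. It assumes the tokenizer ships a chat template (the model is built on openchat_3.5, so the OpenChat "GPT4 Correct" format is expected); adjust the dtype and device mapping to your hardware.

```python
# Hedged sketch: basic text generation with FuseChat-7B-VaRM via transformers.
# Loading in bfloat16 on a GPU roughly matches the 14.5 GB VRAM figure below.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FuseAI/FuseChat-7B-VaRM"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# apply_chat_template uses the template bundled with the tokenizer
# (expected to be the OpenChat-3.5 "GPT4 Correct" format).
messages = [{"role": "user", "content": "Explain model fusion in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```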
Release Notes 
Version:
7B-VaRM
Notes:
Fusion of three chat LLMs with diverse architectures and scales. Achieves high performance on various benchmarks.
LLM Name: FuseChat 7B VaRM
Repository 🤗: https://huggingface.co/FuseAI/FuseChat-7B-VaRM
Base Model(s): openchat/openchat_3.5
Model Size: 7B
Required VRAM: 14.5 GB
Updated: 2025-02-22
Maintainer: FuseAI
Model Type: mistral
Model Files: 14.5 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.36.0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32002
Torch Data Type: bfloat16
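The 14.5 GB VRAM requirement is consistent with holding roughly 7.2B parameters in bfloat16 (2 bytes per parameter), and most of the fields above can be verified from the published config without downloading the weights. A small sketch, assuming Hub access; the parameter count used for the estimate is an approximation.

```python
# Hedged sketch: check the table above against the model's config, and
# estimate the bf16 memory footprint (parameter count is approximate).
from transformers import AutoConfig

config = AutoConfig.from_pretrained("FuseAI/FuseChat-7B-VaRM")
print(config.architectures)            # expected: ['MistralForCausalLM']
print(config.max_position_embeddings)  # expected: 8192
print(config.vocab_size)               # expected: 32002
print(config.torch_dtype)              # expected: bfloat16

approx_params = 7.24e9                 # ~7B-class Mistral model (approximate)
print(f"~{approx_params * 2 / 1e9:.1f} GB in bfloat16")  # ~14.5 GB
```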

Quantized Models of the FuseChat 7B VaRM

Model | Likes | Downloads | VRAM
FuseChat 7B VaRM AWQ | 1 | 7 | 4 GB
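AWQ checkpoints of a 7B model typically fit in roughly 4 GB and load through the regular Transformers API once the `autoawq` package is installed. A sketch follows; the repo id and the prompt format are assumptions, so check the actual quantized upload's card.

```python
# Hedged sketch: running a 4-bit AWQ quantization of FuseChat-7B-VaRM.
# Requires `pip install autoawq` alongside a recent transformers release.
from transformers import AutoModelForCausalLM, AutoTokenizer

awq_id = "TheBloke/FuseChat-7B-VaRM-AWQ"  # assumption: replace with the real AWQ repo id
tokenizer = AutoTokenizer.from_pretrained(awq_id)
model = AutoModelForCausalLM.from_pretrained(awq_id, device_map="auto")

# OpenChat-style prompt; assumed format, verify against the quantized repo's card.
prompt = "GPT4 Correct User: Hello!<|end_of_turn|>GPT4 Correct Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```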

Best Alternatives to FuseChat 7B VaRM

Best Alternatives | Context / RAM | Downloads | Likes
...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 4620 | 11
MegaBeam Mistral 7B 512K | 512K / 14.4 GB | 5681 | 50
SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 12 | 1
SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 18 | 1
...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0
MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 5633 | 16
Hebrew Mistral 7B 200K | 256K / 30 GB | 14619 | 15
Astral 256K 7B V2 | 250K / 14.4 GB | 7 | 0
Astral 256K 7B | 250K / 14.4 GB | 5 | 0
Test001 | 128K / 14.5 GB | 9 | 0
