Mixture Of Llamas Dare Ties by johnsutor


Tags: Merged Model · arXiv:2306.01708 · arXiv:2311.03099 · AutoTrain compatible · Conversational · Endpoints compatible · Instruct · Llama · Region: US · Safetensors · Sharded · TensorFlow (base models are listed in full under Parameters and Internals below)

Mixture Of Llamas Dare Ties Benchmarks

Mixture Of Llamas Dare Ties (johnsutor/mixture-of-llamas-dare-ties)

Mixture Of Llamas Dare Ties Parameters and Internals

Additional Notes 
This model is a merge of several pre-trained language models, created with the DARE-TIES merge method using mergekit and transformers.
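For background: DARE (arXiv:2311.03099) randomly drops entries of each fine-tuned model's delta from the base model and rescales the survivors, while TIES (arXiv:2306.01708) resolves sign conflicts between deltas before averaging. The sketch below is a conceptual illustration on flat tensors under those two papers' definitions, not mergekit's actual implementation; the `density` and `weights` parameters are hypothetical placeholders for the per-model values a mergekit config exposes.

```python
import torch

def dare_ties_merge(base, finetuned, density=0.5, weights=None):
    """Conceptual DARE-TIES merge of flat parameter tensors.

    base: tensor of base-model weights
    finetuned: list of tensors, one per fine-tuned model
    density: fraction of each delta to keep (DARE drop rate is 1 - density)
    """
    weights = weights if weights is not None else [1.0] * len(finetuned)
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base                        # task vector vs. the base model
        keep = torch.rand_like(delta) < density  # DARE: randomly drop delta entries...
        delta = delta * keep / density           # ...and rescale survivors to preserve expectation
        deltas.append(w * delta)
    stacked = torch.stack(deltas)
    # TIES: elect a per-parameter majority sign, then average only the
    # deltas that agree with it (a simplified form of the paper's procedure).
    elected = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == elected
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged
```

In a real merge this runs tensor-by-tensor across all six base models listed below, with density and weight chosen per model in the merge configuration.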
LLM Name: Mixture Of Llamas Dare Ties
Repository 🤗: https://huggingface.co/johnsutor/mixture-of-llamas-dare-ties
Base Model(s): meta-llama/Meta-Llama-3-8B-Instruct, nbeerbower/llama-3-gutenberg-8B, VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct, DeepMount00/Llama-3-8b-Ita, jpacifico/French-Alpaca-Llama3-8B-Instruct-v1.0, failspy/Meta-Llama-3-8B-Instruct-abliterated-v3
Merged Model: Yes
Model Size: 8B
Required VRAM: 16 GB
Updated: 2025-02-22
Maintainer: johnsutor
Model Type: llama
Instruction-Based: Yes
Model Files: 9.9 GB (1 of 2), 6.1 GB (2 of 2)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 128256
Torch Data Type: bfloat16
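Since the merged checkpoint is a standard LlamaForCausalLM, it loads through transformers like any other Llama 3 model. A minimal sketch, assuming a GPU with roughly the 16 GB of VRAM listed above; the prompt and generation settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "johnsutor/mixture-of-llamas-dare-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's stored dtype
    device_map="auto",           # needs the `accelerate` package
)

# The model is instruction-tuned, so format prompts with the chat template.
messages = [{"role": "user", "content": "Summarize the TIES merge method in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```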

Best Alternatives to Mixture Of Llamas Dare Ties

Best Alternatives | Context / RAM | Downloads | Likes
...a 3 8B Instruct Gradient 1048K | 1024K / 16.1 GB | 392768 | 0
Mpasila Viking 8B | 1024K / 16.1 GB | 84 | 0
Hel V2 8B DARK FICTION | 1024K / 16.1 GB | 22 | 0
16 | 1024K / 16.1 GB | 169 | 0
...di95 LewdStorytellerMix 8B 64K | 1024K / 16.1 GB | 69 | 2
Because Im Bored Nsfw1 | 1024K / 16.1 GB | 36 | 1
12 | 1024K / 16.1 GB | 60 | 0
MrRoboto ProLong 8B V4b | 1024K / 16.1 GB | 107 | 0
MrRoboto ProLong 8B V1a | 1024K / 16.1 GB | 108 | 0
MrRoboto ProLong 8B V2a | 1024K / 16.1 GB | 102 | 0



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227