Pearl 34B Ties by louisbrulenaudet


Tags: Merged Model, AutoTrain compatible, Base model: abacusai/MetaMath-Bagel-DPO-34B, Base model: jondurbin/bagel-dpo-34b-v0.2, Conversational, English (en), Endpoints compatible, Llama, Model-index, Region: us, Safetensors, Sharded, TensorFlow

Pearl 34B Ties Benchmarks

Pearl 34B Ties (louisbrulenaudet/Pearl-34B-ties)

Pearl 34B Ties Parameters and Internals

Model Type: text-generation

Additional Notes: The TIES-Merging process consists of three steps: Trim, Elect Sign, and Disjoint Merge (sketched below).

Training Details:
Methodology: TIES-Merging, a method for efficiently merging multiple task-specific models into a single consolidated multitask model. It works by identifying and eliminating redundant parameters and resolving sign conflicts across the models being merged.
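
The three steps can be illustrated with a short, simplified sketch. The ties_merge helper below is hypothetical and operates on one parameter tensor at a time; the actual model was produced with a merging toolkit, and the elect-sign rule here is a simplified sign-of-sum variant of the paper's magnitude-weighted vote.

import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               density: float = 0.5) -> torch.Tensor:
    # Task vectors: what each fine-tuned model changed relative to the base.
    deltas = [ft - base for ft in finetuned]

    # 1) Trim: zero out all but the top `density` fraction of each task
    #    vector's entries, ranked by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        cutoff = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= cutoff, d, torch.zeros_like(d)))

    # 2) Elect Sign: pick a single sign per parameter (here: the sign of
    #    the summed trimmed deltas).
    stacked = torch.stack(trimmed)
    elected = torch.sign(stacked.sum(dim=0))

    # 3) Disjoint Merge: average only the entries whose sign agrees with
    #    the elected sign, ignoring conflicting updates.
    agree = torch.sign(stacked) == elected
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    return base + merged_delta

Applied tensor-by-tensor across two or more fine-tuned checkpoints that share a common base, this yields the consolidated multitask weights.
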
LLM Name: Pearl 34B Ties
Repository: 🤗 https://huggingface.co/louisbrulenaudet/Pearl-34B-ties
Base Model(s): jondurbin/bagel-dpo-34b-v0.2 (Bagel DPO 34B v0.2), abacusai/MetaMath-Bagel-DPO-34B (MetaMath Bagel DPO 34B)
Merged Model: Yes
Model Size: 34B
Required VRAM: 67.8 GB
Updated: 2024-12-14
Maintainer: louisbrulenaudet
Model Type: llama
Model Files (36 safetensors shards): 1-of-36: 1.9 GB, 2-of-36: 1.9 GB, 3-of-36: 1.7 GB, 4-of-36: 2.0 GB, 5-of-36: 1.9 GB, 6-of-36: 1.9 GB, 7-of-36: 1.9 GB, 8-of-36: 1.9 GB, 9-of-36: 1.9 GB, 10-of-36: 2.0 GB, 11-of-36: 1.9 GB, 12-of-36: 2.0 GB, 13-of-36: 1.9 GB, 14-of-36: 1.9 GB, 15-of-36: 1.9 GB, 16-of-36: 1.9 GB, 17-of-36: 1.9 GB, 18-of-36: 2.0 GB, 19-of-36: 1.9 GB, 20-of-36: 1.9 GB, 21-of-36: 2.0 GB, 22-of-36: 1.9 GB, 23-of-36: 1.9 GB, 24-of-36: 1.9 GB, 25-of-36: 2.0 GB, 26-of-36: 1.9 GB, 27-of-36: 1.9 GB, 28-of-36: 1.9 GB, 29-of-36: 2.0 GB, 30-of-36: 1.9 GB, 31-of-36: 1.9 GB, 32-of-36: 1.9 GB, 33-of-36: 2.0 GB, 34-of-36: 1.9 GB, 35-of-36: 1.4 GB, 36-of-36: 1.2 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 200,000
Model Max Length: 200,000
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64,000
Torch Data Type: bfloat16
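
Given the metadata above (LlamaForCausalLM architecture, bfloat16 weights, sharded safetensors), the model should load with the standard transformers API. A minimal sketch, assuming enough GPU memory for the 67.8 GB of weights and the accelerate package installed for device_map="auto":

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "louisbrulenaudet/Pearl-34B-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 matches the checkpoint's torch_dtype; device_map="auto" lets
# accelerate spread the 36 shards across the available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "What is TIES-Merging?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
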

Best Alternatives to Pearl 34B Ties

Best Alternatives            Context / RAM     Downloads  Likes
34B Beta                     195K / 69.2 GB    3712       62
Bagel Hermes 34B Slerp       195K / 68.9 GB    4088       1
Yi 34B 200K                  195K / 68.9 GB    4552       316
Smaug 34B V0.1               195K / 69.2 GB    3717       60
Bagel 34B V0.2               195K / 68.7 GB    5783       39
Yi 34B 200K AEZAKMI V2       195K / 69.2 GB    1293       12
Smaug 34B V0.1 ExPO          195K / 69.2 GB    3017       0
Faro Yi 34B                  195K / 69.2 GB    3842       6
Mergekit Slerp Anaazls       195K / 69.2 GB    7          0
Bagel DPO 34B V0.5           195K / 68.7 GB    3061       17



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124