Cpt 16B Auto Sft Ties Post Merge Auto DPO by arcee-ai


Tags: Merged Model · Autotrain compatible · Endpoints compatible · Llama · Region: us · Safetensors · Sharded · Tensorflow

Cpt 16B Auto Sft Ties Post Merge Auto DPO Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Cpt 16B Auto Sft Ties Post Merge Auto DPO Parameters and Internals

LLM Name: Cpt 16B Auto Sft Ties Post Merge Auto DPO
Repository: https://huggingface.co/arcee-ai/cpt-16B-auto-sft-ties-post-merge-auto-dpo
Merged Model: Yes
Model Size: 16b
Required VRAM: 141.2 GB
Updated: 2024-12-02
Maintainer: arcee-ai
Model Type: llama
Model Files: 4.0 GB (1-of-37), 3.6 GB (2-of-37), 3.9 GB (3-of-37), 3.9 GB (4-of-37), 3.9 GB (5-of-37), 3.7 GB (6-of-37), 3.9 GB (7-of-37), 3.9 GB (8-of-37), 3.9 GB (9-of-37), 3.7 GB (10-of-37), 3.9 GB (11-of-37), 3.9 GB (12-of-37), 3.9 GB (13-of-37), 3.7 GB (14-of-37), 3.9 GB (15-of-37), 3.9 GB (16-of-37), 3.9 GB (17-of-37), 3.7 GB (18-of-37), 3.9 GB (19-of-37), 3.9 GB (20-of-37), 3.9 GB (21-of-37), 3.7 GB (22-of-37), 3.9 GB (23-of-37), 3.9 GB (24-of-37), 3.9 GB (25-of-37), 3.7 GB (26-of-37), 3.9 GB (27-of-37), 3.9 GB (28-of-37), 3.9 GB (29-of-37), 3.7 GB (30-of-37), 3.9 GB (31-of-37), 3.9 GB (32-of-37), 3.9 GB (33-of-37), 3.7 GB (34-of-37), 3.9 GB (35-of-37), 3.9 GB (36-of-37), 2.6 GB (37-of-37)
Model Architecture: LlamaForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.41.1
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 128256
Torch Data Type: bfloat16
Cpt 16B Auto Sft Ties Post Merge Auto DPO (arcee-ai/cpt-16B-auto-sft-ties-post-merge-auto-dpo)
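The fields above map directly onto a standard Transformers load. Below is a minimal loading sketch, assuming transformers >= 4.41.1 (the version recorded above), the accelerate package, and enough combined GPU/CPU memory for the ~141 GB of bfloat16 weights; the prompt string is a placeholder, not something from the model card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "arcee-ai/cpt-16B-auto-sft-ties-post-merge-auto-dpo"

# PreTrainedTokenizerFast with a 128256-entry vocabulary, per the card above.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# LlamaForCausalLM checkpoint sharded across 37 safetensors files;
# device_map="auto" spreads the bfloat16 weights over available devices.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Context length is 8192 tokens, so keep prompt plus generation under that budget.
inputs = tokenizer("Summarize what a TIES merge does.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))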
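The "ties" in the model name refers to the TIES merge method, which arcee-ai's open-source mergekit implements ("Merged Model: Yes" above). The card does not say which checkpoints were merged, so the following sketch is illustrative only: both source model names are hypothetical placeholders, and the density/weight values are generic defaults rather than the settings actually used for this merge.

import pathlib
import subprocess

# Hypothetical TIES-merge configuration in mergekit's YAML schema. The model
# repos named here are placeholders; the real inputs to this merge are not
# published on the card.
config = """\
merge_method: ties
base_model: placeholder/base-16b
models:
  - model: placeholder/cpt-16b
    parameters:
      density: 0.5
      weight: 0.5
  - model: placeholder/sft-16b
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  normalize: true
dtype: bfloat16
"""

pathlib.Path("ties-config.yml").write_text(config)
# mergekit's CLI entry point; writes the merged checkpoint to ./merged-model
subprocess.run(["mergekit-yaml", "ties-config.yml", "./merged-model"], check=True)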

Best Alternatives to Cpt 16B Auto Sft Ties Post Merge Auto DPO

Best Alternatives | Context / RAM | Downloads | Likes
Phi 3.5 Mini Investigator 16B | 128K / 7.6 GB | 15 | 0
Nanbeige2 16B Chat | 4K / 31.6 GB | 1199 | 0
...ALAXY V03 Slimorca 1 Epoch 50k | 4K / 31.8 GB | 90 | 0
...ca 1 Epoch 50k DPO 1 Epoch 30k | 4K / 31.8 GB | 68 | 0
Nanbeige 16B Base Llama | 4K / 31.6 GB | 1276 | 3
GALAXY XB V.03 | 4K / 31.9 GB | 72 | 0
FusionNet SOLAR | 4K / 31.9 GB | 1303 | 1
Llama 2 16B Nastychat | 4K / 32.4 GB | 1300 | 8
Vinallama 16B Chat Franken | 4K / 31.6 GB | 11 | 1
Nanbeige 16B Chat | 4K / 31.6 GB | 241 | 2
Note: a green score (e.g. "73.2") means the model is better than arcee-ai/cpt-16B-auto-sft-ties-post-merge-auto-dpo.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124