0.0 Llama Nodpo 3iters Bs128 531lr Iter 1 by ZhangShenao


Tags: Alignment-handbook, Autotrain compatible, Base model:meta-llama/meta-lla..., Conversational, Dataset:original, Dataset:updated, Dpo, Endpoints compatible, Generated from trainer, Instruct, License:other, Llama, Region:us, Safetensors, Sharded, Tensorflow, Trl

Rank the 0.0 Llama Nodpo 3iters Bs128 531lr Iter 1 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
0.0 Llama Nodpo 3iters Bs128 531lr Iter 1 (GeorgiaTech/0.0_llama_nodpo_3iters_bs128_531lr_iter_1)

Best Alternatives to 0.0 Llama Nodpo 3iters Bs128 531lr Iter 1

Best Alternatives                      Context / RAM      Downloads / Likes
...a 3 8B Instruct Gradient 1048K      1024K / 16.1 GB    30280558
Llama 3 8B Instruct V41 1048K          1024K / 16.1 GB    33
Llama 3 8B Instruct 1048K              1024K / 16.1 GB    33
... V0.1.0 Llama 3 8B Instruct 1M      1024K / 16.1 GB    1431
... Instruct Gradient 1048K Agent      1024K / 16.1 GB    90
...radient 1M OpenBio Stone L3 8B      1024K / 16.1 GB    60
Aria Daughter 128K                     256K / 14.5 GB     160
Llama3 8B Slerp Med 262K               256K / 16 GB       23490
Luna 8B Instruct 262K                  256K / 16 GB       370
...miLuminRP 8B Instruct 262K 0.4      256K / 16 GB       90

0.0 Llama Nodpo 3iters Bs128 531lr Iter 1 Parameters and Internals

LLM Name: 0.0 Llama Nodpo 3iters Bs128 531lr Iter 1
Repository: GeorgiaTech/0.0_llama_nodpo_3iters_bs128_531lr_iter_1 (open on 🤗 Hugging Face)
Base Model(s): Meta Llama 3 8B Instruct (meta-llama/Meta-Llama-3-8B-Instruct)
Model Size: 8b
Required VRAM: 16.1 GB
Updated: 2024-05-20
Maintainer: ZhangShenao
Model Type: llama
Instruction-Based: Yes
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 4.9 GB (3-of-4), 1.2 GB (4-of-4), 0.0 GB
Model Architecture: LlamaForCausalLM
License: other
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|eot_id|>
Vocabulary Size: 128256
Initializer Range: 0.02
Torch Data Type: bfloat16
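
Given the configuration listed above (LlamaForCausalLM architecture, sharded safetensors weights, bfloat16 tensors, an 8192-token context, and <|eot_id|> as the padding token), the model can be loaded through the standard Hugging Face transformers API. The snippet below is a minimal sketch rather than the maintainer's reference code: the repository ID is taken from the listing above, the prompt is purely illustrative, and generation settings are left at defaults.

```python
# Minimal sketch: load the model and run a single chat-style generation.
# Repository ID is taken from the listing above; adjust if it has moved.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "GeorgiaTech/0.0_llama_nodpo_3iters_bs128_531lr_iter_1"

# Tokenizer is a PreTrainedTokenizerFast with <|eot_id|> configured as the padding token.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Weights are sharded safetensors (~16.1 GB total); bfloat16 matches the stored dtype.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# The model is instruction-tuned, so format the prompt with the Llama 3 chat template.
messages = [{"role": "user", "content": "Explain direct preference optimization in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Loading in bfloat16 keeps the memory footprint close to the 16.1 GB of VRAM quoted above; quantized loading (for example 4-bit via bitsandbytes) would reduce that further at some quality cost.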


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801