Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K by pxyyy


Tags: Arxiv:1910.09700, Autotrain compatible, Endpoints compatible, Llama, Region: us, Safetensors, Sharded, Tensorflow

Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K (pxyyy/rlhflow_mixture_clean_empty_round_with_dart_scalebiosampled-600k)

Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K Parameters and Internals

LLM Name: Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K
Repository: 🤗 https://huggingface.co/pxyyy/rlhflow_mixture_clean_empty_round_with_dart_scalebiosampled-600k
Model Size: 8B
Required VRAM: 16.1 GB
Updated: 2025-03-22
Maintainer: pxyyy
Model Type: llama
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 4.9 GB (3-of-4), 1.2 GB (4-of-4)
Model Architecture: LlamaForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
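
For reference, the values above map directly onto a standard Transformers loading call. The sketch below is a minimal, illustrative example assuming the usual Hugging Face API; the repo id, dtype, and context length come from the table, while the device placement and prompt are placeholders:

```python
# Minimal loading sketch using the values from the table above.
# Assumes transformers >= 4.44, accelerate installed (for device_map="auto"),
# and roughly 16.1 GB of VRAM for the bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "pxyyy/rlhflow_mixture_clean_empty_round_with_dart_scalebiosampled-600k"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # PreTrainedTokenizerFast, vocab size 128256
model = AutoModelForCausalLM.from_pretrained(       # resolves to LlamaForCausalLM
    repo_id,
    torch_dtype=torch.bfloat16,                     # matches the card's bfloat16 weights
    device_map="auto",
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)  # context window is 8192 tokens
print(tokenizer.decode(output[0], skip_special_tokens=True))
```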

Best Alternatives to Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K

Best Alternatives                    Context / RAM      Downloads   Likes
...a 3 8B Instruct Gradient 1048K    1024K / 16.1 GB    5116        682
A18                                  1024K / 16.1 GB    272         0
A12                                  1024K / 16.1 GB    256         0
C31                                  1024K / 16.1 GB    183         0
B5                                   1024K / 16.1 GB    147         0
A15                                  1024K / 16.1 GB    160         0
A5                                   1024K / 16.1 GB    150         0
A13                                  1024K / 16.1 GB    236         0
C35                                  1024K / 16.1 GB    236         0
A8                                   1024K / 16.1 GB    172         0
Note: a green Score (e.g. "73.2") means the model is better than pxyyy/rlhflow_mixture_clean_empty_round_with_dart_scalebiosampled-600k.
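
The 16.1 GB VRAM figure shared by this model and the alternatives above is consistent with holding the bfloat16 weights alone in memory. A back-of-the-envelope check, assuming the ~8.03B parameter count typical of "8B" Llama-class models (the exact count is not stated on this card):

```python
# Weight memory ≈ parameter count × bytes per parameter.
# This covers inference weights only; KV cache and activations add more on top.
n_params = 8.03e9        # assumed: typical parameter count for an 8B Llama model
bytes_per_param = 2      # bfloat16 = 2 bytes per parameter
print(f"{n_params * bytes_per_param / 1e9:.1f} GB")  # -> 16.1 GB, matching the tables
```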

Rank the Rlhflow Mixture Clean Empty Round With Dart Scalebiosampled 600K Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227