IL BERAll Zephyr 7B Sft Full by TTTXXX01


Tags: Alignment-handbook · Autotrain compatible · Base model: tttxxx01/all_like-z... · Conversational · Dataset: huggingfaceh4/ultrafee... · DPO · Endpoints compatible · Generated from trainer · License: apache-2.0 · Mistral · Region: us · Safetensors · Sharded · Tensorflow · TRL

Rank the IL BERAll Zephyr 7B Sft Full Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
IL BERAll Zephyr 7B Sft Full (TTTXXX01/IL_BERAll-zephyr-7b-sft-full)

Best Alternatives to IL BERAll Zephyr 7B Sft Full

Best Alternatives | HF Rank | Context/RAM | Downloads | Likes
M7 7B | 76.82 | 32K / 14.4 GB | 5255 | 15
J4rviz V3.0 | 76.58 | 32K / 14.4 GB | 407 | 0
Nexim 7B | 76.53 | 32K / 14.4 GB | 474 | 0
Calme 7B Instruct V0.3 | 76.5 | 32K / 14.4 GB | 1443 | 5
TriFusionNexus 7B | 76.32 | 32K / 14.4 GB | 473 | 0
Ramonda 7B DPO Ties | 76.19 | 32K / 14.4 GB | 860 | 10
OGNO 7B DPO Truthful | 76.14 | 32K / 14.4 GB | 1677 | 1
Cyrax 7B | 75.98 | 32K / 14.4 GB | 1419 | 9
NeuralTrix 7B DPO Laser | 75.92 | 32K / 14.4 GB | 2669 | 6
Prima LelantaclesV6.69 7B | 75.7 | 32K / 14.5 GB | 451 | 3
Note: a green score (e.g. "73.2") means that the model performs better than TTTXXX01/IL_BERAll-zephyr-7b-sft-full.

IL BERAll Zephyr 7B Sft Full Parameters and Internals

LLM Name: IL BERAll Zephyr 7B Sft Full
Repository: Open on 🤗 Hugging Face
Base Model(s): All Like Zephyr 7B Sft Full (TTTXXX01/All_like-zephyr-7b-sft-full)
Model Size: 7B
Required VRAM: 14.4 GB
Updated: 2024-06-24
Maintainer: TTTXXX01
Model Type: mistral
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.5 GB (3-of-3), 0.0 GB
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
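
The internals listed above map directly onto the standard transformers loading path. Below is a minimal sketch, assuming the repo ID shown on this page, a recent transformers release, and the `accelerate` package for `device_map="auto"`; the chat-template call assumes the tokenizer config ships a Zephyr-style conversation template, which this listing does not confirm. The 14.4 GB VRAM figure is consistent with a ~7B-parameter Mistral model stored at roughly 2 bytes per parameter in bfloat16.

```python
# Minimal loading sketch based on the internals above: MistralForCausalLM,
# LlamaTokenizer (vocab 32000), bfloat16 weights, 32768-token context,
# sharded safetensors. The repo ID is taken from this page; everything else
# is standard transformers usage, not a method confirmed by the maintainer.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TTTXXX01/IL_BERAll-zephyr-7b-sft-full"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type (~14.4 GB of weights)
    device_map="auto",           # requires `accelerate`; places shards on available devices
)

# The model is tagged as conversational; if the tokenizer ships a chat template,
# apply_chat_template formats the prompt the way the SFT data expects.
messages = [{"role": "user", "content": "Explain supervised fine-tuning in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```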


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024042801