Biomistral 7B Wo Kqa Golden Olaph Iter DPO Step1 by Minbyul

Tags: alignment-handbook · autotrain-compatible · base model: BioMistral/BioMistral-7B · conversational · dataset: huggingfaceh4/ultrafee... · dpo · endpoints-compatible · generated-from-trainer · license: apache-2.0 · mistral · region: us · safetensors · sharded · tensorflow · trl

Hugging Face repository: Minbyul/biomistral-7b-wo-kqa_golden-olaph-iter-dpo-step1
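The tags above (trl, dpo, alignment-handbook) indicate this checkpoint is the first step of an iterative Direct Preference Optimization run on top of BioMistral-7B. As a rough illustration only, a single DPO step with trl (circa v0.9 API) might look like the sketch below; the toy preference triples, beta value, and batch settings are assumptions, not the maintainer's actual recipe, whose preference data appears to be a HuggingFaceH4/ultrafee... dataset per the tags.

```python
# Hedged sketch of one DPO training step with trl; hyperparameters and the
# toy dataset are illustrative assumptions, not the maintainer's recipe.
import torch
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "BioMistral/BioMistral-7B"  # base model listed on this page
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # the card lists </s> as the padding token

# Minimal preference triples in the prompt/chosen/rejected format DPO expects.
train_ds = Dataset.from_dict({
    "prompt":   ["What is the mechanism of action of metformin?"],
    "chosen":   ["Metformin mainly suppresses hepatic gluconeogenesis via AMPK activation."],
    "rejected": ["Metformin is an antibiotic that works by killing gut bacteria."],
})

args = DPOConfig(
    output_dir="biomistral-7b-dpo-step1",
    beta=0.1,                       # assumed preference-loss temperature
    per_device_train_batch_size=1,
    num_train_epochs=1,
    bf16=True,
)
# ref_model=None lets trl create a frozen reference copy of the policy model.
trainer = DPOTrainer(model=model, ref_model=None, args=args,
                     train_dataset=train_ds, tokenizer=tokenizer)
trainer.train()
```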

Best Alternatives to Biomistral 7B Wo Kqa Golden Olaph Iter DPO Step1

| Model | Score | Context / RAM | Downloads | Likes |
|-------|-------|---------------|-----------|-------|
| KAI 7B V0.1 | 74.45 | 32K / 14.4 GB | 54 | 9 |
| Dolphin 2.2.1 Mistral 7B | 73.17 | 32K / 14.4 GB | 11502 | 185 |
| Notus 7B V1 | 60.15 | 32K / 14.4 GB | 8650 | 112 |
| Notus 7B V1 AWQ | 52.8 | 32K / 4.2 GB | 44 | 3 |
| Notus 7B V1 GPTQ | 52.8 | 32K / 4.2 GB | 43 | 2 |
| MegaBeam Mistral 7B 300K | n/a | 282K / 14.4 GB | 2805 | 8 |
| Hebrew Mistral 7B 200K | n/a | 256K / 30 GB | 552 | 14 |
| Astral 256K 7B V2 | n/a | 250K / 14.4 GB | 661 | 0 |
| Astral 256K 7B | n/a | 250K / 14.4 GB | 654 | 0 |
| Buddhi 128K Chat 7B | n/a | 128K / 14.4 GB | 3640 | 11 |
Note: a green score (e.g. "73.2") indicates that the model outperforms Minbyul/biomistral-7b-wo-kqa_golden-olaph-iter-dpo-step1.
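To make the Context / RAM column concrete: the RAM figures track weight memory, roughly parameter count times bytes per parameter. A small sketch, assuming about 7.2B parameters and about 0.58 bytes per parameter for 4-bit AWQ/GPTQ weights (quantized values plus scales and zero points); both figures are ballpark assumptions:

```python
# Back-of-envelope check of the Context/RAM column above.
def weight_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory; the listing appears to use decimal GB."""
    return n_params * bytes_per_param / 1e9

n = 7.2e9  # assumed parameter count of a "7B" Mistral-family model
print(f"bf16 (2 bytes/param):    {weight_gb(n, 2.0):.1f} GB")   # ~14.4 GB
print(f"4-bit (~0.58 bytes):     {weight_gb(n, 0.58):.1f} GB")  # ~4.2 GB
```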

Biomistral 7B Wo Kqa Golden Olaph Iter DPO Step1 Parameters and Internals

LLM Name: Biomistral 7B Wo Kqa Golden Olaph Iter DPO Step1
Repository: Minbyul/biomistral-7b-wo-kqa_golden-olaph-iter-dpo-step1 (on 🤗 Hugging Face)
Base Model(s): BioMistral/BioMistral-7B
Model Size: 7B
Required VRAM: 14.4 GB
Updated: 2024-05-22
Maintainer: Minbyul
Model Type: mistral
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.5 GB (3-of-3), 0.0 GB
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
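Given the internals above (MistralForCausalLM, bfloat16 weights in three safetensors shards, a LlamaTokenizer, 32768-token context), a minimal transformers loading sketch might look like this; the prompt is illustrative, and device_map="auto" assumes accelerate is installed:

```python
# Minimal loading/inference sketch based on the listed internals.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Minbyul/biomistral-7b-wo-kqa_golden-olaph-iter-dpo-step1"
tokenizer = AutoTokenizer.from_pretrained(repo)  # resolves to LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",           # places the three safetensors shards automatically
)

prompt = "What is the mechanism of action of metformin?"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```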


Original data from Hugging Face, OpenCompass, and various public Git repositories.