Yi 6B Yoruno Peft by nenekochan


Tags: Adapter · Finetuned · LoRA · PEFT · Region: us · Safetensors

Yi 6B Yoruno Peft Benchmarks

Benchmark scores (shown as percentages) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4"). No benchmark results are currently listed for this model.

Yi 6B Yoruno Peft Parameters and Internals

LLM Name: Yi 6B Yoruno Peft
Repository: 🤗 https://huggingface.co/nenekochan/Yi-6B-yoruno-peft
Model Size: 6B
Required VRAM: 0.6 GB
Updated: 2024-09-18
Maintainer: nenekochan
Model Files: 0.6 GB, 0.0 GB
Model Architecture: Adapter
Is Biased: none
PEFT Type: LoRA
LoRA Model: Yes
PEFT Target Modules: gate_proj, up_proj, down_proj, v_proj, k_proj, o_proj, q_proj
LoRA Alpha: 16
LoRA Dropout: 0.05
R Param: 64
Yi 6B Yoruno Peft (nenekochan/Yi-6B-yoruno-peft)
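The listed PEFT settings (r = 64, alpha = 16, dropout 0.05, seven target projections) are enough to sanity-check the 0.6 GB adapter size. The sketch below estimates the LoRA parameter count, assuming the standard Yi-6B dimensions (hidden size 4096, intermediate size 11008, 32 layers, 4 key/value heads of dimension 128); these dimensions are assumptions, not stated on this page.

```python
# Estimate the LoRA adapter size for Yi-6B with the listed PEFT settings.
# Yi-6B dimensions below are ASSUMED (they are not stated on this page):
# hidden 4096, intermediate 11008, 32 layers, 4 KV heads of dim 128 (GQA).
R = 64
HIDDEN, INTERMEDIATE, LAYERS = 4096, 11008, 32
KV_DIM = 4 * 128  # grouped-query attention: 4 key/value heads

# A LoRA pair on a d_in x d_out projection adds r * (d_in + d_out) parameters.
shapes = {
    "q_proj": (HIDDEN, HIDDEN),
    "k_proj": (HIDDEN, KV_DIM),
    "v_proj": (HIDDEN, KV_DIM),
    "o_proj": (HIDDEN, HIDDEN),
    "gate_proj": (HIDDEN, INTERMEDIATE),
    "up_proj": (HIDDEN, INTERMEDIATE),
    "down_proj": (INTERMEDIATE, HIDDEN),
}
per_layer = sum(R * (d_in + d_out) for d_in, d_out in shapes.values())
total = per_layer * LAYERS
print(f"LoRA parameters: {total:,}")           # 145,227,776
print(f"fp32 size: {total * 4 / 1e9:.2f} GB")  # ~0.58 GB
```

Under these assumptions the adapter comes to roughly 145 M parameters, or about 0.58 GB in fp32, which is consistent with the 0.6 GB file size listed above; stored in fp16 it would be roughly half that.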

Best Alternatives to Yi 6B Yoruno Peft

Best Alternatives | Context / RAM | Downloads | Likes
Yi 1.5 6B Chat Sa V0.1 | 0K / 0 GB | 6 | 0
01 Ai Yi 1.5 6B 1719335236 | 0K / 0.4 GB | 5 | 0
01 Ai Yi 1.5 6B 1719098372 | 0K / 0.4 GB | 9 | 0
01 Ai Yi 1.5 6B 1718986516 | 0K / 0.4 GB | 6 | 0
Dreamtobenlpsama Mnlp M2 | 0K / 0 GB | 6 | 0
Trl Rm Tldr Gptj | 0K / 0 GB | 1 | 1
Chatglm3 6B Csc Chinese Lora | 0K / 0.1 GB | 3623 | 8
Yi 6b Chat Medical Qa Full | 0K / 0 GB | 2 | 1
Yi 6B Chat Finance Qa | 0K / 0 GB | 1 | 1
...i 6b Chat Medical Qa Full Beta | 0K / 0 GB | 1 | 1
Note: a green score (e.g. "73.2") indicates that the model outperforms nenekochan/Yi-6B-yoruno-peft.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803