Finetuned Mistral On Ads by not-lain


Tags: Adapter · Base model: adapter:mistralai/m... · Base model: mistralai/mistral-7... · Finetuned · Generated from trainer · Instruct · LoRA · PEFT · Region: us · Safetensors · SFT · TensorBoard · TRL

Finetuned Mistral On Ads Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Finetuned Mistral On Ads (not-lain/finetuned_mistral_on_ads)

Finetuned Mistral On Ads Parameters and Internals

LLM Name: Finetuned Mistral On Ads
Repository: 🤗 https://huggingface.co/not-lain/finetuned_mistral_on_ads
Base Model(s): mistralai/Mistral-7B-Instruct-v0.3
Model Size: 7b
Required VRAM: 0 GB
Updated: 2024-11-12
Maintainer: not-lain
Instruction-Based: Yes
Model Files: 0.0 GB, 0.0 GB
Model Architecture: Adapter
License: apache-2.0
Is Biased: none
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
PEFT Type: LoRA
LoRA Model: Yes
PEFT Target Modules: o_proj, gate_proj, v_proj, k_proj, q_proj, down_proj, up_proj
LoRA Alpha: 8
LoRA Dropout: 0
R Param: 8
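
The internals above describe a LoRA adapter (PEFT) rather than a standalone checkpoint, so it is loaded on top of the base model. Below is a minimal sketch using the transformers and peft libraries; the repository ids come from the table above, while the dtype, device placement, and example prompt are illustrative assumptions, not part of the model card.

```python
# Minimal sketch: attach the LoRA adapter to its Mistral base model.
# Repo ids are taken from the listing above; dtype/device settings and the
# prompt are illustrative, not prescribed by the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "not-lain/finetuned_mistral_on_ads"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)  # loads the small adapter weights

prompt = "[INST] Write a short ad for a reusable water bottle. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the adapter is needed as a single standalone checkpoint, `model.merge_and_unload()` can fold the LoRA weights into the base model before saving.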

Quantized Models of the Finetuned Mistral On Ads

Model | Likes | Downloads | VRAM
Mistral 7B Instruct V0.3 GPTQ | 0 | 11 | 4 GB
...stral 7B Instruct V0.3 GPTQ 8B | 0 | 13 | 7 GB
...stral 7B Instruct V0.3 GPTQ 4B | 0 | 9 | 4 GB
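
To lower VRAM requirements, the adapter can also be applied on top of a GPTQ-quantized copy of the base model such as those listed above. The sketch below is hedged: the quantized repository id is a placeholder (the table names here are truncated), and loading GPTQ weights additionally requires the optimum and auto-gptq (or gptqmodel) packages.

```python
# Sketch: run the LoRA adapter on a GPTQ-quantized base to reduce VRAM use.
# The quantized repo id is a placeholder -- the table entries above are
# truncated, so substitute the actual GPTQ repository you intend to use.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

gptq_base_id = "your-namespace/Mistral-7B-Instruct-v0.3-GPTQ"  # placeholder id
adapter_id = "not-lain/finetuned_mistral_on_ads"

tokenizer = AutoTokenizer.from_pretrained(gptq_base_id)
base = AutoModelForCausalLM.from_pretrained(gptq_base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)
```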

Best Alternatives to Finetuned Mistral On Ads

Best Alternatives | Context / RAM | Downloads | Likes
Qwen Megumin | 0K / 0.1 GB | 138 | 0
Mistral 7B Instruct Sa V0.1 | 0K / 0 GB | 7 | 0
...Sql Flash Attention 2 Dataeval | 0K / 1.9 GB | 58 | 3
...82 6142 45d8 9455 Bc68ca4866eb | 0K / 1.2 GB | 6 | 0
Text To Rule Mistral 2 | 0K / 0.3 GB | 6 | 0
...al 7B Instruct V0.3 1719301256 | 0K / 0.9 GB | 9 | 0
...al 7B Instruct V0.3 1719297750 | 0K / 0.4 GB | 6 | 0
Text To Rule Mistral | 0K / 0.4 GB | 6 | 0
...al 7B Instruct V0.3 1719341325 | 0K / 1.7 GB | 6 | 0
...al 7B Instruct V0.3 1719344430 | 0K / 3.5 GB | 6 | 0
Note: a green score (e.g. "73.2") means that the alternative is better than not-lain/finetuned_mistral_on_ads.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217