TinyMistral 248M Instruct by Locutusque


Tags: Autotrain compatible, Base model:finetune:locutusque..., Base model:locutusque/tinymist..., Dataset:berkeley-nest/nectar, Dataset:locutusque/instructmix..., En, Endpoints compatible, Instruct, Mistral, Pytorch, Region:us, Safetensors

TinyMistral 248M Instruct Benchmarks

TinyMistral 248M Instruct (Locutusque/TinyMistral-248M-Instruct)

TinyMistral 248M Instruct Parameters and Internals

Model Type 
text-generation
Additional Notes 
During validation, this model achieved an average perplexity of 3.23 on the Locutusque/InstructMix dataset. It has been trained on approximately 608,000 examples so far. More epochs are planned for this model.
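As a point of reference, perplexity is the exponential of the mean cross-entropy loss (in nats), so the reported 3.23 corresponds to a mean validation loss of about 1.17. A minimal sketch of that relationship (the function name is illustrative, not from the model card):

```python
import math

def perplexity(mean_ce_loss: float) -> float:
    """Perplexity is exp of the mean cross-entropy loss in nats."""
    return math.exp(mean_ce_loss)

# A perplexity of 3.23 implies a mean loss of ln(3.23) ~= 1.17 nats.
mean_loss = math.log(3.23)
```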
Supported Languages 
en
Training Details 
Data Sources: Locutusque/InstructMixCleaned, berkeley-nest/Nectar
Methodology: Fully fine-tuned on Locutusque/InstructMix.
LLM Name: TinyMistral 248M Instruct
Repository 🤗: https://huggingface.co/Locutusque/TinyMistral-248M-Instruct
Base Model(s): TinyMistral 248M (Locutusque/TinyMistral-248M)
Model Size: 248M
Required VRAM: 1 GB
Updated: 2024-12-21
Maintainer: Locutusque
Model Type: mistral
Instruction-Based: Yes
Model Files: 1.0 GB, 1.0 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.35.1
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32005
Torch Data Type: float32
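The ~1 GB "Required VRAM" figure above is consistent with storing 248M parameters in float32 (4 bytes each). A rough sketch of that arithmetic, counting weights only (no activations or KV cache; the helper name is illustrative):

```python
def weight_memory_gb(num_params: int, bytes_per_param: int = 4) -> float:
    """Memory needed for model weights alone, in GiB.

    bytes_per_param defaults to 4 for float32; use 2 for float16/bfloat16.
    """
    return num_params * bytes_per_param / 1024**3

# 248M parameters in float32 is roughly 0.92 GiB, matching the listed 1 GB.
footprint = weight_memory_gb(248_000_000)
```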

Quantized Models of the TinyMistral 248M Instruct

| Model | Likes | Downloads | VRAM |
| Tinymistralredq | 0 | 38 | 0 GB |
| Tinymistralredq | 0 | 0 | 0 GB |

Best Alternatives to TinyMistral 248M Instruct

| Best Alternatives | Context / RAM | Downloads | Likes |
| TinyMistral 248M V2.5 Instruct | 32K / 1 GB | 271 | 1 |
| Tinymistv1 | 32K / 0.5 GB | 17 | 0 |
| ...istral 248M V2.5 Instruct Orpo | 32K / 0.5 GB | 19 | 0 |
| TinyMistral 248M V2 Instruct | 32K / 0.5 GB | 46 | 7 |
| ...stral V2 Pycoder Instruct 248M | 32K / 1 GB | 23 | 2 |
| ...mistral 248M Hypnosis Instruct | 32K / 0.5 GB | 13 | 1 |
| ...al V2 Pycoder Instruct 248M V1 | 32K / 0.5 GB | 24 | 1 |
| ...istral Magicoder Instruct 248M | 32K / 0.5 GB | 18 | 2 |
| Tinymistv1 | 32K / 0.5 GB | 0 | 0 |
| TinyMistral 248M Chat V2 | 2K / 1 GB | 10642 | 6 |
Note: a green score (e.g. "73.2") means that the model is better than Locutusque/TinyMistral-248M-Instruct.

Rank the TinyMistral 248M Instruct Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217