LLM Explorer: A Curated Large Language Model Directory and Analytics

TinyMistral V2.5 MiniPile Guidelines E1 by Dans-DiscountModels

Part of a directory of 18,870 open-source LLMs and SLMs.

Tags: autotrain-compatible · base model: locutusque/tinymist... · dataset: epfl-llm/guidelines · dataset: jeankaddour/minipile · en · endpoints-compatible · generated-from-trainer · license: apache-2.0 · mistral · pytorch · region: us

Rank the TinyMistral V2.5 MiniPile Guidelines E1 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
TinyMistral V2.5 MiniPile Guidelines E1 (Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1)

Best Alternatives to TinyMistral V2.5 MiniPile Guidelines E1

Best Alternatives                      Context / RAM    Downloads/Likes
Onnx TinyMistral 248M                  32K /     GB     386
Onnx TinyMistral 248M V2               32K /     GB     82
Onnx TinyMistral 248M SFT V4           32K /     GB     130
TinyMistral 248M V2 Instruct           32K / 0.5 GB     3705
...al V2 Pycoder Instruct 248M V1      32K / 0.5 GB     91
...istral Magicoder Instruct 248M      32K / 0.5 GB     31
TinyBagel 248M                         32K / 0.5 GB     160
...mistral 248M Hypnosis Instruct      32K / 0.5 GB     110
Dans StructureEvaluator Small          32K / 0.6 GB     20220
TinyMistral 248M                       32K / 1 GB       495228

TinyMistral V2.5 MiniPile Guidelines E1 Parameters and Internals

LLM Name: TinyMistral V2.5 MiniPile Guidelines E1
Repository: Open on 🤗
Base Model(s): TinyMistral 248M V2.5 (Locutusque/TinyMistral-248M-v2.5)
Model Size: 248m
Required VRAM: 0.6 GB
Updated: 2024-02-29
Maintainer: Dans-DiscountModels
Model Type: mistral
Model Files: 0.6 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32005
Initializer Range: 0.02
Torch Data Type: bfloat16
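The listed figures are mutually consistent: a model with roughly 248M parameters (the "Model Size: 248m" field) stored in bfloat16 takes 2 bytes per parameter, or about 0.5 GB for the weights alone, which lines up with the 0.6 GB file size and VRAM figure once embeddings and serialization overhead are included. A minimal arithmetic sketch, assuming the parameter count is exactly 248 million:

```python
# Rough weight-storage estimate from the listed metadata.
# 248M parameters is an assumption taken from the "Model Size: 248m" field;
# bfloat16 occupies 2 bytes per parameter, float32 occupies 4.
def weight_memory_gb(num_params: int, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in decimal gigabytes."""
    return num_params * bytes_per_param / 1e9

params = 248_000_000
print(f"bfloat16: {weight_memory_gb(params):.2f} GB")     # ~0.50 GB
print(f"float32:  {weight_memory_gb(params, 4):.2f} GB")  # ~0.99 GB
```

This is only the static weight footprint; actual VRAM use at inference time is higher because of activations and the KV cache, which grows with the 32768-token context length.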
Original data from HuggingFace, OpenCompass, and various public git repos.
Release v2024022003