TinyMistral 248M Chat V2 by Felladrin


Tags: Autotrain compatible, Conversational, Instruct, Endpoints compatible, Has space, Safetensors, Model-index, Mistral, En, Region:us, License:apache-2.0
Base model: locutusque/tinymist...
Datasets: cohereforai/aya datase..., databricks/databricks-..., euclaise/reddit-instru..., felladrin/chatml-aya d..., felladrin/chatml-capyb..., felladrin/chatml-datab..., felladrin/chatml-deita..., felladrin/chatml-openo..., felladrin/chatml-reddi..., felladrin/chatml-ultra..., hkust-nlp/deita-10k-v0, huggingfaceh4/ultracha..., ldjnr/capybara, open-orca/openorca

TinyMistral 248M Chat V2 (Felladrin/TinyMistral-248M-Chat-v2)

Best Alternatives to TinyMistral 248M Chat V2

Best Alternatives                      Context / RAM    Downloads / Likes
TinyMistral 248M V2 Instruct           32K / 0.5 GB     506
...istral Magicoder Instruct 248M      32K / 0.5 GB     142
...al V2 Pycoder Instruct 248M V1      32K / 0.5 GB     161
...mistral 248M Hypnosis Instruct      32K / 0.5 GB     220
Frankie Tiny                           32K / 0.6 GB     240
TinyMistral 248M Instruct              32K / 1 GB       22598
...stral V2 Pycoder Instruct 248M      32K / 1 GB       352
Tinymistral Mediqa 248M                32K / 1 GB       180
...struct Oasst2 ChatML V1 DPO V3      32K / 1 GB       130

TinyMistral 248M Chat V2 Parameters and Internals

LLM Name: TinyMistral 248M Chat V2
Repository: Felladrin/TinyMistral-248M-Chat-v2 (Hugging Face)
Base Model(s): TinyMistral 248M (Locutusque/TinyMistral-248M)
Model Size: 248M
Required VRAM: 1 GB
Updated: 2024-04-18
Maintainer: Felladrin
Model Type: mistral
Instruction-Based: Yes
Model Files: 1 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: [PAD]
Vocabulary Size: 32005
Initializer Range: 0.02
Torch Data Type: float32
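
Given the internals above (MistralForCausalLM, 2048-token context, float32 weights, ChatML-style fine-tuning datasets), the model can be loaded with the listed Transformers version. The following is a minimal usage sketch, not the maintainer's official example; it assumes the repository's tokenizer bundles a chat template, and the prompt text and generation settings are illustrative only.

```python
# Hypothetical usage sketch for Felladrin/TinyMistral-248M-Chat-v2.
# Assumes transformers >= 4.38 and that the repo's tokenizer ships a chat
# template (the fine-tuning datasets listed above are ChatML-formatted).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Felladrin/TinyMistral-248M-Chat-v2"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float32,  # matches the listed Torch Data Type; ~1 GB of weights
)

# Build the prompt through the tokenizer's template rather than hand-writing chat tags.
messages = [{"role": "user", "content": "What is a 248M-parameter chat model good for?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Keep prompt + completion within the 2048-token context/max length listed above.
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.pad_token_id,  # "[PAD]" per the internals table
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading in float16 instead of the listed float32 would roughly halve the memory footprint relative to the 1 GB figure, at a small precision cost.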


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024040901