Tinymistral V2 Pycoder Instruct 248M by jtatman


Tags: Autotrain compatible · Code · Dataset: jtatman/pile_python_instruct_format · Dataset: jtatman/python-code-dataset-500k · Dataset: jtatman/python-github-code-instruct-filtered-5k · Endpoints compatible · Instruct · Mistral · Region: us · Safetensors

Tinymistral V2 Pycoder Instruct 248M Benchmarks

nn.n% indicates how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Tinymistral V2 Pycoder Instruct 248M (jtatman/tinymistral-v2-pycoder-instruct-248m)

Tinymistral V2 Pycoder Instruct 248M Parameters and Internals

Model Type 
MistralForCausalLM
Use Cases 
Primary Use Cases:
Generate Python code.
Additional Notes 
Both this model and its base model are in active development; outputs should be treated with caution.
Training Details 
Data Sources:
jtatman/python-code-dataset-500k, jtatman/python-github-code-instruct-filtered-5k, jtatman/pile_python_instruct_format
Methodology:
Conversion to alpaca/instruct format.
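The exact prompt template is not reproduced on this page; the snippet below is a minimal sketch of a typical alpaca-style instruction prompt, with the field layout assumed rather than confirmed by the model card.

# Hypothetical alpaca-style instruct prompt; the exact training template is an assumption.
PROMPT_TEMPLATE = (
    "### Instruction:\n"
    "{instruction}\n\n"
    "### Response:\n"
)
prompt = PROMPT_TEMPLATE.format(
    instruction="Write a Python function that reverses a string."
)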
LLM Name: Tinymistral V2 Pycoder Instruct 248M
Repository: https://huggingface.co/jtatman/tinymistral-v2-pycoder-instruct-248m
Model Size: 248m
Required VRAM: 1 GB
Updated: 2024-12-22
Maintainer: jtatman
Model Type: mistral
Instruction-Based: Yes
Model Files: 1.0 GB
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 32009
Torch Data Type: float32
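Putting the specs above together (Safetensors weights, LlamaTokenizer, <|endoftext|> padding token, 32768-token context, roughly 1 GB of float32 weights), a minimal loading-and-generation sketch with Hugging Face transformers could look like the following; the prompt format and generation settings are illustrative assumptions, not values taken from the model card.

# Minimal sketch: load the model and generate Python code.
# Assumes transformers >= 4.36 and about 1 GB of free VRAM/RAM (float32 weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "jtatman/tinymistral-v2-pycoder-instruct-248m"

tokenizer = AutoTokenizer.from_pretrained(repo_id)      # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(           # MistralForCausalLM architecture
    repo_id,
    torch_dtype=torch.float32,
)

# Assumed alpaca-style instruct prompt (see the sketch above).
prompt = (
    "### Instruction:\n"
    "Write a Python function that reverses a string.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(
    **inputs,
    max_new_tokens=128,                    # illustrative; the context window is 32768 tokens
    do_sample=False,
    pad_token_id=tokenizer.pad_token_id,   # <|endoftext|> per the card
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))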

Best Alternatives to Tinymistral V2 Pycoder Instruct 248M

Best Alternatives | Context / RAM | Downloads | Likes
TinyMistral 248M V2.5 Instruct | 32K / 1 GB | 27 | 11
Tinymistv1 | 32K / 0.5 GB | 17 | 0
TinyMistral 248M Instruct | 32K / 1 GB | 28 | 11
...istral 248M V2.5 Instruct Orpo | 32K / 0.5 GB | 19 | 0
TinyMistral 248M V2 Instruct | 32K / 0.5 GB | 46 | 7
...mistral 248M Hypnosis Instruct | 32K / 0.5 GB | 13 | 1
...al V2 Pycoder Instruct 248M V1 | 32K / 0.5 GB | 24 | 1
...istral Magicoder Instruct 248M | 32K / 0.5 GB | 18 | 2
Tinymistv1 | 32K / 0.5 GB | 0 | 0
TinyMistral 248M Chat V2 | 2K / 1 GB | 904 | 26
Note: green Score (e.g. "73.2") means that the model is better than jtatman/tinymistral-v2-pycoder-instruct-248m.

Rank the Tinymistral V2 Pycoder Instruct 248M Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217