LLM Name | Mis |
Repository 🤗 | https://huggingface.co/bachngo/mis |
Model Size | 7.3b |
Required VRAM | 14.6 GB |
Updated | 2025-04-07 |
Maintainer | bachngo |
Model Type | mistral |
Model Files | |
Model Architecture | MistralForCausalLM |
Context Length | 32768 |
Model Max Length | 32768 |
Transformers Version | 4.34.0 |
Tokenizer Class | LlamaTokenizer |
Padding Token | <unk> |
Vocabulary Size | 38369 |
Torch Data Type | bfloat16 |
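The listed 14.6 GB of required VRAM is consistent with holding roughly 7.3B parameters at 2 bytes each in bfloat16 (7.3 × 10⁹ × 2 bytes ≈ 14.6 GB), before activation and KV-cache overhead. Below is a minimal loading sketch, assuming the bachngo/mis repository is public and loads with the standard transformers Auto classes at the listed transformers version or later; the model ID and dtype come from the card above, while the prompt and generation settings are purely illustrative.

```python
# Minimal sketch for loading bachngo/mis with transformers (assumption: the
# repository is public and compatible with transformers >= 4.34.0).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bachngo/mis"

# The card lists LlamaTokenizer; AutoTokenizer resolves the concrete class
# from the repository's tokenizer_config.json.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 weights for ~7.3B parameters occupy roughly 14.6 GB, matching the
# "Required VRAM" figure above. device_map="auto" requires the accelerate
# package; drop it to load onto a single device manually.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative generation call; prompt and max_new_tokens are arbitrary.
prompt = "Hello"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```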
Best Alternatives | Context / RAM | Downloads | Likes |
---|---|---|---|
Dictalm2.0 Instruct | 32K / 14.5 GB | 9091 | 22 |
Dictalm2.0 | 32K / 14.5 GB | 12085 | 16 |
My Tokenizer 100 10000 | 32K / 22.3 GB | 11 | 0 |
My Tokenizer 50 20000 | 32K / 14.6 GB | 7 | 0 |
Dictalm2 It Qa Fine Tune | 32K / 14.5 GB | 3140 | 5 |
Dictalm2.0 Instruct Fine Tuned | 32K / 14.5 GB | 5826 | 0 |
... Fine Tuned Alpaca Gpt4 Hebrew | 32K / 14.5 GB | 4495 | 0 |
Misjava Api 060924 V3 Merged | 32K / 14.6 GB | 11 | 0 |
Quietstar 8 Ahead | 32K / 14.5 GB | 129 | 90 |
Swallow Hermes St V1 | 4K / 14.6 GB | 11 | 14 |