| Field | Value |
|---|---|
| LLM Name | Unakar1.5B Base |
| Repository 🤗 | https://huggingface.co/unakar/Unakar1.5B-base |
| Base Model(s) | |
| Model Size | 1.5b |
| Required VRAM | 3 GB |
| Updated | 2024-09-20 |
| Maintainer | unakar |
| Model Type | internlm2 |
| Model Files | |
| Supported Languages | en |
| Model Architecture | InternLM2ForCausalLM |
| Context Length | 16384 |
| Model Max Length | 16384 |
| Transformers Version | 4.25.1 |
| Is Biased | 0 |
| Tokenizer Class | InternLM2Tokenizer |
| Padding Token | `</s>` |
| Vocabulary Size | 92544 |
| Torch Data Type | bfloat16 |
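
Below is a minimal loading sketch based on the table above. It assumes the standard Hugging Face Transformers API; because the architecture is InternLM2ForCausalLM, the checkpoint ships custom modeling code and must be loaded with `trust_remote_code=True`. The prompt and `max_new_tokens` value are illustrative assumptions, not from the card.

```python
# Minimal sketch: load Unakar1.5B-base with Transformers (assumptions noted above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "unakar/Unakar1.5B-base"

# InternLM2 uses custom modeling/tokenizer code, hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    trust_remote_code=True,
)

# Illustrative generation call; prompt and length are arbitrary examples.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading in bfloat16 keeps the memory footprint near the 3 GB VRAM figure listed above (roughly 2 bytes per parameter for a 1.5B-parameter model).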