LLM Name | Distilled Chat Ocra
---|---
Repository 🤗 | https://huggingface.co/Tejasvisudugureddy/distilled_chat_ocra
Model Size | 134.5M params
Required VRAM | 0.5 GB
Updated | 2025-02-22
Maintainer | Tejasvisudugureddy
Model Type | llama
Model Architecture | LlamaForCausalLM
Context Length | 8192
Model Max Length | 8192
Transformers Version | 4.45.2
Tokenizer Class | GPT2Tokenizer
Padding Token | <\|im_end\|>
Vocabulary Size | 49152
Torch Data Type | float32
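Given the Repository, Model Architecture, and Tokenizer Class fields above, loading this checkpoint with 🤗 Transformers should follow the standard `AutoModelForCausalLM` path. A minimal sketch, assuming the repo ID listed above is publicly accessible and that default generation settings are acceptable (the prompt text is illustrative, not from the model card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo ID taken from the Repository field of the card above.
model_id = "Tejasvisudugureddy/distilled_chat_ocra"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to GPT2Tokenizer per the card
model = AutoModelForCausalLM.from_pretrained(        # resolves to LlamaForCausalLM per the card
    model_id,
    torch_dtype=torch.float32,                       # matches the Torch Data Type field
)

# At ~134.5M parameters in float32, the weights alone take roughly 0.5 GB,
# consistent with the Required VRAM field.
prompt = "Hello! What can you tell me about distilled language models?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```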
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Turn Detector | 8K / 0.5 GB | 4871 | 28
SmolLM FT CoEdIT | 2K / 0.5 GB | 158 | 0