| Best Alternatives | HF Rank | Context/RAM | Downloads | Likes |
|---|---|---|---|---|
| Bitnet B1 58 3B Quantized | — | 2K / 1.1 GB | 88 | 12 |
| Bitnet B1 58 3B Coder | — | 2K / 6.6 GB | 41 | 1 |
| Bitnet 5B V1 | — | 2K / 9.2 GB | 4 | 1 |
| Bitnet B1 58 Large Q8 0 Gguf | — | 2K / 0.8 GB | 198 | 0 |
| LLM Name | Bitnet B1 58 3B |
|---|---|
| Repository | Open on Hugging Face |
| Model Size | 3b |
| Required VRAM | 13.3 GB |
| Updated | 2024-07-01 |
| Maintainer | 1bitLLM |
| Model Type | llama |
| Model Files | |
| Model Architecture | BitnetForCausalLM |
| License | mit |
| Context Length | 2048 |
| Model Max Length | 2048 |
| Transformers Version | 4.39.0 |
| Tokenizer Class | BitnetTokenizer |
| Padding Token | `<pad>` |
| Vocabulary Size | 32002 |
| Initializer Range | 0.02 |
| Torch Data Type | float16 |
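As a rough sanity check on the RAM/VRAM figures in the tables above, the memory a dense checkpoint occupies is approximately parameter count × bytes per parameter. A minimal sketch, assuming a parameter count of about 3.3B for this 3B-class model (the exact count, the `weight_memory_gb` helper, and the quantized byte width are illustrative assumptions, not values from the tables):

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate checkpoint size in GB: parameters x bytes per parameter."""
    return n_params * bytes_per_param / 1e9

# ~3.3B parameters (assumed) stored as float16 (2 bytes each)
print(round(weight_memory_gb(3.3e9, 2), 1))  # -> 6.6

# the same weights at ~1.58 bits per parameter (~0.2 bytes each),
# the packing BitNet b1.58 targets after quantization
print(round(weight_memory_gb(3.3e9, 0.2), 2))
```

This kind of estimate explains the spread in the Context/RAM column: the quantized variants above fit in roughly 1 GB, while float16 checkpoints of the same architecture need several times that.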