LLM Name | V6 Finch 3B HF |
Repository 🤗 | https://huggingface.co/RWKV/v6-Finch-3B-HF |
Model Size | 3B |
Required VRAM | 6.2 GB |
Updated | 2024-09-08 |
Maintainer | RWKV |
Model Type | rwkv6 |
Model Files | |
Model Architecture | Rwkv6ForCausalLM |
License | apache-2.0 |
Transformers Version | 4.34.0 |
Tokenizer Class | Rwkv6Tokenizer |
Vocabulary Size | 65536 |
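The metadata above names the registered architecture (Rwkv6ForCausalLM) and tokenizer class (Rwkv6Tokenizer). Below is a minimal loading sketch, assuming the standard transformers AutoModel API; since the RWKV v6 classes are shipped as custom code on the Hub, `trust_remote_code=True` is assumed to be required, and float16 is chosen to match the ~6.2 GB VRAM figure listed above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/v6-Finch-3B-HF"

# Assumption: custom Rwkv6 code on the Hub requires trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # roughly matches the 6.2 GB VRAM requirement above
    trust_remote_code=True,
).to("cuda")

prompt = "The RWKV architecture differs from a Transformer in that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```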
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Rwkv 6 World 3B V2.1 | 0K / 6.2 GB | 103 | 0
Rwkv 6 World 3B | 0K / 6.2 GB | 10 | 3
Rwkv | 0K / 0 GB | 56 | 0