Rwkv 6 World 3B V2.1 by RWKV


Tags: Autotrain compatible | Custom code | PyTorch | Region: US | Rwkv6

Rwkv 6 World 3B V2.1 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Rwkv 6 World 3B V2.1 (RWKV/rwkv-6-world-3b-v2.1)

Rwkv 6 World 3B V2.1 Parameters and Internals

Model Type 
causal language model
Additional Notes 
RWKV is an RNN-based language model that achieves performance comparable to similarly sized transformers while running with linear time complexity and constant memory per token during inference.
Input Output 
Input Format:
Instruction: <instruction>
Input: <input>
Response:
Accepted Modalities: text
Output Format: Assistant: <response>
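The instruction/input/response template above can be assembled with a small helper; a minimal sketch (the `build_prompt` function and its parameter names are illustrative, not part of the model's tooling):

```python
def build_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble a prompt in the World-series format:
    Instruction / Input / Response, each field on its own line."""
    return (f"Instruction: {instruction}\n"
            f"Input: {input_text}\n"
            "Response:")

# Example usage: the model is expected to continue after "Response:".
prompt = build_prompt("Summarize the text.", "RWKV is an RNN-based model.")
print(prompt)
```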
LLM Name: Rwkv 6 World 3B V2.1
Repository: https://huggingface.co/RWKV/rwkv-6-world-3b-v2.1
Model Size: 3B
Required VRAM: 6.2 GB
Updated: 2025-01-17
Maintainer: RWKV
Model Type: rwkv6
Model Files: 6.2 GB
Model Architecture: Rwkv6ForCausalLM
Transformers Version: 4.34.0
Tokenizer Class: Rwkv6Tokenizer
Vocabulary Size: 65536
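Given the metadata above (custom code, Rwkv6ForCausalLM, Transformers 4.34.0), the model can be loaded through the standard transformers auto classes; a hedged sketch, assuming network access to the Hugging Face Hub and enough free memory for the ~6.2 GB of weights:

```python
REPO_ID = "RWKV/rwkv-6-world-3b-v2.1"  # repository listed above

def load_rwkv6(repo_id: str = REPO_ID):
    """Load tokenizer and model via the transformers auto classes.

    trust_remote_code=True is required because the Rwkv6ForCausalLM
    implementation ships as custom code in the Hub repository
    (hence the "Custom code" tag on this page).
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
    return tokenizer, model
```

Calling `load_rwkv6()` downloads the weights on first use; generation then follows the usual pattern of tokenizing a prompt with `return_tensors="pt"` and passing the ids to `model.generate`.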

Best Alternatives to Rwkv 6 World 3B V2.1

Best Alternatives       Context / RAM   Downloads   Likes
V6 Finch 3B HF          0K / 6.2 GB     454         3
Rwkv 6 World 3B         0K / 6.2 GB     15          3
Rwkv                    0K / 0 GB       73          0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227