WizardLM Uncensored SuperCOT StoryTelling 30B by Monero


Tags: Autotrain compatible · Endpoints compatible · Llama · PyTorch · Region: us · Sharded · Uncensored

WizardLM Uncensored SuperCOT StoryTelling 30B Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

WizardLM Uncensored SuperCOT StoryTelling 30B Parameters and Internals

LLM Name: WizardLM Uncensored SuperCOT StoryTelling 30B
Repository 🤗: https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b
Model Size: 30b
Required VRAM: 65.2 GB
Updated: 2024-09-19
Maintainer: Monero
Model Type: llama
Model Files: 4.0 GB (1-of-17), 3.9 GB (2-of-17), 3.8 GB (3-of-17), 4.0 GB (4-of-17), 3.8 GB (5-of-17), 3.8 GB (6-of-17), 3.9 GB (7-of-17), 3.8 GB (8-of-17), 4.0 GB (9-of-17), 3.8 GB (10-of-17), 3.8 GB (11-of-17), 3.9 GB (12-of-17), 3.8 GB (13-of-17), 4.0 GB (14-of-17), 3.8 GB (15-of-17), 3.8 GB (16-of-17), 3.3 GB (17-of-17), 0.0 GB
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32001
Torch Data Type: float16
WizardLM Uncensored SuperCOT StoryTelling 30B (Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b)
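The "Required VRAM" entry is simply the total size of the 17 float16 checkpoint shards listed above. A quick sketch (shard sizes copied from the model-files row) confirms the arithmetic:

```python
# Shard sizes in GB, as listed in the model-files row (1-of-17 .. 17-of-17).
shard_sizes_gb = [4.0, 3.9, 3.8, 4.0, 3.8, 3.8, 3.9, 3.8, 4.0,
                  3.8, 3.8, 3.9, 3.8, 4.0, 3.8, 3.8, 3.3]

# Total footprint of the float16 weights (round() absorbs float error).
total_gb = round(sum(shard_sizes_gb), 1)
print(total_gb)  # 65.2 — matches the "Required VRAM" entry
```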

Quantized Models of the WizardLM Uncensored SuperCOT StoryTelling 30B

| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| ...SuperCOT StoryTelling 30B GGUF | 257 | 477 | 13 GB |
| ...SuperCOT StoryTelling 30B GPTQ | 84 | 86 | 16 GB |
| ...SuperCOT StoryTelling 30B AWQ | 5 | 28 | 17 GB |
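Taking the quantized VRAM figures at face value (13 GB GGUF, 16 GB GPTQ, 17 GB AWQ — reconstructed from a flattened table, so treat them as approximate), the effective bits per weight can be estimated against the 65.2 GB float16 baseline:

```python
# Rough bits-per-weight estimate: quantized size relative to the
# 65.2 GB float16 (16-bit) checkpoint. VRAM figures are assumptions
# read from the quantized-models table above.
FP16_GB = 65.2

quant_vram_gb = {"GGUF": 13, "GPTQ": 16, "AWQ": 17}

bits_per_weight = {name: round(gb / FP16_GB * 16, 1)
                   for name, gb in quant_vram_gb.items()}
print(bits_per_weight)  # {'GGUF': 3.2, 'GPTQ': 3.9, 'AWQ': 4.2}
```

All three therefore sit in the roughly 3–4 bit range typical of consumer-GPU quantizations of a 30B model.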

Best Alternatives to WizardLM Uncensored SuperCOT StoryTelling 30B

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...nsored Instruct PL Lora Unload | 2K / 65.2 GB | 691 | 0 |
| Wizard Vicuna 30B Uncensored | 2K / 130.5 GB | 2377 | 138 |
| Vicuzard 30B Uncensored | 2K / 64.9 GB | 734 | 11 |
| WizardLM 30B Uncensored | 2K / 65.2 GB | 735 | 138 |
| ...ncensored Guanaco SuperCOT 30B | 2K / 65.2 GB | 696 | 24 |
| ...ardLM OpenAssistant 30B Native | 2K / 64.1 GB | 684 | 0 |
| ...T StoryTelling 30B SuperHOT 8K | 2K / 65.2 GB | 6 | 2 |
| Flash Llama 30M 20001 | 32K / 0.1 GB | 702 | 0 |
| Smaug Slerp 30B V0.1 | 32K / 60.4 GB | 61 | 0 |
| Llama33b 16K | 16K / 65.2 GB | 6 | 1 |
Note: a green score (e.g. "73.2") means the model outperforms Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b.
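With only a 2048-token context (versus the 8K–32K windows of some alternatives above), long story prompts must be truncated or split. A minimal sketch of overlapping-window chunking, using whitespace word counts as a crude stand-in for real LlamaTokenizer token counts:

```python
def chunk_prompt(text: str, max_tokens: int = 2048, overlap: int = 128) -> list[str]:
    """Split text into overlapping chunks that fit a fixed context window.

    Uses whitespace "words" as a rough token proxy; a real pipeline would
    count tokens with the model's LlamaTokenizer instead.
    """
    words = text.split()
    if len(words) <= max_tokens:
        return [" ".join(words)]
    chunks, step = [], max_tokens - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_tokens]))
        if start + max_tokens >= len(words):
            break
    return chunks

# A 5000-"token" story needs three 2048-token windows with 128-token overlap.
story = " ".join(f"w{i}" for i in range(5000))
print(len(chunk_prompt(story)))  # 3
```

The overlap preserves some continuity between windows, which matters for a storytelling model where each chunk should pick up where the previous one left off.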

Rank the WizardLM Uncensored SuperCOT StoryTelling 30B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Looking for other open-source LLMs or SLMs? 36073 models in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803