WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K by Panchovix


Tags: Autotrain compatible, Endpoints compatible, Ext 8k, Llama, PyTorch, Region: us, Sharded, Uncensored

WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K Benchmarks

[Benchmark chart: scores are percentages indicating how the model compares to the reference models Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o") and GPT-4 ("gpt4").]

WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K Parameters and Internals

LLM Name: WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K
Repository: 🤗 https://huggingface.co/Panchovix/WizardLM-Uncensored-SuperCOT-StoryTelling-30b-SuperHOT-8k
Model Size: 30B
Required VRAM: 65.2 GB
Updated: 2024-09-19
Maintainer: Panchovix
Model Type: llama
Model Files: 7 shards (1-of-7: 9.8 GB, 2-of-7: 10.0 GB, 3-of-7: 9.9 GB, 4-of-7: 9.9 GB, 5-of-7: 9.9 GB, 6-of-7: 10.0 GB, 7-of-7: 5.7 GB)
Context Length (extended via SuperHOT): 8K
Model Architecture: LlamaForCausalLM
License: other
Context Length (base config): 2048
Model Max Length: 2048
Transformers Version: 4.30.2
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32001
Torch Data Type: float16
WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K (Panchovix/WizardLM-Uncensored-SuperCOT-StoryTelling-30b-SuperHOT-8k)
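
Given the listed architecture (LlamaForCausalLM), tokenizer class and sharded float16 weights, the repository can in principle be loaded with the Hugging Face transformers library. The sketch below is illustrative rather than the maintainer's documented recipe: in particular, the rope_scaling argument is an assumption, since SuperHOT 8K merges typically need linear RoPE scaling with a factor of 4 (2048 × 4 = 8192) applied at load time, while the shipped config still reports a 2048-token maximum length.

```python
# Minimal loading sketch (assumptions: transformers >= 4.31 for rope_scaling,
# accelerate installed for device_map="auto", enough GPU/CPU memory for the
# ~65 GB of float16 weights, and that linear RoPE scaling x4 is the right way
# to reach the advertised 8K context).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Panchovix/WizardLM-Uncensored-SuperCOT-StoryTelling-30b-SuperHOT-8k"

# LlamaTokenizer with vocabulary size 32001 and <s>/</s>/<unk> special tokens
tokenizer = AutoTokenizer.from_pretrained(repo_id)

model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,                       # matches the listed torch dtype
    device_map="auto",                               # spread the 7 shards across devices
    rope_scaling={"type": "linear", "factor": 4.0},  # assumption: extend 2048 -> 8192
)

prompt = "Write the opening paragraph of a story about a lighthouse keeper."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Exllama-style loaders expose the same idea as compress_pos_emb = 4 instead of a rope_scaling dictionary.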

Quantized Models of the WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K

...ryTelling 30B SuperHOT 8K GPTQ: 47 likes, 26 downloads, 16 GB VRAM
...ryTelling 30B SuperHOT 8K Fp16: 8 likes, 12 downloads, 65 GB VRAM
...lling 30B SuperHOT 8K 4bit 32g: 1 like, 7 downloads, 19 GB VRAM
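
The quantized repositories are listed with truncated names above, so the repository id in the sketch below is a hypothetical placeholder; substitute the actual GPTQ repo. Assuming the roughly 16 GB 4-bit GPTQ export ships a quantization_config in its config.json, transformers (with the optimum and auto-gptq back ends installed) can resolve the quantization automatically:

```python
# Sketch of loading a ~16 GB 4-bit GPTQ export on a single 24 GB GPU.
# The repo id below is a hypothetical placeholder, not a path from this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

gptq_repo = "someuser/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-SuperHOT-8K-GPTQ"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(gptq_repo)
model = AutoModelForCausalLM.from_pretrained(
    gptq_repo,
    device_map="auto",  # quantization settings are read from the repo's config
)
```

By contrast, the Fp16 variant (about 65 GB) calls for multiple GPUs or CPU offload, while the 4-bit 32g export lands at roughly 19 GB.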

Best Alternatives to WizardLM Uncensored SuperCOT StoryTelling 30B SuperHOT 8K

...nsored Instruct PL Lora Unload: 2K context / 65.2 GB, 691 downloads, 0 likes
Wizard Vicuna 30B Uncensored: 2K context / 130.5 GB, 2377 downloads, 138 likes
Vicuzard 30B Uncensored: 2K context / 64.9 GB, 734 downloads, 11 likes
...ored SuperCOT StoryTelling 30B: 2K context / 65.2 GB, 1397 downloads, 45 likes
WizardLM 30B Uncensored: 2K context / 65.2 GB, 735 downloads, 138 likes
...ncensored Guanaco SuperCOT 30B: 2K context / 65.2 GB, 696 downloads, 24 likes
...ardLM OpenAssistant 30B Native: 2K context / 64.1 GB, 684 downloads, 0 likes
Flash Llama 30M 20001: 32K context / 0.1 GB, 702 downloads, 0 likes
Smaug Slerp 30B V0.1: 32K context / 60.4 GB, 61 downloads, 0 likes
Llama33b 16K: 16K context / 65.2 GB, 6 downloads, 1 like

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803