30B Epsilon by CalderaAI


Tags: Adventure, Alpaca, Autotrain compatible, CoT, Endpoints compatible, Hippogriff, Instruct, Llama, Manticore, Merge, Mix, PyTorch, Region: US, Roleplay, RP, Sharded, Story, SuperCOT, SuperHOT, Uncensored, Vicuna, WizardLM

Model Card on HF 🤗: https://huggingface.co/CalderaAI/30B-Epsilon

30B Epsilon Benchmarks

30B Epsilon (CalderaAI/30B-Epsilon)

30B Epsilon Parameters and Internals

Model Type: instruct, general purpose, uncensored

Use Cases
Areas: text generation, adventure roleplay, storytelling
Primary Use Cases: text-based adventure games, creative storytelling

Additional Notes
Model assembled from handpicked models and LoRAs. Allows personalized instruction following through contextual memory input in specialized UIs.

Training Details
Methodology: experimental use of LoRAs on language models and model merges.

Input Output
Input Format: Alpaca format
Accepted Modalities: text
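Since the card specifies Alpaca-format input, a prompt builder can make the expected layout concrete. This is a hedged sketch using the standard Alpaca template; the exact wording this particular merge was tuned on is an assumption.

```python
# Sketch of the standard Alpaca prompt layout (template wording is the
# common Alpaca default, an assumption for this specific merge).
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Return an Alpaca-style prompt, with or without an input block."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_alpaca_prompt("Continue the adventure: the party enters a ruined keep."))
```

The model's completion is then read from the text generated after the final `### Response:` marker.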
LLM Name: 30B Epsilon
Repository 🤗: https://huggingface.co/CalderaAI/30B-Epsilon
Model Size: 30b
Required VRAM: 65.2 GB
Updated: 2025-02-23
Maintainer: CalderaAI
Model Type: llama
Model Files: 16.9 GB | 9.8 GB: 1-of-7 | 10.0 GB: 2-of-7 | 9.9 GB: 3-of-7 | 9.9 GB: 4-of-7 | 9.9 GB: 5-of-7 | 10.0 GB: 6-of-7 | 5.7 GB: 7-of-7
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
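With only a 2048-token context window, long roleplay or adventure histories must be trimmed before each generation. Below is a hedged sketch of a most-recent-turns-first trimming strategy; `count_tokens` is a crude whitespace proxy (an assumption), and in practice you would count real tokens with the model's LlamaTokenizer.

```python
# Sketch: fit system prompt + recent history into a 2048-token window,
# reserving headroom for the model's reply.
def count_tokens(text: str) -> int:
    # Whitespace-split proxy for token count (assumption; use the real
    # tokenizer for accurate budgeting).
    return len(text.split())

def trim_history(system: str, turns: list[str], max_tokens: int = 2048,
                 reserve_for_reply: int = 256) -> list[str]:
    """Drop the oldest turns until the prompt fits the context window."""
    budget = max_tokens - reserve_for_reply - count_tokens(system)
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):  # walk from newest to oldest
        cost = count_tokens(turn)
        if used + cost > budget:
            break
        kept.append(turn)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

This matches the card's note about "contextual memory input": specialized UIs typically pin the persona/system text and let older turns fall out of the window first.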

Quantized Models of the 30B Epsilon

Model            | Likes | Downloads | VRAM
30B Epsilon GGUF | 5     | 355       | 13 GB
30B Epsilon AWQ  | 3     | 17        | 17 GB
30B Epsilon GPTQ | 6     | 40        | 16 GB
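The VRAM figures above follow directly from parameter count and bit width. A back-of-envelope sketch (the 32.5e9 parameter count for a LLaMA-30B-class model is an assumption; activations and KV cache are excluded):

```python
# Approximate weight memory for an N-parameter model at a given bit width.
def weight_gb(n_params: float, bits: int) -> float:
    return n_params * bits / 8 / 1e9

PARAMS = 32.5e9  # assumed parameter count for a LLaMA-30B-class model
print(f"fp16 : {weight_gb(PARAMS, 16):.1f} GB")  # ~65 GB, matching the card's 65.2 GB
print(f"8-bit: {weight_gb(PARAMS, 8):.1f} GB")
print(f"4-bit: {weight_gb(PARAMS, 4):.1f} GB")   # in line with the ~13-17 GB quants above
```

Quantized variants add per-group scales and other overhead, which is why the listed GGUF/AWQ/GPTQ sizes vary a few GB around the raw 4-bit estimate.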

Best Alternatives to 30B Epsilon

Best Alternatives                 | Context / RAM   | Downloads | Likes
Flash Llama 30M 20001             | 32K / 0.1 GB    | 2077      | 0
Smaug Slerp 30B V0.1              | 32K / 60.4 GB   | 18        | 0
Llama33b 16K                      | 16K / 65.2 GB   | 18        | 1
Yayi2 30B Llama                   | 4K / 121.2 GB   | 2020      | 22
... Tokens By Perplexity Bottom K | 4K / 5.4 GB     | 91        | 0
...via Sample With Temperature2.0 | 4K / 5.4 GB     | 65        | 0
...lue Sample With Temperature2.0 | 4K / 5.4 GB     | 64        | 0
... Tokens By Writing Style Top K | 4K / 5.4 GB     | 5         | 0
Yayi2 30B Guanaco                 | 4K / 60.6 GB    | 10        | 2
Llama 30B Instruct 2048           | 2K / 65.2 GB    | 3828      | 102
Note: green Score (e.g. "73.2") means that the model is better than CalderaAI/30B-Epsilon.

Rank the 30B Epsilon Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227