Etheria 55B V0.1 by Steelskull

Tags: Merged Model, Arxiv:2306.01708, Arxiv:2311.03099, Autotrain compatible, Endpoints compatible, Etheria, Llama, Model-index, Region:us, Safetensors, Sharded, Tensorflow

Etheria 55B V0.1 Benchmarks

Etheria 55B V0.1 (Steelskull/Etheria-55b-v0.1)

Etheria 55B V0.1 Parameters and Internals

Model Type 
text-generation
Additional Notes 
An attempt at a functional Goliath-style merge: two Yi-34B-200K models are stacked to create Etheria 55B-200K (see the layer-stacking sketch below). Because of how the merge is built, it should in theory retain the 200K context window, but starting at a 32K context is recommended.
Input Output 
Input Format:
ChatML & Alpaca
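The card lists ChatML and Alpaca as the expected input formats. Below is a minimal sketch of both prompt templates; the system/instruction wording is illustrative and not taken from the model card.

```python
# Standard ChatML and Alpaca prompt templates (wording is illustrative).

def chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

def alpaca_prompt(instruction: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(chatml_prompt("You are a helpful assistant.", "Summarize this model card."))
```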
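For readers unfamiliar with Goliath-style ("passthrough" / frankenmerge) merging, the sketch below shows the general idea: decoder layers from two donor checkpoints are stacked into one deeper model. This is a minimal illustration under assumed donor repos and layer ranges, not the actual Etheria recipe (which was most likely produced with a merge toolkit such as mergekit).

```python
# Minimal sketch of a Goliath-style "passthrough" (layer-stacking) merge.
# The donor repo ids and layer ranges are illustrative assumptions only.
import torch
from transformers import AutoModelForCausalLM

DONOR_A = "01-ai/Yi-34B-200K"   # placeholder donors; the actual merge used
DONOR_B = "01-ai/Yi-34B-200K"   # two Yi-34B-200K-based models

# Both donors are loaded on CPU in bf16 (needs roughly 140 GB of system RAM).
a = AutoModelForCausalLM.from_pretrained(DONOR_A, torch_dtype=torch.bfloat16)
b = AutoModelForCausalLM.from_pretrained(DONOR_B, torch_dtype=torch.bfloat16)

# Keep A's embeddings, final norm and lm_head; stack overlapping layer
# ranges from both donors to reach the larger depth (ranges are made up).
layers = list(a.model.layers[:40]) + list(b.model.layers[20:])
a.model.layers = torch.nn.ModuleList(layers)
a.config.num_hidden_layers = len(layers)

a.save_pretrained("etheria-55b-sketch")   # writes sharded safetensors

```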
LLM Name: Etheria 55B V0.1
Repository: 🤗 https://huggingface.co/Steelskull/Etheria-55b-v0.1
Merged Model: Yes
Model Size: 55b
Required VRAM: 111.2 GB
Updated: 2024-10-16
Maintainer: Steelskull
Model Type: llama
Model Files: 9.8 GB (1-of-12), 9.8 GB (2-of-12), 9.8 GB (3-of-12), 9.9 GB (4-of-12), 10.0 GB (5-of-12), 9.9 GB (6-of-12), 9.8 GB (7-of-12), 9.8 GB (8-of-12), 9.9 GB (9-of-12), 9.9 GB (10-of-12), 9.9 GB (11-of-12), 2.7 GB (12-of-12)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.37.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64002
Torch Data Type: bfloat16
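Given the fields above (sharded safetensors, LlamaForCausalLM architecture, bfloat16 weights, 200K max length), a minimal transformers loading sketch could look like the following. The device_map="auto" setting and the hardware needed for ~111 GB of bf16 weights are assumptions on my part; the short working context follows the card's own 32K recommendation.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Steelskull/Etheria-55b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(repo)   # LlamaTokenizer, vocab size 64002
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,   # matches the card's Torch Data Type
    device_map="auto",            # spread the ~111 GB of bf16 weights across devices
)

# The config advertises a 200K window, but the card recommends starting at 32K.
prompt = "### Instruction:\nExplain what a Goliath-style merge is.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```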

Quantized Models of Etheria 55B V0.1

Model | Likes | Downloads | VRAM
Etheria 55B V0.1 GGUF | 8 | 255 | 20 GB
Etheria 55B V0.1 GPTQ | 4 | 26 | 29 GB
Etheria 55B V0.1 AWQ | 1 | 14 | 30 GB
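The quantized variants above are the practical way to run the model on a single GPU. A minimal sketch for the GGUF build using llama-cpp-python follows; the local filename and the quant level are assumptions, so substitute whichever file from the GGUF repo fits your VRAM budget.

```python
from llama_cpp import Llama

llm = Llama(
    model_path="etheria-55b-v0.1.Q2_K.gguf",  # hypothetical local filename
    n_ctx=32768,       # the card recommends starting at 32K rather than the full 200K
    n_gpu_layers=-1,   # offload all layers to GPU if the ~20 GB quant fits
)

out = llm("### Instruction:\nSay hello.\n\n### Response:\n", max_tokens=64)
print(out["choices"][0]["text"])
```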

Best Alternatives to Etheria 55B V0.1

Best Alternatives | Context / RAM | Downloads | Likes
Etheria 55B V0.1 | 195K / 111.2 GB | 20 | 10
SG Raccoon Yi 55B 200K 2.0 | 195K / 111.3 GB | 32 | 6
SG Raccoon Yi 55B 200K | 195K / 111.4 GB | 53 | 4
SG Raccoon Yi 55B | 4K / 111.2 GB | 56 | 6
Etheria 55B V0.1 3.0bpw H6 EXL2 | 195K / 21.9 GB | 4 | 2
Etheria 55B V0.1 3.5bpw H6 EXL2 | 195K / 25.3 GB | 4 | 1
Etheria 55B V0.1 GPTQ | 195K / 29.2 GB | 26 | 4
Etheria 55B V0.1 AWQ | 195K / 30.2 GB | 14 | 1
SG Raccoon Yi 55B GPTQ | 4K / 29.2 GB | 28 | 1
SG Raccoon Yi 55B AWQ | 4K / 30.2 GB | 11 | 2
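As a sanity check on the Context / RAM column, weight size scales roughly with parameter count times bits per weight; the quick estimate below (weights only, ignoring KV cache and runtime overhead) lands close to the figures listed above.

```python
# Rough weights-only size: params * bits_per_weight / 8 bytes.
PARAMS = 55e9  # 55B parameters

for name, bpw in [("EXL2 3.0bpw", 3.0), ("EXL2 3.5bpw", 3.5),
                  ("GPTQ ~4bpw", 4.0), ("bf16", 16.0)]:
    gb = PARAMS * bpw / 8 / 1e9
    print(f"{name:12s} ~ {gb:6.1f} GB")

# Prints ~20.6, 24.1, 27.5 and 110.0 GB, close to the 21.9, 25.3,
# 29.2 and 111.2 GB figures in the table once overhead is added.
```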

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227