L3 Aethora 15B V2 EXL2 8.0bpw by bullerwins


Tags: 8-bit · autotrain compatible · base model: elinas/llama-3-15b-... · base model (quantized): elinas/ll... · conversational · dataset: theskullery/aether-lit... · en · endpoints compatible · exl2 · instruct · llama · quantized · region: us · safetensors · sharded · tensorflow

L3 Aethora 15B V2 EXL2 8.0bpw Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

L3 Aethora 15B V2 EXL2 8.0bpw Parameters and Internals

LLM Name: L3 Aethora 15B V2 EXL2 8.0bpw
Repository 🤗: https://huggingface.co/bullerwins/L3-Aethora-15B-V2-exl2_8.0bpw
Base Model(s): Llama 3 15B Instruct Zeroed (elinas/Llama-3-15B-Instruct-zeroed)
Model Size: 15b
Required VRAM: 12.4 GB
Updated: 2024-09-18
Maintainer: bullerwins
Model Type: llama
Instruction-Based: Yes
Model Files: 8.6 GB (1-of-2), 3.8 GB (2-of-2)
Supported Languages: en
Quantization Type: exl2
Model Architecture: LlamaForCausalLM
License: cc-by-sa-4.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
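At 8.0 bits per weight, each parameter costs roughly one byte on disk, so the footprint of an EXL2 quant can be estimated from the parameter count and the bpw figure. A minimal back-of-envelope sketch (the 15B count below is the card's nominal "15b" label; real EXL2 layouts mix per-layer bit rates and include unquantized tensors, so the card's listed 12.4 GB of shards differs from this rough bound):

```python
def exl2_size_gb(n_params: float, bpw: float) -> float:
    """Rough on-disk/VRAM size of an EXL2 quant:
    params * bits-per-weight / 8 bits-per-byte, in GB."""
    return n_params * bpw / 8 / 1e9

# Nominal 15B parameters at 8.0 bpw -> ~15 GB as an upper-bound estimate;
# the shards listed on this card total 12.4 GB.
print(round(exl2_size_gb(15e9, 8.0), 1))
```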

Best Alternatives to L3 Aethora 15B V2 EXL2 8.0bpw

Best Alternatives                   Context / RAM    Downloads  Likes
L3 Aethora 15B V2 EXL2 4.0bpw       8K / 8.4 GB      2          1
L3 Aethora 15B V2 EXL2 5.0bpw       8K / 10.2 GB     2          1
L3 Aethora 15B V2 EXL2 6.0bpw       8K / 11.9 GB     2          1
L3 Aethora 15B V2                   8K / 30.1 GB     50         39
OpenCrystal 15B L3 V3               8K / 30 GB       60         6
OpenCrystal L3 15B V2.1             8K / 30 GB       24         5
OpenCrystal 15B L3 V2               8K / 30 GB       37         4
Meta Llama 3 15B Instruct           8K / 23.9 GB     2          1
Llama 3 15B Instruct Ft V2 AWQ      8K / 9.4 GB      5          0
Llama 3 15B Instruct Ft V2          8K / 30.1 GB     4          4
Note: a green score (e.g. "73.2") means that the model is better than bullerwins/L3-Aethora-15B-V2-exl2_8.0bpw.
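The size spread across the EXL2 variants above lends itself to a simple selection rule: pick the highest-bpw quant whose shard total fits your VRAM budget, leaving headroom for the KV cache at the 8K context. A sketch using the file sizes listed on this page (the 2 GB headroom figure is an illustrative assumption, not a measured value):

```python
# (bpw, total shard size in GB) for the L3-Aethora-15B-V2 EXL2 quants listed above
VARIANTS = [(4.0, 8.4), (5.0, 10.2), (6.0, 11.9), (8.0, 12.4)]

def pick_variant(vram_gb: float, headroom_gb: float = 2.0):
    """Return the highest-bpw (bpw, size_gb) variant that fits in vram_gb
    after reserving headroom for KV cache and activations, or None."""
    fitting = [(bpw, size) for bpw, size in VARIANTS if size + headroom_gb <= vram_gb]
    return max(fitting, default=None)

print(pick_variant(16.0))  # a 16 GB card fits the 8.0bpw quant
print(pick_variant(12.0))  # a 12 GB card only fits the 4.0bpw quant
```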

Rank the L3 Aethora 15B V2 EXL2 8.0bpw Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803