L3 Aethora 15B V2 EXL2 4.0bpw by bullerwins


Tags: 4-bit · autotrain compatible · base model: elinas/Llama-3-15B-Instruct-zeroed (quantized) · conversational · dataset: TheSkullery/Aether-Lite-v1.8.1 · en · endpoints compatible · exl2 · instruct · llama · quantized · region: us · safetensors

L3 Aethora 15B V2 EXL2 4.0bpw Benchmarks

Benchmark scores ("nn.n%") indicate how a model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4"). No benchmark scores are listed here for bullerwins/L3-Aethora-15B-V2-exl2_4.0bpw.

L3 Aethora 15B V2 EXL2 4.0bpw Parameters and Internals

Model Type: text generation, instruction following

Primary Use Cases: creative writing and storytelling, general intelligence, instructional and educational content, reasoning and problem-solving, contextual understanding and adaptability

Additional Notes: The model combines a curated dataset with current training techniques to improve performance across a wide range of tasks.

Training Details:
Data Sources: TheSkullery/Aether-Lite-v1.8.1
Data Volume: 125k samples
Methodology: LoRA (Low-Rank Adaptation)
Context Length: 8192
Training Time: 17.5 hours
Hardware Used: 4 x A100 GPUs
Model Architecture: Llama 3-based
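The card gives only these high-level training details: LoRA fine-tuning over ~125k Aether-Lite-v1.8.1 samples at an 8192-token context, run for 17.5 hours on 4 x A100 GPUs. As a rough illustration of what such a setup can look like with the Hugging Face peft library, here is a minimal sketch; the rank, alpha, target modules, and dropout are illustrative assumptions, not values taken from this card.

```python
# Hypothetical LoRA setup mirroring the card's high-level training details.
# r, lora_alpha, target_modules and lora_dropout are assumptions; the card only
# states LoRA, an 8192-token context, and the Aether-Lite-v1.8.1 dataset.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "elinas/Llama-3-15B-Instruct-zeroed"   # base model named on the card
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

lora_cfg = LoraConfig(
    r=64,                                                     # assumed rank
    lora_alpha=128,                                           # assumed scaling
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # typical Llama projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# Dataset named on the card (~125k samples); sequences would be packed or
# truncated to the 8192-token training context.
dataset = load_dataset("TheSkullery/Aether-Lite-v1.8.1", split="train")
```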
LLM Name: L3 Aethora 15B V2 EXL2 4.0bpw
Repository: https://huggingface.co/bullerwins/L3-Aethora-15B-V2-exl2_4.0bpw
Base Model(s): Llama 3 15B Instruct Zeroed (elinas/Llama-3-15B-Instruct-zeroed)
Model Size: 15b
Required VRAM: 8.4 GB
Updated: 2024-12-21
Maintainer: bullerwins
Model Type: llama
Instruction-Based: Yes
Model Files: 8.4 GB
Supported Languages: en
Quantization Type: exl2
Model Architecture: LlamaForCausalLM
License: cc-by-sa-4.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16
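For context on how an EXL2 quant like this is typically consumed, below is a minimal sketch that downloads the repository and generates text with the exllamav2 Python API. The class names and sampler settings reflect common exllamav2 usage as of recent releases rather than anything specified on this card, and the prompt follows the standard Llama 3 Instruct template used by the base model.

```python
# Minimal sketch: fetch the 4.0bpw EXL2 quant and run it with exllamav2.
# API usage is standard exllamav2 (not specified on the card); verify against
# the version you have installed.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Download the ~8.4 GB of model files into the local Hugging Face cache.
model_dir = snapshot_download("bullerwins/L3-Aethora-15B-V2-exl2_4.0bpw")

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # KV cache, allocated as layers load
model.load_autosplit(cache)                # split weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8                 # illustrative sampling values
settings.top_p = 0.9

# Standard Llama 3 Instruct prompt template (the base model's format).
prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "Write a two-sentence story about a lighthouse keeper.<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
print(generator.generate_simple(prompt, settings, 200))
```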

Best Alternatives to L3 Aethora 15B V2 EXL2 4.0bpw

Best Alternatives | Context / RAM | Downloads / Likes
L3 Aethora 15B V2 EXL2 5.0bpw | 8K / 10.2 GB | 121
L3 Aethora 15B V2 EXL2 6.0bpw | 8K / 11.9 GB | 91
L3 Aethora 15B V2 EXL2 8.0bpw | 8K / 12.4 GB | 31
L3 Aethora 15B V2 | 8K / 30.1 GB | 11040
OpenCrystal 15B L3 V3 | 8K / 30 GB | 1789
OpenCrystal 15B L3 V2 | 8K / 30 GB | 285
OpenCrystal L3 15B V2.1 | 8K / 30 GB | 115
Meta Llama 3 15B Instruct | 8K / 23.9 GB | 101
Llama 3 15B Instruct Ft V2 | 8K / 30.1 GB | 944
Llama 3 15B Instruct Zeroed Ft | 8K / 30.1 GB | 972
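As a sanity check on the sizes in the table, an EXL2 quant's on-disk size scales roughly with its target bits per weight times the parameter count. The short calculation below is a back-of-the-envelope estimate (not from the card); EXL2 keeps the embeddings and output head at higher precision, so real files come out somewhat above this figure.

```python
# Rough size estimate for the 4.0bpw quant listed above (approximation only).
params = 15e9                          # "Model Size: 15b" from the card
bpw = 4.0                              # target bits per weight of this quant
approx_gb = params * bpw / 8 / 1e9     # bits -> bytes -> GB
print(f"~{approx_gb:.1f} GB of quantized weights vs. 8.4 GB on disk")  # ~7.5 GB
```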


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217