Aetheria LongLORA 70B Rope8 32K Fp16 by grimulkan

  Autotrain compatible   Endpoints compatible   Fp16   Llama   Quantized   Region:us   Safetensors   Sharded   Tensorflow

Aetheria LongLORA 70B Rope8 32K Fp16 Parameters and Internals

LLM Name: Aetheria LongLORA 70B Rope8 32K Fp16
Repository: grimulkan/Aetheria-longLORA-70b-rope8-32k-fp16 (open on 🤗 Hugging Face)
Model Size: 70b
Required VRAM: 138.3 GB
Updated: 2024-07-27
Maintainer: grimulkan
Model Type: llama
Model Files: 4.1 GB: 1-of-35, 4.1 GB: 2-of-35, 3.9 GB: 3-of-35, 4.1 GB: 4-of-35, 4.0 GB: 5-of-35, 3.9 GB: 6-of-35, 4.1 GB: 7-of-35, 4.0 GB: 8-of-35, 3.9 GB: 9-of-35, 4.1 GB: 10-of-35, 4.0 GB: 11-of-35, 3.9 GB: 12-of-35, 4.1 GB: 13-of-35, 4.0 GB: 14-of-35, 3.9 GB: 15-of-35, 4.1 GB: 16-of-35, 4.0 GB: 17-of-35, 3.9 GB: 18-of-35, 4.1 GB: 19-of-35, 4.0 GB: 20-of-35, 3.9 GB: 21-of-35, 4.1 GB: 22-of-35, 4.0 GB: 23-of-35, 3.9 GB: 24-of-35, 4.1 GB: 25-of-35, 4.0 GB: 26-of-35, 3.9 GB: 27-of-35, 4.1 GB: 28-of-35, 4.0 GB: 29-of-35, 3.9 GB: 30-of-35, 4.1 GB: 31-of-35, 4.0 GB: 32-of-35, 3.9 GB: 33-of-35, 4.1 GB: 34-of-35, 2.1 GB: 35-of-35
Quantization Type: fp16
Model Architecture: LlamaForCausalLM
License: unknown
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
Aetheria LongLORA 70B Rope8 32K Fp16 (grimulkan/Aetheria-longLORA-70b-rope8-32k-fp16)
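
For readers who want to try the model, here is a minimal loading sketch based only on the metadata above (LlamaForCausalLM, LlamaTokenizer, float16 weights in 35 shards totalling roughly 138.3 GB). It assumes the standard Hugging Face transformers API; the rope_scaling values are an assumption inferred from the "rope8-32k" part of the model name (a linear RoPE factor of 8 over the 4096-token base context, i.e. about 32K tokens), not values confirmed in the repository config.

```python
# Minimal loading sketch (assumptions noted below), not an official recipe.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

repo_id = "grimulkan/Aetheria-longLORA-70b-rope8-32k-fp16"

tokenizer = LlamaTokenizer.from_pretrained(repo_id)

model = LlamaForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,   # matches the fp16 shards (~138.3 GB total)
    device_map="auto",           # spread the 70B weights across available GPUs
    # ASSUMPTION: linear RoPE scaling by 8x, inferred from "rope8-32k" in the
    # model name; remove or adjust if the repository config already sets it.
    rope_scaling={"type": "linear", "factor": 8.0},
)

prompt = "Describe the city of Aetheria in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Loading the full fp16 weights needs on the order of the 138.3 GB listed above (the 35 shard sizes sum to that figure), so multi-GPU setups or CPU offloading are typically required.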

Best Alternatives to Aetheria LongLORA 70B Rope8 32K Fp16

Best Alternatives                     HF Rank   Context / RAM      Downloads   Likes
...B Instruct Gradient 1048K 8bit     0.2       1024K / 75 GB      16          5
...B Instruct Gradient 1048K 4bit     0.2       1024K / 39.7 GB    12          3
...B Instruct Gradient 1048K 2bit     0.2       1024K / 21.9 GB    20          2
...0B Instruct Gradient 262K 4bit     0.2       256K / 39.7 GB     7           3
...0B Instruct Gradient 262K 8bit     0.2       256K / 75 GB       7           2
...0B Instruct Gradient 262K 2bit     0.2       256K / 21.9 GB     12          1
... Gradient 262K 2.25bpw H6 EXL2     0.2       256K / 22.2 GB     17          0
...t Gradient 262K 2.4bpw H6 EXL2     0.2       256K / 23.5 GB     18          0
...t Gradient 262K 4.0bpw H6 EXL2     0.2       256K / 37.2 GB     8           1
...t Gradient 262K 3.5bpw H6 EXL2     0.2       256K / 32.9 GB     8           0
Note: a green score (e.g. "73.2") indicates that the model performs better than grimulkan/Aetheria-longLORA-70b-rope8-32k-fp16.

Rank the Aetheria LongLORA 70B Rope8 32K Fp16 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? The catalog lists 34,447 models in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501