Goliath LongLORA 120B Rope8 32K Fp16 by grimulkan


Tags: Autotrain compatible, Endpoints compatible, Fp16, Llama, Quantized, Region:us, Safetensors, Sharded, Tensorflow


Goliath LongLORA 120B Rope8 32K Fp16 Parameters and Internals

Model Type: text generation

Use Cases
  Areas: Research, Commercial Applications
  Primary Use Cases: Extended context length text generation

Additional Notes
  The interleaved merge yields a working model, not a broken one; its retention of the source models' capabilities at extended context length can be tested by applying linear RoPE scaling with a factor of 8, as in the loading sketch below.

Training Details
  Data Sources: interleaved merge of existing models
  Methodology: interleaved merge
  Context Length: 32000
  Model Architecture: longLORA

Input / Output
  Accepted Modalities: text
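For concreteness, here is a minimal, untested sketch of what "linear RoPE scaling with a factor of 8" looks like when loading this checkpoint with the Hugging Face transformers library. The explicit rope_scaling override is an assumption, needed only if the repository's config.json does not already carry it:

```python
# Minimal sketch (untested): load the model with linear RoPE scaling so the
# 4096-token base window stretches to 4096 * 8 = 32768 tokens.
# Assumes enough GPU memory for ~235 GB of fp16 weights across devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "grimulkan/Goliath-longLORA-120b-rope8-32k-fp16"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map="auto",
    # Assumption: override only if the checkpoint's config lacks it.
    rope_scaling={"type": "linear", "factor": 8.0},
)
```

With the factor-8 scaling applied, the base 4096-token context matches the 32K advertised in the model name.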
LLM Name: Goliath LongLORA 120B Rope8 32K Fp16
Repository: https://huggingface.co/grimulkan/Goliath-longLORA-120b-rope8-32k-fp16
Model Size: 120b
Required VRAM: 235.4 GB
Updated: 2024-12-26
Maintainer: grimulkan
Model Type: llama
Model Files: 25 safetensors shards (see the download sketch after this table):
  9.6 GB: 1-of-25, 9.8 GB: 2-of-25, 9.9 GB: 3-of-25, 9.8 GB: 4-of-25, 9.6 GB: 5-of-25, 9.7 GB: 6-of-25, 9.8 GB: 7-of-25, 9.9 GB: 8-of-25, 9.7 GB: 9-of-25, 9.9 GB: 10-of-25, 10.0 GB: 11-of-25, 9.9 GB: 12-of-25, 10.0 GB: 13-of-25, 9.8 GB: 14-of-25, 10.0 GB: 15-of-25, 9.8 GB: 16-of-25, 9.8 GB: 17-of-25, 9.8 GB: 18-of-25, 9.9 GB: 19-of-25, 9.8 GB: 20-of-25, 9.6 GB: 21-of-25, 9.7 GB: 22-of-25, 9.6 GB: 23-of-25, 9.9 GB: 24-of-25, 0.1 GB: 25-of-25
Quantization Type: fp16
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
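Because the checkpoint ships as 25 safetensors shards totaling roughly 235 GB, pre-fetching the files is often more robust than downloading at load time. A minimal sketch assuming the huggingface_hub package; the allow_patterns filter is illustrative:

```python
# Minimal sketch: pre-download the 25 safetensors shards plus the
# config/tokenizer files, then point from_pretrained() at the local path.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="grimulkan/Goliath-longLORA-120b-rope8-32k-fp16",
    allow_patterns=["*.safetensors", "*.json", "*.model"],
)
print(local_dir)  # pass this path to from_pretrained() later
```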

Quantized Models of the Goliath LongLORA 120B Rope8 32K Fp16

Model                              Likes  Downloads  VRAM
...gLORA 120B Rope8 32K Fp16 GGUF  9      75         43 GB
...gLORA 120B Rope8 32K Fp16 GPTQ  4      24         59 GB
...ngLORA 120B Rope8 32K Fp16 AWQ  4      22         61 GB
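As a rough, untested sketch of running one of these quantized variants on more modest hardware, the GGUF files can be used with the llama-cpp-python bindings. The filename below is hypothetical; the actual file comes from the GGUF repository listed above:

```python
# Minimal sketch, assuming llama-cpp-python and a downloaded GGUF file.
from llama_cpp import Llama

llm = Llama(
    model_path="goliath-longlora-120b-rope8-32k.Q2_K.gguf",  # hypothetical name
    n_ctx=32768,      # full extended context; GGUF conversions typically
                      # carry the RoPE scaling in their metadata
    n_gpu_layers=-1,  # offload every layer to GPU if VRAM allows
)
out = llm("Briefly describe linear RoPE scaling.", max_tokens=128)
print(out["choices"][0]["text"])
```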

Best Alternatives to Goliath LongLORA 120B Rope8 32K Fp16

Best Alternatives                  Context / RAM   Downloads  Likes
Miquella 120B 3.0bpw H6 EXL2       31K / 44.8 GB   10         10
Miquella 120B 8.0bpw H8 EXL2       31K / 118.1 GB  11         3
Miquella 120B 4.0bpw H6 EXL2       31K / 59.4 GB   11         2
...t 120B Cat A Llama EXL2 5.5bpw  8K / 85.3 GB    15         0
...t 120B Cat A Llama EXL2 4.5bpw  8K / 70.3 GB    8          1
...RA 120B Rope8 32K 6bpw H8 EXL2  4K / 88.7 GB    10         1
...egaDolphin 120B 2.9bpw H6 EXL2  4K / 44.3 GB    18         3
...gaDolphin 120B 2.65bpw H6 EXL2  4K / 40.5 GB    19         2
...egaDolphin 120B 4.0bpw H6 EXL2  4K / 60.8 GB    17         1
Goliath 120B 6.0bpw H6 EXL2        4K / 88.9 GB    16         1



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217