ArcaneEntanglement Model64 70B by ChuckMcSneed


Merged Model | arXiv:2203.05482 | Autotrain compatible | Endpoints compatible | License: llama2 | Llama | Region: us | Safetensors | Sharded | Tensorflow
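The tags mark this as a merged model and cite arXiv:2203.05482, the "model soups" linear weight-averaging paper. The sketch below illustrates that merge technique in general terms only; it is not the author's actual recipe, and the parent repositories and mixing weights are hypothetical placeholders, since this card does not list which checkpoints were combined.

```python
# Hypothetical linear-merge (weight-averaging) sketch in the spirit of arXiv:2203.05482.
# The parent repos and mixing weights below are placeholders, not the actual merge recipe.
import torch
from transformers import AutoModelForCausalLM

parent_repos = ["org/parent-model-a", "org/parent-model-b"]  # hypothetical parents
mix_weights = [0.5, 0.5]                                     # hypothetical mixing weights

models = [
    AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.float16)
    for repo in parent_repos
]
merged = models[0]  # reuse the first parent as the container for the averaged weights

with torch.no_grad():
    state_dicts = [m.state_dict() for m in models]
    for name, target in merged.state_dict().items():
        # Parameter-wise weighted average across the parent checkpoints.
        averaged = sum(w * sd[name].float() for w, sd in zip(mix_weights, state_dicts))
        target.copy_(averaged.to(target.dtype))

merged.save_pretrained("merged-model")  # writes sharded safetensors shards
```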

ArcaneEntanglement Model64 70B Benchmarks

Rank the ArcaneEntanglement Model64 70B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
ArcaneEntanglement Model64 70B (ChuckMcSneed/ArcaneEntanglement-model64-70b)

Best Alternatives to ArcaneEntanglement Model64 70B

Best Alternatives           | HF Rank | Context / RAM    | Downloads | Likes
MultiVerse 70B              | 80.98   | 32K / 144.5 GB   | 1977      | 15
XuanYuan 70B                | 79.55   | 8K / 138.3 GB    | 1308      | 44
Tigerbot 70B Chat V2        | 77.92   | 4K / 139.2 GB    | 626       | 48
QuartetAnemoi 70B T0.0001   | 76.86   | 31K / 137.8 GB   | 2639      | 29
Miiqu F16                   | 76.77   | 31K / 180.7 GB   | 2184      | 11
Miqu MS 70B                 | 76.74   | 31K / 138 GB     | 1348      | 6
Miqu 70B Alpaca DPO         | 76.6    | 31K / 138.7 GB   | 3800      | 6
Miqu 1 70B Sf               | 76.59   | 31K / 138.7 GB   | 21544     | 214
BoreanGale 70B              | 76.48   | 32K / 137.8 GB   | 2509      | 4
Tigerbot 70B Chat V6        | 75.95   | 8K / 139.8 GB    | 20        | 1
Note: a green score (e.g. "73.2") means that the listed model performs better than ChuckMcSneed/ArcaneEntanglement-model64-70b.

ArcaneEntanglement Model64 70B Parameters and Internals

LLM Name: ArcaneEntanglement Model64 70B
Repository: ChuckMcSneed/ArcaneEntanglement-model64-70b (open on 🤗 Hugging Face)
Merged Model: Yes
Model Size: 70b
Required VRAM: 138 GB
Model Type: llama
Model Files: 9.6 GB (1-of-15), 9.8 GB (2-of-15), 9.6 GB (3-of-15), 9.8 GB (4-of-15), 9.9 GB (5-of-15), 9.7 GB (6-of-15), 10.0 GB (7-of-15), 9.9 GB (8-of-15), 9.7 GB (9-of-15), 9.6 GB (10-of-15), 10.0 GB (11-of-15), 9.8 GB (12-of-15), 9.6 GB (13-of-15), 9.9 GB (14-of-15), 1.1 GB (15-of-15)
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.38.1
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: float16
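As a quick illustration of how these parameters fit together, the sketch below loads the model with the Hugging Face Transformers API in float16 and generates text within the 4096-token context window. It is a minimal example, assuming standard Transformers/PyTorch usage and enough GPU (or offloaded CPU) memory for the ~138 GB of weights; the prompt and generation settings are placeholders.

```python
# Minimal loading/generation sketch; assumes the standard Transformers + PyTorch stack
# and sufficient memory for the ~138 GB of float16 weights listed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ChuckMcSneed/ArcaneEntanglement-model64-70b"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, 32000-token vocabulary
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are stored in float16 across 15 safetensors shards
    device_map="auto",          # spread the layers across available GPUs / offload as needed
)

prompt = "Write a short story about an arcane entanglement."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)  # stay within the 4096-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```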


Original data from HuggingFace, OpenCompass and various public git repos.