LLM Explorer: A Curated Large Language Model Directory and Analytics

Law LLM 13B by AdaptLLM



Tags: Arxiv:2309.09530, Autotrain compatible, Dataset:eleutherai/pile, Dataset:gair/lima, Dataset:open-orca/openorca, Dataset:wizardlm/wizardlm evol..., En, Endpoints compatible, Has space, Instruct, Legal, Llama, Pytorch, Region:us, Sharded

Rank the Law LLM 13B Capabilities

Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Law LLM 13B (AdaptLLM/law-LLM-13B)

Quantized Models of the Law LLM 13B

Model | Likes | Downloads | VRAM
Law LLM 13B GGUF | 5 | 89 | 5 GB
Law LLM 13B GPTQ | 2 | 4 | 7 GB
Law LLM 13B AWQ | 2 | 3 | 7 GB
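
The quantized builds above are community conversions that trade some accuracy for a much smaller memory footprint than the ~52 GB full checkpoint. Below is a minimal sketch of running a GGUF build with the llama-cpp-python bindings; the repository id and file name are assumptions, so substitute whichever quantized upload you actually use.

```python
# Minimal sketch: run a GGUF quantization of Law LLM 13B with llama-cpp-python.
# The repo id and file name below are assumptions -- swap in the quantized
# upload you actually want to use.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="TheBloke/law-LLM-13B-GGUF",   # assumed GGUF mirror of the model
    filename="law-llm-13b.Q4_K_M.gguf",    # assumed quantization level / file name
)

llm = Llama(
    model_path=gguf_path,
    n_ctx=2048,       # matches the base model's 2048-token context length
    n_gpu_layers=-1,  # offload all layers to the GPU if memory allows
)

out = llm("What is consideration in contract law?", max_tokens=256)
print(out["choices"][0]["text"])
```

GPTQ and AWQ builds are loaded differently (through transformers with the corresponding quantization backend installed), but the idea is the same: pick the variant whose VRAM column fits your hardware.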

Best Alternatives to Law LLM 13B

Best Alternatives | HF Rank | Context/RAM | Downloads | Likes
Solarized 13B DPO | 62.05 | 4K / 24.9 GB | 1306 | 1
Speechless Llama2 13B | 61.36 | 4K / 26.7 GB | 2450 | 4
Trurl 2 13B Pl Instruct Unload | 58.44 | 4K / 26 GB | 2805 | 6
GenAI Llama 2 13B | 58.17 | 4K / 26 GB | 3913 | 4
...struct Llama2 Koen 13B V0.9.24 | 56.98 | 2K / 26.3 GB | 3388 | 0
SOLAR 13B Instruct V1.0 | 56.65 | 4K / 25 GB | 1348 | 1
Mythalion 13B | 56.48 | 4K / 26 GB | 5933 | 119
...ga 13B Instruct PL Lora Unload | 56.24 | 4K / 26 GB | 2779 | 1
Model 007 13b V2 | 55.41 | 4K / 26 GB | 843 | 4
Vicuna 13B V1.5 PL Lora Unload | 55.24 | 4K / 26 GB | 2788 | 1
Note: a Score highlighted in green (e.g. "73.2") indicates that the alternative scores higher than AdaptLLM/law-LLM-13B.

Law LLM 13B Parameters and Internals

LLM Name: Law LLM 13B
Repository: AdaptLLM/law-LLM-13B (open on Hugging Face)
Model Size: 13b
Required VRAM: 52.1 GB
Updated: 2024-02-28
Maintainer: AdaptLLM
Model Type: llama
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-6), 9.9 GB (2-of-6), 9.9 GB (3-of-6), 9.9 GB (4-of-6), 9.9 GB (5-of-6), 2.5 GB (6-of-6)
Supported Languages: en
Model Architecture: LlamaForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.28.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Padding Token: <pad>
Unk Token: <unk>
Vocabulary Size: 32001
Initializer Range: 0.02
Torch Data Type: float16
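
Given the internals listed above (LlamaForCausalLM architecture, six weight shards totalling roughly 52 GB on disk, a 2048-token context window, and a 32001-entry vocabulary with an added <pad> token), the checkpoint can be loaded directly with Hugging Face transformers. The sketch below assumes a recent transformers plus accelerate install and enough GPU memory for the shards; the prompt is purely illustrative.

```python
# Minimal sketch: load the full checkpoint (six shards, ~52 GB on disk) from
# the AdaptLLM/law-LLM-13B repository with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AdaptLLM/law-LLM-13B"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, vocab size 32001
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the torch data type listed above
    device_map="auto",          # spread the weight shards across available devices
)

prompt = "Summarize the doctrine of promissory estoppel."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)  # keep within the 2048-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```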
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003