Law LLM by AdaptLLM


Tags: Arxiv:2309.09530 · Arxiv:2406.14491 · Arxiv:2411.19930 · Autotrain compatible · Dataset:eleutherai/pile · Dataset:gair/lima · Dataset:open-orca/openorca · Dataset:wizardlm/wizardlm evol... · En · Endpoints compatible · Instruct · Legal · Llama · Pytorch · Region:us · Safetensors · Sharded
Model Card on HF 🤗: https://huggingface.co/AdaptLLM/law-LLM

Law LLM Benchmarks

Benchmark scores ("nn.n%") show how Law LLM (AdaptLLM/law-LLM) compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Law LLM Parameters and Internals

Model Type: text generation
Use Cases:
  Areas: biomedicine, finance, law
  Applications: research in biomedicine, financial analytics, legal analysis
  Primary Use Cases: text generation for domain-specific tasks
  Limitations: effectiveness may decrease on questions outside the target domains
Additional Notes: the reading-comprehension training method enhances domain knowledge while preserving question-answering ability
Training Details:
  Data Sources: Open-Orca/OpenOrca, GAIR/lima, WizardLM/WizardLM_evol_instruct_V2_196k, EleutherAI/pile
  Methodology: continual pre-training on domain-specific corpora via a reading-comprehension method
  Context Length: 2048
Input Output:
  Input Format: text input in the form of questions
  Accepted Modalities: text
  Output Format: generated text responses
  Performance Tips: use prompts in the reading-comprehension format for best results (see the usage sketch after this section)
Release Notes:
  Version: 2nd version
  Date: 2024/6/21
  Notes: release of AdaptLLM at Instruction-Pretrain
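
The input/output notes above boil down to plain question-style prompting. The following is a minimal sketch using the Hugging Face transformers API; the prompt wording, generation settings, and precision choice are illustrative assumptions rather than the official AdaptLLM recipe.

# Minimal prompting sketch (assumes transformers, torch, and accelerate are installed
# and that there is enough memory for the checkpoint; all settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "AdaptLLM/law-LLM"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # halves memory versus float32
    device_map="auto",          # requires the accelerate package
)

# Plain-text question, truncated to the 2048-token context window.
prompt = (
    "Question: What is the difference between a contract and a memorandum of understanding?\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=2048).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))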
LLM Name: Law LLM
Repository 🤗: https://huggingface.co/AdaptLLM/law-LLM
Model Size: 6.7b
Required VRAM: 26.1 GB
Updated: 2024-12-14
Maintainer: AdaptLLM
Model Type: llama
Instruction-Based: Yes
Model Files: 33 shards (shards 1-of-33 through 32-of-33 at 0.8 GB each, shard 33-of-33 at 0.5 GB; 26.1 GB total)
Supported Languages: en
Model Architecture: LLaMAForCausalLM
Model Max Length: 2048
Transformers Version: 4.27.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <pad>
Vocabulary Size: 32001
Torch Data Type: float16
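
As a quick sanity check, the headline figures above (llama architecture, float16 weights, 2048 max length, 32001-token vocabulary, <pad> padding token) can be read back from the published config and tokenizer. A minimal sketch, assuming the standard transformers API and Hub access:

from transformers import AutoConfig, AutoTokenizer

repo = "AdaptLLM/law-LLM"
config = AutoConfig.from_pretrained(repo)
tokenizer = AutoTokenizer.from_pretrained(repo, use_fast=False)

print(config.model_type)               # expected: "llama"
print(config.torch_dtype)              # expected: torch.float16
print(config.max_position_embeddings)  # expected: 2048 (Model Max Length)
print(config.vocab_size)               # expected: 32001
print(tokenizer.__class__.__name__)    # expected: "LlamaTokenizer"
print(tokenizer.pad_token)             # expected: "<pad>"

# File-size arithmetic from the shard list above: 32 shards * 0.8 GB + 0.5 GB = 26.1 GB,
# which matches the Required VRAM figure in the listing.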

Quantized Models of the Law LLM

Model | Likes | Downloads | VRAM
Law LLM GGUF | 16 | 63 | 12 GB
Law LLM GPTQ | 3 | 34 | 3 GB
Law LLM AWQ | 2 | 25 | 3 GB
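
For machines that cannot hold the 26.1 GB checkpoint, the community quantizations above are the usual route. Below is a minimal sketch for the GGUF variant using llama-cpp-python; the local file name is hypothetical, so substitute the .gguf file actually downloaded from the quantized repository.

from llama_cpp import Llama

llm = Llama(
    model_path="law-llm.Q4_K_M.gguf",  # hypothetical file name; use the downloaded GGUF file
    n_ctx=2048,                        # matches the model's 2048-token context window
)

result = llm(
    "Question: What elements are required to form a valid contract?\nAnswer:",
    max_tokens=256,
)
print(result["choices"][0]["text"])

The GPTQ and AWQ builds are typically loaded through transformers instead, with the auto-gptq or autoawq backend installed.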

Best Alternatives to Law LLM

Best Alternatives | Context / RAM | Downloads | Likes
Finance LLM | 0K / 26.1 GB | 359 | 109
Medicine LLM | 0K / 8.8 GB | 216 | 37
Note: a green score (e.g. "73.2") means that the model is better than AdaptLLM/law-LLM.

Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241124