Medicine Chat AWQ by TheBloke


Arxiv:2309.09530 · 4-bit · Autotrain compatible · AWQ · Base model: AdaptLLM/medicine-chat · Base model (quantized): AdaptLLM/medicine-chat · Biology · Dataset: EleutherAI/pile · Dataset: GAIR/lima · Dataset: Open-Orca/OpenOrca · Dataset: WizardLM/WizardLM_evol_instruct_V2_196k · En · Instruct · Llama · Medical · Quantized · Region: US · Safetensors


Medicine Chat AWQ Parameters and Internals

Model Type: llama
Use Cases:
Areas: biomedicine
Applications: text-generation
Additional Notes: The model is quantized using 4-bit AWQ.
Supported Languages: en (high)
Training Details:
Data Sources: EleutherAI/pile, Open-Orca/OpenOrca, GAIR/lima, WizardLM/WizardLM_evol_instruct_V2_196k
Methodology: Reading Comprehension Adaptation
Context Length: 4096
Model Architecture: Transformer-based
Input Output:
Input Format: [INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n{prompt} [/INST] (a prompt-building sketch follows this block)
Accepted Modalities: text
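A minimal sketch of assembling a prompt in this Llama-2-chat style template. The system message below is only an illustrative placeholder, not the model's official default.

```python
# Minimal sketch: build a prompt using the template shown above.
# The default system message here is a placeholder, not an official one.
def build_prompt(user_message: str,
                 system_message: str = "You are a helpful medical assistant.") -> str:
    return (
        f"[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

print(build_prompt("What are the common side effects of metformin?"))
```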
Release Notes:
Version: Not specified
Date: 2023-12-08
Notes: AWQ 4-bit quantization of AdaptLLM's Medicine Chat, which is derived from LLaMA-2-Chat-7B.
LLM Name: Medicine Chat AWQ
Repository: 🤗 https://huggingface.co/TheBloke/medicine-chat-AWQ
Model Name: Medicine Chat
Model Creator: AdaptLLM
Base Model(s): Medicine Chat (AdaptLLM/medicine-chat)
Model Size: 7B
Required VRAM: 3.9 GB
Updated: 2024-12-22
Maintainer: TheBloke
Model Type: llama
Instruction-Based: Yes
Model Files: 3.9 GB
Supported Languages: en
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Padding Token: <pad>
Unk Token: <unk>
Vocabulary Size: 32001
Torch Data Type: float16
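Given the specs above (4-bit AWQ, LlamaForCausalLM, float16, ~3.9 GB of weights), a minimal loading-and-generation sketch with the Transformers library might look like the following. It assumes a CUDA GPU with roughly 4 GB of free VRAM and the autoawq package installed; the question text is only a placeholder.

```python
# Hedged sketch: load TheBloke/medicine-chat-AWQ and run a single generation.
# Assumes a CUDA GPU with ~4 GB free VRAM and the autoawq package installed;
# recent Transformers releases can load AWQ checkpoints through the standard API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/medicine-chat-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the float16 torch dtype listed above
    device_map="auto",
)

prompt = (
    "[INST] <<SYS>>\nYou are a helpful medical assistant.\n<</SYS>>\n\n"
    "What lifestyle changes help manage type 2 diabetes? [/INST]"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
answer = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```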

Best Alternatives to Medicine Chat AWQ

Best Alternatives | Context / RAM | Downloads | Likes
Finance Chat AWQ | 4K / 3.9 GB | 31 | 3
Law Chat AWQ | 4K / 3.9 GB | 17 | 3
AdaptLLM Finance Chat AWQ | 4K / 3.9 GB | 18 | 1
Medicine LLM AWQ | 2K / 3.9 GB | 38 | 3
Finance LLM AWQ | 2K / 3.9 GB | 18 | 5
AmberChat AWQ | 2K / 3.9 GB | 22 | 1
TinyLlama 1.1B 32K Instruct 8.0bpw H8 EXL2 | 32K / 1.2 GB | 6 | 1
TinyLlama 1.1B 32K Instruct 3.0bpw H6 EXL2 | 32K / 0.5 GB | 7 | 0
...nish English Asistant 16bit V2 | 2K / 2.2 GB | 13 | 0
TinyLlama 1.1B 32K Instruct | 32K / 2.2 GB | 540 | 12
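The download and like figures in the table above are a snapshot; a short sketch for pulling live counts from the Hugging Face Hub follows. The list of repository IDs is assumed from the model names above.

```python
# Hedged sketch: fetch current download and like counts from the Hugging Face Hub
# for this model and a few of the alternatives listed above.
from huggingface_hub import HfApi

repo_ids = [
    "TheBloke/medicine-chat-AWQ",
    "TheBloke/finance-chat-AWQ",
    "TheBloke/law-chat-AWQ",
]

api = HfApi()
for repo_id in repo_ids:
    info = api.model_info(repo_id)
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```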



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217