Pathumma Llm Text 1.0.0 by nectec


Tags: arXiv:2407.10671 · base model (finetune): nectec/OpenThaiLLM-Prebuilt-7B · biology · chemistry · code · conversational · finance · legal · medical · endpoints compatible · GGUF · q4 · quantized · qwen2 · PyTorch · Safetensors · sharded · TensorFlow · region: us · languages: th, zh, en

Pathumma Llm Text 1.0.0 Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Pathumma Llm Text 1.0.0 (nectec/Pathumma-llm-text-1.0.0)

Pathumma Llm Text 1.0.0 Parameters and Internals

LLM Name: Pathumma Llm Text 1.0.0
Repository: https://huggingface.co/nectec/Pathumma-llm-text-1.0.0
Base Model(s): OpenThaiLLM Prebuilt 7B (nectec/OpenThaiLLM-Prebuilt-7B)
Model Size: 7B
Required VRAM: 30.5 GB
Updated: 2024-12-26
Maintainer: nectec
Model Type: qwen2
Model Files: 4.7 GB; 5.0 GB (1-of-7); 4.8 GB (2-of-7); 4.9 GB (3-of-7); 4.9 GB (4-of-7); 5.0 GB (5-of-7); 3.7 GB (6-of-7); 2.2 GB (7-of-7); 30.5 GB total
Supported Languages: th, zh, en
GGUF Quantization: Yes
Quantization Type: q4 | gguf | q4_k
Model Architecture: Qwen2ForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.43.1
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 152064
Torch Data Type: bfloat16
Errors: replace
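
Given the listed architecture (Qwen2ForCausalLM), bfloat16 weights, and the Hugging Face repository above, the full-precision checkpoint can be loaded with the standard transformers API. The following is a minimal sketch, not taken from the model card: the prompt and generation settings are illustrative, and it assumes the repository ships a Qwen2-style chat template.

```python
# Minimal loading sketch for nectec/Pathumma-llm-text-1.0.0 (assumptions noted below).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nectec/Pathumma-llm-text-1.0.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch data type
    device_map="auto",
)

# Qwen2-style chat template; the Thai prompt ("Hello, please introduce yourself")
# is an arbitrary example, not from the model card.
messages = [{"role": "user", "content": "สวัสดีครับ ช่วยแนะนำตัวหน่อย"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```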

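Because the listing reports a q4_k GGUF quantization, the model can also be run through llama.cpp bindings. This is a sketch under assumptions: the GGUF filename pattern below is a guess (the actual file name in the repository may differ), and the reduced n_ctx is only an example even though the model supports up to 131072 tokens.

```python
# Hypothetical GGUF usage via llama-cpp-python; the filename pattern is an
# assumption and should be replaced with the actual GGUF file in the repo.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="nectec/Pathumma-llm-text-1.0.0",
    filename="*q4_k*.gguf",  # glob pattern; adjust to the real file name
    n_ctx=8192,              # example context size; the model allows up to 131072
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize this model's strengths in Thai."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```
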
Best Alternatives to Pathumma Llm Text 1.0.0

Best Alternatives               Context / RAM      Downloads   Likes
SvelteCodeQwen1.5 7B Chat       64K / 14.5 GB      460         0
CodeQwen1.5 7B Chat GGUF        64K / 3 GB         190         2
Qwen2 Cantonese 7B Instruct     32K / 15.4 GB      239         3
Qwen2 7B Instruct GGUF          32K / 3 GB         135         1
Qwen2 7B Instruct GGUF          32K / 3 GB         50          0
Qwen1.5 7B Chat GGUF            32K / 3.1 GB       110         1
A1 V002                         128K / 15.2 GB     201         0
Qwen2.5 7B Bnb 4bit             128K / 5.5 GB      46312       1
A1 V0.0.1                       128K / 15.2 GB     56          0
Krx Q25 7B Base V3.3            128K / 15.2 GB     354         0
Note: a green score (e.g. "73.2") means that the model outperforms nectec/Pathumma-llm-text-1.0.0.

Rank the Pathumma Llm Text 1.0.0 Capabilities

Have you tried this model? Rate its performance. This feedback will help the ML community identify the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217