Openorca Stx by lightblue


Model Card on HF 🤗: https://huggingface.co/lightblue/openorca_stx


Openorca Stx Parameters and Internals

Model Type: Closed Question Answering
Use Cases:
Areas: Japanese NLP tasks
Applications: Closed Question Answering in Japanese
Additional Notes: The model handles Japanese closed question answering well, having been fine-tuned with QLoRA on three task-specific datasets.
Supported Languages: ja (proficient)
Training Details:
Data Sources: snow_simplified_japanese_corpus, khalidalt/tydiqa-goldp, csebuetnlp/xlsum
Data Volume: 13,167 samples
Methodology: QLoRA fine-tuning on three Japanese datasets
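The card names QLoRA as the fine-tuning method but does not publish the hyperparameters. As an illustrative sketch only, the hypothetical values below show what a typical QLoRA setup for a Llama-2-13B base looks like (a 4-bit quantized base model plus low-rank adapters, expressed here as plain dicts mirroring bitsandbytes/peft-style settings); none of these numbers come from the model card.

```python
# Hypothetical QLoRA configuration sketch — all values are illustrative
# assumptions, not published settings from lightblue/openorca_stx.

quantization = {  # 4-bit base-model quantization (bitsandbytes-style fields)
    "load_in_4bit": True,
    "bnb_4bit_quant_type": "nf4",          # NormalFloat4, as in the QLoRA paper
    "bnb_4bit_use_double_quant": True,     # also quantize the quantization constants
    "bnb_4bit_compute_dtype": "bfloat16",  # matches the card's torch data type
}

lora = {  # low-rank adapters trained on top of the frozen base (peft-style fields)
    "r": 16,                                 # adapter rank (assumed)
    "lora_alpha": 32,                        # adapter scaling factor (assumed)
    "lora_dropout": 0.05,                    # (assumed)
    "target_modules": ["q_proj", "v_proj"],  # attention projections (assumed)
}

print(quantization["bnb_4bit_quant_type"], lora["r"])
```

Only the adapter weights are trained under this scheme, which is what makes fine-tuning a 13B model on the card's 13,167 samples feasible on a single GPU.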
LLM Name: Openorca Stx
Repository 🤗: https://huggingface.co/lightblue/openorca_stx
Required VRAM: 26 GB
Updated: 2025-02-23
Maintainer: lightblue
Model Type: llama
Model Files: 9.9 GB (1-of-3), 9.9 GB (2-of-3), 6.2 GB (3-of-3)
Supported Languages: ja
Model Architecture: LlamaForCausalLM
License: llama2
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.33.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32002
Torch Data Type: bfloat16
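The three sharded files listed above sum to the stated VRAM requirement, and since bfloat16 stores each parameter in 2 bytes, that total also roughly pins down the parameter count. A quick sanity check:

```python
# Shard sizes from the "Model Files" entry above, in GB.
shards = [9.9, 9.9, 6.2]

total_gb = round(sum(shards), 1)
print(total_gb)  # 26.0 — matches the "Required VRAM: 26 GB" entry

# bfloat16 = 2 bytes per parameter, so weight size ≈ 2 × parameter count
# (ignoring small buffers and treating GB as a decimal gigabyte).
bytes_per_param = 2
approx_params = total_gb * 1e9 / bytes_per_param
print(f"{approx_params / 1e9:.0f}B parameters")  # ~13B, i.e. a Llama-2-13B-class base
```

The ~13B estimate is consistent with the llama2 license and LlamaForCausalLM architecture listed above.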

Quantized Models of Openorca Stx

Model                Likes   Downloads   VRAM
OpenOrca Stx GGUF    4       359         5 GB
OpenOrca Stx AWQ     1       93          7 GB
OpenOrca Stx GPTQ    1       36          7 GB

Best Alternatives to Openorca Stx

Best Alternatives       Context / RAM      Downloads   Likes
LWM Text 512K           512K / 13.5 GB     10          2
LWM Text Chat 512K      512K / 13.5 GB     6           2
LWM Text 256K           256K / 13.5 GB     24          3
LWM Text Chat 256K      256K / 13.5 GB     21          3
Pallas 0.5 LASER 0.1    195K / 68.9 GB     203         32
Ashley3b X 1.2          128K / 6.5 GB      25          0
Ashley3b X 1.3          128K / 6.5 GB      14          0
Cyber13                 128K / 16.1 GB     5           0
Cyber8                  128K / 16.1 GB     5           0
LWM Text Chat 128K      128K / 13.5 GB     31          20


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227