Orca Mini 13B by pankajmathur


Tags: Arxiv:2306.02707 · Autotrain compatible · Dataset: psmathur/alpaca_orca · Dataset: psmathur/dolly-v2_orca · Dataset: psmathur/wizardlm_orca · En · Endpoints compatible · Llama · Model-index · Pytorch · Region: us · Safetensors · Sharded

Orca Mini 13b Benchmarks

Orca Mini 13B (pankajmathur/orca_mini_13b)

Orca Mini 13B Parameters and Internals

Model Type: text generation

Use Cases
Applications: research, designing text-generation tasks
Limitations: the model can produce factually incorrect output and should not be relied on for factually accurate information.

Additional Notes
The model is sensitive to input formats and may require tuning to adapt to specific applications.

Training Details
Data Sources: WizardLM dataset (~70K), Alpaca dataset (~52K), Dolly-V2 dataset (~15K); a dataset-loading sketch follows this section.
Methodology: explain-tuned, following the dataset-construction approach of the Orca research paper (arXiv:2306.02707)
Context Length: 1024
Training Time: 15 hours
Hardware Used: 8x A100 (80 GB) GPUs
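
For a quick look at what this training mixture contains, here is a minimal sketch that loads one of the datasets listed above with the Hugging Face datasets library. It assumes the repository ID psmathur/alpaca_orca from the page tags is still available on the Hub with a train split; the column names are not confirmed by this page.

```python
# Minimal sketch: inspect one of the explain-tuned datasets listed above.
# Assumes the dataset ID from the page tags (psmathur/alpaca_orca) exists on
# the Hugging Face Hub with a "train" split; column names may differ.
from datasets import load_dataset

ds = load_dataset("psmathur/alpaca_orca", split="train")
print(ds)      # column names and row count (~52K examples per the list above)
print(ds[0])   # one system/instruction/response-style record
```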
Release Notes
Version: 1.0
Date: 2023-10-01
Notes: Initial release of orca_mini_13b, fine-tuned on explain-tuned datasets using the methods from the Orca research paper.
LLM Name: Orca Mini 13b
Repository: https://huggingface.co/pankajmathur/orca_mini_13b
Model Size: 13b
Required VRAM: 52.3 GB
Updated: 2024-12-06
Maintainer: pankajmathur
Model Type: llama
Model Files: 4.9 GB (1-of-11), 5.0 GB (2-of-11), 5.0 GB (3-of-11), 5.0 GB (4-of-11), 5.0 GB (5-of-11), 4.8 GB (6-of-11), 4.8 GB (7-of-11), 4.8 GB (8-of-11), 5.0 GB (9-of-11), 5.0 GB (10-of-11), 3.0 GB (11-of-11)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: cc-by-nc-sa-4.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.29.1
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float32
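
Given the repository, architecture, and tokenizer listed above, the snippet below is a minimal sketch of loading the model with the Transformers library. It assumes transformers plus accelerate are installed and enough GPU memory is available; loading in float16 roughly halves the 52.3 GB footprint of the stored float32 weights. The "### User:" / "### Response:" prompt markers are an assumption for illustration, not confirmed by this page.

```python
# Minimal sketch: load pankajmathur/orca_mini_13b with Transformers.
# Assumes transformers + accelerate are installed and enough GPU memory is
# available; float16 roughly halves the float32 checkpoint's 52.3 GB footprint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pankajmathur/orca_mini_13b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # stored weights are float32; load in fp16 to save memory
    device_map="auto",          # requires accelerate; places layers on available devices
)

# The prompt format below is an assumption for illustration, not taken from this page.
prompt = "### User:\nExplain what instruction tuning is in two sentences.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```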

Quantized Models of the Orca Mini 13B

Model | Likes | Downloads | VRAM
Orca Mini V3 13b | 31 | 3820 | 26 GB

Best Alternatives to Orca Mini 13B

Best Alternatives | Context / RAM | Downloads | Likes
Luminaura RP 13B | 128K / 26 GB | 19 | 0
Yarn Llama 2 13B 128K | 128K / 26 GB | 51 | 114
Agent Llama2 13B 80K | 80K / 26.4 GB | 14 | 0
Chat Llama2 13B 80K | 80K / 52.8 GB | 15 | 0
LongAlign 13B 64K | 64K / 26 GB | 190 | 13
LongAlign 13B 64K Base | 64K / 26 GB | 12 | 3
Openbuddy Llama2 13B V15p1 64K | 64K / 26.1 GB | 26 | 4
Openbuddy Llama2 13b64k V15 | 64K / 26.1 GB | 16 | 1
Yarn Llama 2 13B 64K | 64K / 26 GB | 36 | 18
Airoboros L2 13B 2.1 YaRN 64K | 64K / 26 GB | 20 | 7



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241124