LLM Explorer: A Curated Large Language Model Directory and Analytics

Galactica Orca Wizardlm 1.3B by KnutJaegersberg



Tags: Autotrain, Autotrain compatible, Endpoints compatible, Has space, License: cc-by-nc-4.0, Opt, PyTorch, Region: us, Safetensors

Rank the Galactica Orca Wizardlm 1.3B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Galactica Orca Wizardlm 1.3B (KnutJaegersberg/galactica-orca-wizardlm-1.3b)

Best Alternatives to Galactica Orca Wizardlm 1.3B

Model                              Context / VRAM    HF Rank
Opt 1.3B Onnx Static Shapes        2K /  GB          320
Opt 1.3B Onnx Js Quantized         2K /  GB          70
Opt 1.3B Onnx Js                   2K /  GB          60
...Opt 1.3B Rlhf Critic Deepspeed  2K / 0.7 GB       33
Opt 1.3B                           2K / 1.3 GB       50
Facebook Opt 1.3b Quantized        2K / 1.4 GB       26130
Opt 1.3B                           2K / 2.6 GB       145552134
Galactica 1.3B                     2K / 2.6 GB       534456
Opt Iml Max 1.3B                   2K / 2.6 GB       1175140
Opt Iml 1.3B                       2K / 2.6 GB       69028

Galactica Orca Wizardlm 1.3B Parameters and Internals

LLM Name: Galactica Orca Wizardlm 1.3B
Repository: KnutJaegersberg/galactica-orca-wizardlm-1.3b (open on 🤗)
Model Size: 1.3b
Required VRAM: 5.3 GB
Model Type: opt
Model Files: 5.3 GB; 5.3 GB
Model Architecture: OPTForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.32.0.dev0
Tokenizer Class: PreTrainedTokenizerFast
Vocabulary Size: 50000
Torch Data Type: float32
Activation Function: gelu
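The Required VRAM figure above is roughly what you get from the usual weights-only arithmetic: parameter count times bytes per value. As a sanity check (a minimal sketch, not the site's actual methodology; the dtype table below is a common convention, not something stated in this listing):

```python
# Back-of-envelope memory estimate for holding model weights.
# Listing values: 1.3B parameters, torch dtype float32, Required VRAM 5.3 GB.
BYTES_PER_DTYPE = {"float32": 4, "float16": 2, "int8": 1}

def estimate_weight_memory_gb(n_params: float, dtype: str = "float32") -> float:
    """Estimate memory needed just to hold the weights, in GB (10^9 bytes)."""
    return n_params * BYTES_PER_DTYPE[dtype] / 1e9

# 1.3e9 params * 4 bytes = 5.2 GB, consistent with the listed 5.3 GB
# (the extra ~0.1 GB is embeddings/buffers and file overhead).
print(estimate_weight_memory_gb(1.3e9, "float32"))
```

The same arithmetic explains why float16 variants of 1.3B models fit in ~2.6 GB, matching several of the alternatives listed above. Actual inference needs additional memory for activations and the KV cache on top of the weights.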
Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v2024022003