Facebook Opt 125M Qcqa Ub 6 Best For Q Loss by xformAI


Tags: autotrain-compatible, en, endpoints-compatible, opt, pytorch, region:us

Facebook Opt 125M Qcqa Ub 6 Best For Q Loss Benchmarks

Facebook Opt 125M Qcqa Ub 6 Best For Q Loss (xformAI/facebook-opt-125m-qcqa-ub-6-best-for-q-loss)

Facebook Opt 125M Qcqa Ub 6 Best For Q Loss Parameters and Internals

Model Type: transformers, QCQA (quality-concerned query-aware question answering)
Additional Notes: A QCQA variant that keeps the original MHA architecture but modifies the K/V heads for efficiency.
Supported Languages: en (proficient)
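The note above describes keeping the multi-head attention layout while modifying the K/V heads for efficiency. A minimal sketch of the shared-K/V idea behind such variants, where several query heads attend over one shared key/value head (all names, shapes, and the grouping scheme here are illustrative assumptions, not the model's actual implementation):

```python
import numpy as np

def grouped_attention(q, k, v, n_kv):
    """q: (n_q, T, d); k, v: (n_kv, T, d). Consecutive groups of
    query heads share one K/V head (grouped-query-style attention)."""
    n_q, T, d = q.shape
    group = n_q // n_kv
    out = np.empty_like(q)
    for h in range(n_q):
        kv = h // group                       # which shared K/V head this query head uses
        scores = q[h] @ k[kv].T / np.sqrt(d)  # (T, T) scaled dot-product scores
        w = np.exp(scores - scores.max(-1, keepdims=True))
        w /= w.sum(-1, keepdims=True)         # softmax over key positions
        out[h] = w @ v[kv]
    return out

rng = np.random.default_rng(0)
q = rng.standard_normal((12, 4, 16))  # 12 query heads (OPT-125M uses 12 heads)
k = rng.standard_normal((6, 4, 16))   # 6 shared K/V heads (illustrative choice)
v = rng.standard_normal((6, 4, 16))
print(grouped_attention(q, k, v, 6).shape)  # (12, 4, 16)
```

Halving the K/V heads halves the KV cache, which is the usual motivation for this family of modifications.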
LLM Name: Facebook Opt 125M Qcqa Ub 6 Best For Q Loss
Repository 🤗: https://huggingface.co/xformAI/facebook-opt-125m-qcqa-ub-6-best-for-q-loss
Model Size: 125m
Required VRAM: 0.5 GB
Updated: 2025-04-30
Maintainer: xformAI
Model Type: opt
Model Files: 0.5 GB
Supported Languages: en
Model Architecture: OPTForCausalLM
License: mit
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.26.1
Tokenizer Class: GPT2Tokenizer
Beginning of Sentence Token: </s>
End of Sentence Token: </s>
Unk Token: </s>
Vocabulary Size: 50272
Torch Data Type: float32
Activation Function: relu
Errors: replace
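The 0.5 GB VRAM figure above is consistent with roughly 125M parameters stored in float32 (4 bytes per parameter), before any activation or KV-cache overhead:

```python
n_params = 125_000_000   # nominal parameter count for a 125M model
bytes_per_param = 4      # float32, per the Torch Data Type above
gb = n_params * bytes_per_param / 1024**3
print(f"{gb:.2f} GB")    # ≈ 0.47 GB, matching the listed 0.5 GB
```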

Best Alternatives to Facebook Opt 125M Qcqa Ub 6 Best For Q Loss

Best Alternatives                    Context / RAM    Downloads    Likes
...5M Qcqa Ub 6 Best For KV Cache    2K / 0.5 GB      38           0
...25M Gqa Ub 6 Best For KV Cache    2K / 0.5 GB      30           0
Opt 125M                             2K / 0.3 GB      6448507      198
Galactica 125M Cot                   2K / 0.5 GB      6            0
Galactica Ref                        2K / 0.5 GB      13           0
Galactica 125M DPO Pos               2K / 0.5 GB      13           0
Galactica 125M DPO                   2K / 0.5 GB      12           0
BertQA                               2K / 0.5 GB      6            0
BertQA                               2K / 0.5 GB      5            0
Opt 125M Quantized Brevitas          2K / GB          5            0
Note: a green score (e.g. "73.2") means the model scores better than xformAI/facebook-opt-125m-qcqa-ub-6-best-for-q-loss.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227