LLM Explorer: A Curated Large Language Model Directory and Analytics

Full Transformer 3 by CLMBR

What open-source LLMs or SLMs are you looking for? 18,732 in total.


Tags: Autotrain compatible, Endpoints compatible, Generated from trainer, Opt, PyTorch, Region: us

Rank the Full Transformer 3 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Full Transformer 3 (CLMBR/full-transformer-3)

Best Alternatives to Full Transformer 3

Best Alternatives                      Context / Size    HF Rank
Onnx                                   2K / GB           280
Koboldcpp                              2K / 0 GB         34
Pushkar OPT Paraphaser                 2K / 1.3 GB       21
...Step3 Rlhf Actor Model Opt1.3B      2K / 3.2 GB       31
Dalio Pretrain Cleaned V4              2K / 121.7 GB     91
Dalio Principles Pretrain V1           2K / 121.7 GB     51
Full Transformer 0                     0.5K / 0.3 GB     140
Old Full Transformer 0                 0.5K / 0.3 GB     130
Npi Sent Neg Transformer 4             0.5K / 0.3 GB     110
Full Transformer 1                     0.5K / 0.3 GB     70

Full Transformer 3 Parameters and Internals

LLM Name: Full Transformer 3
Repository: Open on 🤗
Required VRAM: 0.3 GB
Model Type: opt
Model Files: 0.3 GB, 0.0 GB
Model Architecture: OPTForCausalLM
Context Length: 512
Model Max Length: 512
Transformers Version: 4.33.3
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 50002
Torch Data Type: float32
Activation Function: relu
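The internals above correspond to a standard Hugging Face transformers OPT configuration. As a minimal sketch, the listed values can be mirrored in an OPTConfig; note that hidden size and layer count are not given on this page, so the library's OPT defaults are assumed for those fields:

```python
from transformers import OPTConfig, OPTForCausalLM

# Config mirroring the internals listed above.
# NOTE: hidden size / number of layers are NOT listed on this page,
# so transformers' OPT defaults fill them in (an assumption).
config = OPTConfig(
    vocab_size=50002,              # Vocabulary Size
    max_position_embeddings=512,   # Context Length / Model Max Length
    activation_function="relu",    # Activation Function
)

# Randomly initialised model with the OPTForCausalLM architecture.
# For the trained weights you would instead call
# OPTForCausalLM.from_pretrained("CLMBR/full-transformer-3").
model = OPTForCausalLM(config)
```

Loading via `from_pretrained` with the repository ID shown above would pull the actual 0.3 GB checkpoint in float32.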
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003