LLM Explorer: A Curated Large Language Model Directory and Analytics

Pigeon 7B by Typly

Tags: Autotrain, Autotrain compatible, En, Endpoints compatible, License: openrail, Llama, LoRA, Pl, PyTorch, Region: us, Sharded, TensorBoard
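
These labels are Hugging Face Hub tags covering the library, languages, license, and adapter type. As a minimal sketch, assuming the huggingface_hub package and that the displayed labels map onto the Hub tag strings "lora" and "pl", other Polish-capable LoRA repositories can be listed like this:

```python
# Minimal sketch: list other Hub models sharing two of the tags shown above.
# The tag strings "lora" and "pl" are assumptions about how the displayed
# labels map onto actual Hub tags; adjust them as needed.
from huggingface_hub import HfApi

api = HfApi()
for model in api.list_models(filter=["lora", "pl"], limit=10):
    print(model.id)
```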

Rank the Pigeon 7B Capabilities

Have you tried this model? Rate its performance across the categories below. This feedback helps the ML community identify the most suitable model for their needs.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Pigeon 7B (Typly/Pigeon-7B)

Best Alternatives to Pigeon 7B

Best Alternatives                    HF Rank   Context / RAM    Downloads   Likes
Medilora Mistral 7B                  64.41     0K / 14.4 GB     7           3
Asimov 7B V1                         54.98     0K / 0.2 GB      1776        1
...rix Philosophy Mistral 7B LoRA    53.9      0K / 14.4 GB     11          1
Asimov 7B V2                         52.29     0K / 0.2 GB      1800        0
Cartesigpt                           46.9      0K / 0.4 GB      35          0
Mistral Alpaca Lora Full                       32K / 4.1 GB     126         0
Full V4 Astromistral Final                     32K / 4.5 GB     210         0
7B XXL                                         0K / GB          7           0
Llama 7B Hf Prompt Answering                   0K / 0 GB        22          3
ChatHaruhi RolePlaying Qwen 7b                 0K / 0 GB        152         0
Note: a green score (e.g. "73.2") indicates that the model performs better than Typly/Pigeon-7B.
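
The Downloads and Likes columns mirror the statistics reported by the Hugging Face Hub, so they can be re-checked directly. Below is a minimal sketch, assuming the huggingface_hub package; only Typly/Pigeon-7B is a repository id taken from this page, and the list would be extended with the repository ids of whichever alternatives you want to compare.

```python
# Minimal sketch: fetch current download and like counts from the Hub.
# Only "Typly/Pigeon-7B" comes from this page; extend repo_ids with the
# repository ids of the alternatives listed above.
from huggingface_hub import HfApi

api = HfApi()
repo_ids = ["Typly/Pigeon-7B"]

for repo_id in repo_ids:
    info = api.model_info(repo_id)
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")
```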

Pigeon 7B Parameters and Internals

LLM Name: Pigeon 7B
Repository: Open on 🤗 Hugging Face
Model Size: 7b
Required VRAM: 13.5 GB
Updated: 2024-02-29
Maintainer: Typly
Model Files: 0.0 GB; 10.0 GB (1-of-2); 3.5 GB (2-of-2); 0.0 GB
Supported Languages: pl, en
Model Architecture: AutoModelForCausalLM
License: openrail
Is Biased: none
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: q_proj|v_proj
LoRA Alpha: 32
LoRA Dropout: 0.05
R Param: 16
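
The internals above describe a LoRA adapter (r=16, alpha=32, dropout=0.05, targeting q_proj and v_proj) rather than a standalone model; the two weight shards (10.0 GB and 3.5 GB) account for the 13.5 GB of required VRAM, which is roughly what a 7B-parameter model needs in fp16 at about 2 bytes per parameter. Below is a minimal loading sketch with transformers and peft, assuming the adapter repository ships its own tokenizer; the base model id is a placeholder, since the page does not name the base checkpoint.

```python
# Minimal sketch: attach the Pigeon 7B LoRA adapter to its base causal LM,
# mirroring the internals listed above (PEFT type LORA, r=16, lora_alpha=32,
# lora_dropout=0.05, target modules q_proj|v_proj, LlamaTokenizer).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, PeftModel

ADAPTER_ID = "Typly/Pigeon-7B"
BASE_MODEL_ID = "meta-llama/Llama-2-7b-hf"  # placeholder; check the adapter card

# The configuration the directory reports, reconstructed for reference only;
# PeftModel.from_pretrained reads the adapter's own saved config when loading.
reference_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

tokenizer = AutoTokenizer.from_pretrained(ADAPTER_ID)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL_ID,
    torch_dtype=torch.float16,  # ~13.5 GB of weights for a 7B model in fp16
    device_map="auto",
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)

# Supported languages are listed as pl and en; an English prompt for brevity.
prompt = "Suggest a short reply to: 'Are we still meeting tomorrow?'"
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
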
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003