LLM Explorer: A Curated Large Language Model Directory and Analytics

Twindoc Mistral 7B Alpha V0.0 by jamesagilesoda

Browse the directory of 18,732 open-source LLMs and SLMs.


Tags: Arxiv:1910.09700, Autotrain compatible, Conversational, Endpoints compatible, Mistral, Region:us, Safetensors, Sharded, Tensorflow

Rank the Twindoc Mistral 7B Alpha V0.0 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Twindoc Mistral 7B Alpha V0.0 (jamesagilesoda/Twindoc-Mistral-7B-Alpha-v0.0)

Best Alternatives to Twindoc Mistral 7B Alpha V0.0

Best Alternatives | HF Rank | Context/RAM | Downloads | Likes
Ogno Monarch Jaskier Merge 7B | 76.43 | 32K / 14.5 GB | 59 | 2
Jaskier 7B DPO V5.6 | 76.41 | 32K / 14.4 GB | 302 | 7
Jaskier 7B DPO V6.1 | 76.36 | 32K / 14.4 GB | 21 | 0
OGNO 7B | 76.34 | 32K / 14.4 GB | 1036 | 11
Omningotex 7B Slerp | 76.33 | 32K / 14.4 GB | 612 | 3
StrangeMerges 25 7B Dare Ties | 76.33 | 32K / 14.4 GB | 120 | 0
DPO Binarized NeutrixOmnibe 7B | 76.31 | 32K / 14.4 GB | 534 | 2
OgnoMonarch 7B | 76.3 | 32K / 14.5 GB | 168 | 0
StrangeMerges 21 7B Slerp | 76.29 | 32K / 14.4 GB | 658 | 0
Monarch 7B | 76.25 | 32K / 14.4 GB | 849 | 5
Note: a score shown in green (e.g. "73.2") indicates that the model performs better than jamesagilesoda/Twindoc-Mistral-7B-Alpha-v0.0.
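
The download and like figures above are pulled from the Hugging Face Hub. As a minimal sketch, the same statistics can be retrieved with the huggingface_hub client; the second repo id below is an assumed mapping for the "Monarch 7B" row and may not be the exact repository ranked here.

    # Sketch: fetch current download/like counts from the Hugging Face Hub,
    # the source of the statistics shown in the table above.
    from huggingface_hub import HfApi

    api = HfApi()
    repo_ids = [
        "jamesagilesoda/Twindoc-Mistral-7B-Alpha-v0.0",
        "mlabonne/Monarch-7B",  # assumed repo id for the "Monarch 7B" row
    ]
    for repo_id in repo_ids:
        info = api.model_info(repo_id)
        print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")

Counts change over time, so values retrieved this way will differ from the snapshot in the table.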

Twindoc Mistral 7B Alpha V0.0 Parameters and Internals

LLM Name: Twindoc Mistral 7B Alpha V0.0
Repository: jamesagilesoda/Twindoc-Mistral-7B-Alpha-v0.0 (open on 🤗 Hugging Face)
Model Size: 7B
Required VRAM: 28.9 GB
Updated: 2024-02-21
Maintainer: jamesagilesoda
Model Type: mistral
Model Files: 5.0 GB (1-of-6), 4.9 GB (2-of-6), 5.0 GB (3-of-6), 5.0 GB (4-of-6), 4.8 GB (5-of-6), 4.2 GB (6-of-6)
Model Architecture: MistralForCausalLM
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: float32
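
Given the metadata above (MistralForCausalLM, LlamaTokenizer, 32768-token context, float32 weights sharded across six safetensors files), loading the checkpoint with Hugging Face transformers might look like the sketch below. The bfloat16 dtype and device_map setting are assumptions made to fit the ~29 GB float32 checkpoint on common hardware, not recommendations from the model card.

    # Minimal sketch: load the sharded checkpoint listed above with transformers.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "jamesagilesoda/Twindoc-Mistral-7B-Alpha-v0.0"

    # Resolves to LlamaTokenizer per the tokenizer class listed above.
    tokenizer = AutoTokenizer.from_pretrained(repo_id)

    # Weights are stored in float32 (~28.9 GB across 6 shards); casting to
    # bfloat16 at load time roughly halves the memory footprint.
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    prompt = "Summarize this release note in one sentence."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))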
Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024022003