LLM Explorer: A Curated Large Language Model Directory and Analytics

AshhLimaRP Mistral 7B by lemonilia

The directory currently indexes 18,732 open-source LLMs and SLMs in total.


Tags: Autotrain compatible, Endpoints compatible, GGUF, License: apache-2.0, LoRA, Mistral, PyTorch, Quantized, Region: US, Sharded

Rank the AshhLimaRP Mistral 7B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
AshhLimaRP Mistral 7B (lemonilia/AshhLimaRP-Mistral-7B)

Quantized Models of the AshhLimaRP Mistral 7B

Model                         Likes   Downloads   VRAM
AshhLimaRP Mistral 7B GPTQ    2       2           4 GB
AshhLimaRP Mistral 7B AWQ     1       1           4 GB
AshhLimaRP Mistral 7B GGUF    1       1           3 GB
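
For orientation: the full-precision model's 14.4 GB footprint matches roughly 7.2B parameters at 2 bytes each (fp16/bf16), while the 3-4 GB quantized files above correspond to roughly 4 bits per weight. Below is a minimal sketch of running one of the GGUF quantizations with llama-cpp-python; the .gguf filename, context size, and prompt are placeholder assumptions, not values taken from this listing, so check the GGUF repository for the actual file names and the model card for the expected prompt format.

```python
# Minimal sketch (not from the listing): running a GGUF quantization of
# AshhLimaRP-Mistral-7B with llama-cpp-python. The .gguf filename, context
# size, and prompt below are placeholder assumptions; check the GGUF
# repository for real file names and the model card for the prompt format.
from llama_cpp import Llama

llm = Llama(
    model_path="ashhlimarp-mistral-7b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,       # context window to allocate
    n_gpu_layers=-1,  # offload all layers to GPU; use 0 for CPU-only inference
)

out = llm(
    "Write a short scene description.",  # placeholder prompt
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```

With a quant of roughly 3-4 GB, the model fits comfortably on an 8 GB GPU, or it can run entirely in CPU RAM at reduced speed.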

Best Alternatives to AshhLimaRP Mistral 7B

Best Alternatives                      HF Rank   Context/RAM     Downloads   Likes
Medilora Mistral 7B                    64.41     0K / 14.4 GB    5           3
Asimov 7B V1                           54.98     0K / 0.2 GB     1986        1
...rix Philosophy Mistral 7B LoRA      53.9      0K / 14.4 GB    11          1
....5 Strix Philosophy Mistral 7B      53.9      0K / 14.4 GB    11          0
Asimov 7B V2                           52.29     0K / 0.2 GB     1992        0
V3                                     50.5      0K / 0 GB       18          0
Cartesigpt                             46.9      0K / 0.4 GB     35          0
Legal Summarizer                       -         0K / 14.4 GB    11          0
Full V4 Astromistral Final             -         32K / 4.5 GB    210         0
7B XXL                                 -         0K / GB         7           0
Note: a score shown in green on the site (e.g. "73.2") means that the model ranks higher than lemonilia/AshhLimaRP-Mistral-7B.

AshhLimaRP Mistral 7B Parameters and Internals

LLM Name: AshhLimaRP Mistral 7B
Repository: lemonilia/AshhLimaRP-Mistral-7B (open on Hugging Face 🤗)
Model Size: 7b
Required VRAM: 14.4 GB
Updated: 2024-02-21
Maintainer: lemonilia
Model Files: 4.4 GB, 5.9 GB, 2.7 GB, 4.9 GB (1 of 3), 5.0 GB (2 of 3), 4.5 GB (3 of 3)
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: AutoModelForCausalLM
License: apache-2.0
Is Biased: none
Tokenizer Class: LlamaTokenizer
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: k_proj|gate_proj|v_proj|q_proj|o_proj|up_proj|down_proj
LoRA Alpha: 16
LoRA Dropout: 0
R Param: 256
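
The adapter hyperparameters above (R Param 256, LoRA Alpha 16, LoRA Dropout 0, bias "none", and the seven attention/MLP projection target modules) map directly onto a PEFT LoraConfig. The sketch below is illustrative only: the base-model ID, dtype handling, and task type are assumptions not stated in this listing.

```python
# Minimal sketch (assumptions flagged below): a peft LoraConfig mirroring the
# adapter hyperparameters in this listing -- r=256, lora_alpha=16,
# lora_dropout=0, bias "none", and the seven attention/MLP projections.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_id = "mistralai/Mistral-7B-v0.1"  # assumed base model; the listing only says "Mistral 7B"

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=256,             # "R Param"
    lora_alpha=16,     # "LoRA Alpha"
    lora_dropout=0.0,  # "LoRA Dropout"
    bias="none",       # "Is Biased: none"
    target_modules=[
        "k_proj", "gate_proj", "v_proj", "q_proj",
        "o_proj", "up_proj", "down_proj",
    ],                 # "PEFT Target Modules"
)

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)  # resolves to a Llama tokenizer class
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

To use the released adapter rather than train a new one, PeftModel.from_pretrained(model, "lemonilia/AshhLimaRP-Mistral-7B") would attach the published LoRA weights instead, assuming the repository ships them in PEFT format (which the "PEFT Type: LORA" and "LoRA Model: Yes" fields suggest).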
Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024022003