Cosmic 2 by Tilo15


Tags: AutoTrain, AutoTrain compatible, Base model (quantized): Tilo15/big-text-3, Conversational, Endpoints compatible, GGML, GGUF, imatrix, LoRA, Mistral, PEFT, Quantized, Region: US, Safetensors, Sharded, TensorBoard, TensorFlow
Model Card on HF 🤗: https://huggingface.co/Tilo15/cosmic-2

Cosmic 2 Benchmarks

Benchmark scores for Cosmic 2 (Tilo15/cosmic-2) are reported as percentages showing how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Cosmic 2 Parameters and Internals

Model Type: text-generation
Additional Notes: Model trained using AutoTrain. For more information, please visit AutoTrain: https://hf.co/docs/autotrain
Training Details:
Methodology: Model trained using AutoTrain.
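
Since the card describes Cosmic 2 as a text-generation model served through the AutoModelForCausalLM architecture (see the details below), a minimal loading sketch with the Hugging Face Transformers library might look like the following. The prompt and generation settings are illustrative and not taken from the card.

```python
# Minimal loading sketch, assuming standard safetensors weights are published
# in the repository. device_map="auto" requires the accelerate package.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Tilo15/cosmic-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

prompt = "Write a short haiku about the cosmos."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
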
LLM Name: Cosmic 2
Repository 🤗: https://huggingface.co/Tilo15/cosmic-2
Base Model(s): Tilo15/big-text-3
Model Size: 7.2B
Required VRAM: 14.4 GB
Updated: 2025-01-16
Maintainer: Tilo15
Model Files: 0.2 GB; 0.5 GB (1 of 9); 0.5 GB (2 of 9); 0.5 GB (3 of 9); 0.5 GB (4 of 9); 0.5 GB (5 of 9); 0.5 GB (6 of 9); 0.5 GB (7 of 9); 0.5 GB (8 of 9); 0.2 GB (9 of 9); 1.8 GB; 4.9 GB (1 of 3); 5.0 GB (2 of 3); 4.5 GB (3 of 3); 0.0 GB
GGML Quantization: Yes
GGUF Quantization: Yes
Quantization Type: ggml|gguf
Model Architecture: AutoModelForCausalLM
License: other
Is Biased: none
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: gate_proj|down_proj|v_proj|up_proj|o_proj|k_proj|q_proj
LoRA Alpha: 32
LoRA Dropout: 0.05
R Param: 16
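
The LoRA adapter settings listed above map directly onto a PEFT LoraConfig. The sketch below reproduces those values; the task type is an assumption, as it is not stated on the card.

```python
# Sketch of a PEFT LoraConfig mirroring the hyperparameters listed above:
# r=16, lora_alpha=32, lora_dropout=0.05, bias="none", and the seven
# attention/MLP projection target modules. task_type is assumed.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
    task_type="CAUSAL_LM",  # assumption, not taken from the card
)
```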

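Because GGML/GGUF quantizations are flagged above, a quantized build can also be run locally, for example with llama-cpp-python. The file name below is a placeholder for whichever GGUF file the repository actually provides.

```python
# Sketch of running a GGUF quantization of the model with llama-cpp-python.
# "cosmic-2.gguf" is a placeholder file name; download the actual GGUF file
# from the repository first.
from llama_cpp import Llama

llm = Llama(model_path="cosmic-2.gguf", n_ctx=2048)
result = llm("Write a short haiku about the cosmos.", max_tokens=64)
print(result["choices"][0]["text"])
```
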
Best Alternatives to Cosmic 2

Best Alternatives                Context / RAM    Downloads   Likes
Mistral7b V0.3 Alpaca Cleaned    0K / 14.5 GB     18          0
Mistralmeme                      0K / 14.4 GB     14          0
My Mistral Seekh Qna             0K / 28.9 GB     13          0
Futfut By Zephyr7b               0K / 14.4 GB     3066        0
Zephyr Tuning V1                 0K / 14.4 GB     1747        0
Spydaz Web AI 025                0K / 14.4 GB     44          0
Small Fut Final                  0K / 14.4 GB     21          1
Big Fut Final                    0K / 14.4 GB     19          1
Last Model Too Long Time         0K / 14.4 GB     8           1
DummySFT                         0K / 28.9 GB     21          0
Note: a green score (e.g. "73.2") means that the model is better than Tilo15/cosmic-2.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227