Dopeystableplats 3B V1 by vihangd


Tags: Autotrain compatible, Custom code, PyTorch, Region: US, Sharded, StableLM Epoch

Dopeystableplats 3B V1 Benchmarks

Scores (percentages) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Dopeystableplats 3B V1 (vihangd/dopeystableplats-3b-v1)

Dopeystableplats 3B V1 Parameters and Internals

Additional Notes
An experimental finetune of the StableLM-3B-4E1T base model.
Training Details
Data Sources: Alpaca-style datasets
Methodology: Fine-tuning with Alpaca-QLoRA and DPO
Input / Output
Input Format: Alpaca-style prompt template (see the sketch below)
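
The card does not publish the exact template, so the following is a minimal Python sketch of the standard Alpaca-style format; the instruction and input strings are illustrative, and the wording is an assumption based on the original Alpaca project rather than this finetune's training code.

    # Standard Alpaca-style prompt template (assumed wording; not taken from
    # this model's repository).
    ALPACA_TEMPLATE = (
        "Below is an instruction that describes a task, paired with an input that "
        "provides further context. Write a response that appropriately completes "
        "the request.\n\n"
        "### Instruction:\n{instruction}\n\n"
        "### Input:\n{input}\n\n"
        "### Response:\n"
    )

    prompt = ALPACA_TEMPLATE.format(
        instruction="Summarize the following text in one sentence.",
        input="StableLM-3B-4E1T is a 3B-parameter base model trained on 4 trillion tokens.",
    )

For instruction-only prompts, the "### Input:" block is typically dropped.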
LLM Name: Dopeystableplats 3B V1
Repository: https://huggingface.co/vihangd/dopeystableplats-3b-v1
Model Size: 3b
Required VRAM: 5.9 GB
Updated: 2025-03-12
Maintainer: vihangd
Model Type: stablelm_epoch
Model Files: 15 shards (parts 1-14: 0.4 GB each; part 15: 0.3 GB)
Model Architecture: StableLMEpochForCausalLM
License: cc-by-sa-4.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.34.1
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 50304
Torch Data Type: float16
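
Because the model type is stablelm_epoch with custom modeling code, loading it through transformers requires trust_remote_code=True. The snippet below is a minimal sketch, assuming torch, transformers (4.34 or later), and accelerate (for device_map="auto") are installed; the prompt and sampling settings are illustrative, not values from the model card.

    # Minimal loading sketch; trust_remote_code=True is needed because the
    # repository ships custom StableLMEpochForCausalLM modeling code.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "vihangd/dopeystableplats-3b-v1"

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.float16,   # matches the listed torch data type
        device_map="auto",
        trust_remote_code=True,
    )

    # Alpaca-style prompt, per the input format above (instruction is illustrative).
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\nExplain what a 4096-token context length means.\n\n"
        "### Response:\n"
    )

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))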

Quantized Models of the Dopeystableplats 3B V1

Model                          Likes   Downloads   VRAM
Dopeystableplats 3B V1 GGUF    1       128         1 GB
Dopeystableplats 3B V1 GPTQ    0       19          1 GB
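
The GGUF quantization can be run on CPU or modest GPUs with llama.cpp-based tooling. The snippet below is a sketch using llama-cpp-python; the GGUF filename is a hypothetical local path, since the quantized repositories are not linked on this page.

    # Sketch using llama-cpp-python; model_path is a hypothetical local file
    # name for one of the GGUF quantizations, not a published filename.
    from llama_cpp import Llama

    llm = Llama(
        model_path="dopeystableplats-3b-v1.Q4_K_M.gguf",  # hypothetical quant file
        n_ctx=4096,  # matches the model's context length
    )

    out = llm(
        "### Instruction:\nWrite one sentence about 3B-parameter language models.\n\n"
        "### Response:\n",
        max_tokens=128,
    )
    print(out["choices"][0]["text"])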

Best Alternatives to Dopeystableplats 3B V1

Best Alternatives        Context / RAM    Downloads   Likes
Stable Code 3B Mlx       16K / 5.6 GB     18          1
Aura 3B                  4K / 5.6 GB      16          2
Slim Extract             4K / 5.6 GB      53          12
Slim Sa Ner              4K / 5.6 GB      66          6
Slim Boolean             4K / 5.6 GB      44          4
Slim Tags 3B             4K / 5.6 GB      53          4
Slim Summary             4K / 5.6 GB      59          7
Slim Xsum                4K / 5.6 GB      50          6
Memphis CoT 3B           4K / 5.6 GB      196         29
Fett Uccine Mini 3B      4K / 5.6 GB      142         3
Note: a green score (e.g. "73.2") indicates that the listed alternative performs better than vihangd/dopeystableplats-3b-v1.

Rank the Dopeystableplats 3B V1 Capabilities

Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you in search of? 44,902 models are listed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227