Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed by PrunaAI


Tags: 1bit, Autotrain compatible, Base model: cognitivecomputations/Samantha-1.11-7b (finetune), Endpoints compatible, Llama, Pruna-ai, Quantized, Region: us

Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed (PrunaAI/cognitivecomputations-Samantha-1.11-7b-HQQ-1bit-smashed)

Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed Parameters and Internals

Model Type 
text generation, compressed
Use Cases 
Areas:
commercial applications, research
Limitations:
Output quality may vary compared to the uncompressed base model
Additional Notes 
Results labeled 'first' refer to metrics from the first run, which may differ from later runs because of one-time overheads.
Training Details 
Data Sources:
WikiText
Methodology:
Compressed with HQQ (Half-Quadratic Quantization)
Input Output 
Accepted Modalities:
text
Performance Tips:
The first run might be slower or take more memory due to CUDA overheads.
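
A minimal loading sketch is shown below. It assumes the hqq package's HQQModelForCausalLM.from_quantized loader and reuse of the base model's tokenizer; these are assumptions, and the repository README on Hugging Face is the authoritative source for the exact loading snippet.

```python
# Hypothetical loading sketch: assumes the `hqq` package (Half-Quadratic
# Quantization) and `transformers` are installed; the repository README
# documents the exact, supported loading code.
from hqq.engine.hf import HQQModelForCausalLM
from transformers import AutoTokenizer

repo_id = "PrunaAI/cognitivecomputations-Samantha-1.11-7b-HQQ-1bit-smashed"

# Load the 1-bit HQQ-quantized weights directly from the Hugging Face Hub.
model = HQQModelForCausalLM.from_quantized(repo_id, device="cuda")

# The tokenizer comes from the original base model.
tokenizer = AutoTokenizer.from_pretrained("cognitivecomputations/Samantha-1.11-7b")

prompt = "What is the meaning of life?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

# Per the Performance Tips above, the first call is slower and may use more
# memory because of one-time CUDA overheads; time subsequent calls instead.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The warm-up note reflects the Performance Tips above: latency and memory should be measured only after the first generation has completed.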
LLM Name: Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed
Repository 🤗: https://huggingface.co/PrunaAI/cognitivecomputations-Samantha-1.11-7b-HQQ-1bit-smashed
Base Model(s): cognitivecomputations/Samantha-1.11-7b
Model Size: 7b
Required VRAM: 1.5 GB
Updated: 2025-02-22
Maintainer: PrunaAI
Model Type: llama
Model Files: 1.5 GB
Quantization Type: 1bit
Model Architecture: LlamaForCausalLM
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.0
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
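
As a quick way to confirm the tokenizer-related fields above (tokenizer class, padding token, vocabulary size, max length), the following sketch loads the repository's tokenizer and prints them. The expected values in the comments come from the table above, not from running the code.

```python
# Sanity-check sketch for the tokenizer fields listed above; assumes the
# repository's tokenizer files match the table (verify against the repo's
# tokenizer_config.json if the printed values differ).
from transformers import AutoTokenizer

repo_id = "PrunaAI/cognitivecomputations-Samantha-1.11-7b-HQQ-1bit-smashed"
tokenizer = AutoTokenizer.from_pretrained(repo_id)

print(type(tokenizer).__name__)    # Tokenizer Class: LlamaTokenizer (or its fast variant)
print(tokenizer.vocab_size)        # Vocabulary Size: 32000
print(tokenizer.model_max_length)  # Model Max Length: 4096
print(tokenizer.pad_token)         # Padding Token: <unk>
```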

Best Alternatives to Cognitivecomputations Samantha 1.11 7B HQQ 1bit Smashed

Best Alternatives | Context / RAM | Downloads | Likes
Smaugv0.1 6.0bpw H6 EXL2 | 195K / 26.4 GB | 9 | 4
Smaugv0.1 5.0bpw H6 EXL2 | 195K / 22.3 GB | 6 | 3
Smaugv0.1 4.65bpw H6 EXL2 | 195K / 20.8 GB | 7 | 1
Smaugv0.1 3.0bpw H6 EXL2 | 195K / 13.9 GB | 5 | 1
Smaugv0.1 4.0bpw H6 EXL2 | 195K / 18 GB | 4 | 1
Smaugv0.1 8.0bpw H8 EXL2 | 195K / 34.9 GB | 4 | 1
Mistral 7B Openplatypus 1K | 32K / 29 GB | 1865 | 0
Mistral 7B OpenOrca 1K | 32K / 29 GB | 1866 | 3
Mistral 7B A U0.5 B2 Ver0.4 | 32K / 14.4 GB | 2015 | 0
Mistral 7B OP U1k Ver0.6 | 32K / 14.4 GB | 2012 | 0



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227