Deacon 34B by KnutJaegersberg


Autotrain compatible · Dataset: totally-not-an-llm/eve... · Endpoints compatible · Llama · Region: us · Safetensors · Sharded · Tensorflow

Deacon 34B Benchmarks

Scores (percentages) compare the model against the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Deacon 34B (KnutJaegersberg/Deacon-34B)

Deacon 34B Parameters and Internals

Model Type: text-generation
Additional Notes: This model has been llamafied and uses a Llama tokenizer. The name of the 3B version contains a pun.
Training Details:
- Data Sources: totally-not-an-llm/EverythingLM-data-V3
- Methodology: fine-tuned with NEFTune
- Model Architecture: llama tokenizer
LLM Name: Deacon 34B
Repository: https://huggingface.co/KnutJaegersberg/Deacon-34B
Model Size: 34B
Required VRAM: 69.2 GB
Updated: 2025-02-05
Maintainer: KnutJaegersberg
Model Type: llama
Model Files (15 shards): 4.8 GB (1-of-15), 4.8 GB (2-of-15), 5.0 GB (3-of-15), 4.8 GB (4-of-15), 4.8 GB (5-of-15), 5.0 GB (6-of-15), 4.8 GB (7-of-15), 4.8 GB (8-of-15), 5.0 GB (9-of-15), 4.8 GB (10-of-15), 4.8 GB (11-of-15), 5.0 GB (12-of-15), 4.8 GB (13-of-15), 4.8 GB (14-of-15), 1.2 GB (15-of-15)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.35.0
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 64000
Torch Data Type: float16
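The 69.2 GB VRAM figure is consistent with the float16 storage listed above: each parameter takes 2 bytes, so a 34B-parameter model needs roughly 68 GB just for the weights (the extra ~1 GB comes from embeddings and overhead; activations and KV cache are additional). A quick sanity-check sketch (the function is illustrative, not part of any library):

```python
def estimate_vram_gb(n_params: float, bytes_per_param: float = 2.0) -> float:
    """Rough VRAM needed just to hold the weights (no activations or KV cache)."""
    return n_params * bytes_per_param / 1e9

# 34B parameters stored as float16 (2 bytes each):
weights_gb = estimate_vram_gb(34e9)  # 68.0 GB, close to the 69.2 GB listed
```

The same arithmetic explains the shard sizes: fourteen ~5 GB shards plus one 1.2 GB shard sum to 69.2 GB.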

Quantized Models of Deacon 34B

| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| Deacon 34B GGUF | 2 | 147 | 14 GB |
| Deacon 34B GPTQ | 2 | 22 | 18 GB |
| Deacon 34B AWQ | 2 | 5 | 19 GB |
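The quantized checkpoint sizes above (14-19 GB versus 69.2 GB for float16) correspond to roughly 3-4.5 bits stored per weight. A small illustrative calculation (the function name is my own, not from any library):

```python
def bits_per_weight(file_size_gb: float, n_params: float = 34e9) -> float:
    """Average bits stored per parameter for a quantized checkpoint."""
    return file_size_gb * 8e9 / n_params

# e.g. a 14 GB GGUF of a 34B-parameter model:
bpw = bits_per_weight(14)  # roughly 3.3 bits/weight, i.e. a ~3-bit quant
```

By the same arithmetic, the 18 GB GPTQ file works out to about 4.2 bits/weight, typical of a 4-bit quantization with grouping overhead.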

Best Alternatives to Deacon 34B

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Casual Magnum 34B | 195K / 68.8 GB | 8 | 1 |
| 34B Beta | 195K / 69.2 GB | 4292 | 62 |
| Bagel Hermes 34B Slerp | 195K / 68.9 GB | 4450 | 1 |
| Bagel 34B V0.2 | 195K / 68.7 GB | 6164 | 40 |
| Smaug 34B V0.1 | 195K / 69.2 GB | 4313 | 60 |
| Yi 34B 200K | 195K / 68.9 GB | 5249 | 318 |
| Yi 34B 200K AEZAKMI V2 | 195K / 69.2 GB | 1342 | 12 |
| Smaug 34B V0.1 ExPO | 195K / 69.2 GB | 3056 | 0 |
| Faro Yi 34B | 195K / 69.2 GB | 420 | 46 |
| Bagel DPO 34B V0.5 | 195K / 68.7 GB | 2804 | 17 |
Note: a green score (e.g. "73.2") indicates that the model outperforms KnutJaegersberg/Deacon-34B.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227