Fireball 12B V1.01a by EpistemeAI2

Tags: AutoTrain compatible · Base model: epistemeai/fireball... · Base model (finetune): epistemeai... · en · Endpoints compatible · Mistral · PyTorch · Region: us · Sharded · TRL · Unsloth

Fireball 12B V1.01a Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Model evaluated: Fireball 12B V1.01a (EpistemeAI2/Fireball-12B-v1.01a)

Fireball 12B V1.01a Parameters and Internals

Model Type 
text generation, transformers
Use Cases 
Areas:
research, commercial applications
Additional Notes 
EpistemeAI/Fireball-12B is a pretrained base model without moderation mechanisms. For moderation, refer to the SentinelShield AI GitHub repository.
Supported Languages 
en (proficient)
Training Details 
Data Sources:
candenizkocak/code-alpaca-297k, yahma/alpaca-cleaned, reciperesearch/dolphin-sft-v0.1-preference
Methodology:
Fine-tuned with the ORPO (Odds Ratio Preference Optimization) method, using Unsloth for optimization (see the sketch after this block).
Context Length:
128000
Model Architecture:
Transformer with 40 layers, model dimension 5,120, head dimension 128, hidden (feed-forward) dimension 14,336, SwiGLU activation, 32 attention heads, 8 KV heads, a 131,072-token (~128K) vocabulary, and rotary embeddings with theta = 1M
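
For readers who want to see what the methodology above might look like in practice, here is a minimal, hedged sketch of an ORPO fine-tune of the base model using Unsloth and TRL. The LoRA settings, hyperparameters, 4-bit loading, split name, and the assumption that the preference dataset exposes prompt/chosen/rejected columns are illustrative choices, not details taken from the model card.

```python
from unsloth import FastLanguageModel   # import unsloth first so it can patch transformers
from datasets import load_dataset
from trl import ORPOConfig, ORPOTrainer

# Load the base model in 4-bit to keep memory modest (an assumption, not the card's recipe).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="EpistemeAI/Fireball-12B-v1.0",
    max_seq_length=4096,
    load_in_4bit=True,
)

# Attach LoRA adapters; rank and target modules are illustrative.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# ORPO trains on preference data; the prompt/chosen/rejected column names and the
# "train" split are assumptions about this dataset's layout.
dataset = load_dataset("reciperesearch/dolphin-sft-v0.1-preference", split="train")

trainer = ORPOTrainer(
    model=model,
    args=ORPOConfig(
        output_dir="fireball-12b-orpo",
        beta=0.1,                      # odds-ratio loss weight (illustrative)
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        learning_rate=5e-6,
        num_train_epochs=1,
        max_length=4096,
        max_prompt_length=2048,
    ),
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```

Note that newer TRL releases rename the tokenizer argument to processing_class; adjust to whichever version you have installed.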
Input Output 
Accepted Modalities:
text
Performance Tips:
Using a temperature of 0.3 is recommended for better performance.
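
To make the usage notes concrete, here is a minimal text-generation sketch with Hugging Face transformers that applies the recommended temperature of 0.3. The prompt, generation length, and device_map="auto" (which requires the accelerate package) are illustrative choices.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "EpistemeAI2/Fireball-12B-v1.01a"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,   # matches the published float16 weights (~24.5 GB of VRAM)
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Explain the difference between supervised and unsupervised learning."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.3,             # recommended setting from the notes above
)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```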
LLM Name: Fireball 12B V1.01a
Repository: https://huggingface.co/EpistemeAI2/Fireball-12B-v1.01a
Base Model(s): EpistemeAI/Fireball-12B-v1.0
Model Size: 12b
Required VRAM: 24.5 GB
Updated: 2025-01-16
Maintainer: EpistemeAI2
Model Type: mistral
Model Files: 5 shards of 4.9 GB each (1-of-5 through 5-of-5)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: float16
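
As a quick sanity check, the listed specs can be confirmed programmatically by loading the published config and tokenizer from the Hub. The "listed" values in the comments are simply those from the table above; Hub access is assumed.

```python
import transformers
from transformers import AutoConfig, AutoTokenizer

repo = "EpistemeAI2/Fireball-12B-v1.01a"

config = AutoConfig.from_pretrained(repo)
tok = AutoTokenizer.from_pretrained(repo)

print(transformers.__version__)         # listed: 4.44.2 (newer versions should also work)
print(config.architectures)             # listed: ['MistralForCausalLM']
print(config.num_hidden_layers)         # listed: 40
print(config.hidden_size)               # listed: 5120
print(config.intermediate_size)         # listed: 14336
print(config.num_attention_heads)       # listed: 32
print(config.num_key_value_heads)       # listed: 8
print(config.rope_theta)                # listed: 1000000.0
print(config.max_position_embeddings)   # listed: 1024000
print(config.vocab_size)                # listed: 131072
print(config.torch_dtype)               # listed: torch.float16
print(type(tok).__name__)               # listed: PreTrainedTokenizerFast
print(tok.pad_token)                    # listed: <pad>
```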

Best Alternatives to Fireball 12B V1.01a

Best Alternatives                   | Context / RAM   | Downloads | Likes
...r Nemo 12B Instruct R 21 09 24   | 1000K / 24.5 GB | 9207      | 99
...s PersonalityEngine V1.1.0 12B   | 1000K / 24.5 GB | 311       | 20
Captain Eris Violet V0.420 12B      | 1000K / 24.5 GB | 294       | 20
Mistral Nemo Bophades3 12B          | 1000K / 24.5 GB | 33        | 1
Mistral Nemo Kartoffel 12B          | 1000K / 24.5 GB | 36        | 1
MN 12B Mag Mell R1                  | 1000K / 24.5 GB | 2913      | 89
Saiga Nemo 12b                      | 1000K / 24.5 GB | 137454    | 33
Dolphin 2.9.3 Mistral Nemo 12B      | 1000K / 24.5 GB | 7708      | 96
MISCHIEVOUS 12B Mix 0.4v            | 1000K / 24.5 GB | 237       | 2
Captain BMO 12B                     | 1000K / 24.5 GB | 1131      | 18
Note: a green score (e.g. "73.2") means that the model is better than EpistemeAI2/Fireball-12B-v1.01a.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227