Mm4.ascii.star by liminerity


Tags: autotrain-compatible, base model (finetune): liminerity/mm4.star, conversational, datasets: gate369/alpaca-star, gate369/alpaca-star-as..., en, endpoints-compatible, llama, region: us, safetensors, sharded, tensorflow, trl, unsloth

Mm4.ascii.star Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Mm4.ascii.star Parameters and Internals

Model Type: text-generation-inference, transformers, unsloth, llama, trl
Training Details
Data Sources: gate369/alpaca-star-ascii, gate369/Alpaca-Star
Methodology: This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
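The Unsloth + TRL recipe on the alpaca-star datasets can be sketched as below. This is a hedged reconstruction, not the author's actual script: the Alpaca field names (`instruction`, `output`), the LoRA hyperparameters, and the sequence length are all assumptions.

```python
def format_alpaca(example):
    """Render one record into an Alpaca-style prompt.

    The field names are assumptions -- the gate369/alpaca-star schema
    is not documented in this listing.
    """
    return (
        "### Instruction:\n" + example["instruction"] + "\n\n"
        "### Response:\n" + example["output"]
    )


def train():
    # Deferred imports: unsloth/trl/datasets are only needed for the
    # actual fine-tuning run.
    from unsloth import FastLanguageModel
    from datasets import load_dataset
    from trl import SFTTrainer
    from transformers import TrainingArguments

    # Load the base model; Unsloth patches it for faster training.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="liminerity/mm4.star",
        max_seq_length=2048,  # assumption; the card lists a 32768 max length
        load_in_4bit=True,
    )
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

    dataset = load_dataset("gate369/alpaca-star", split="train")
    dataset = dataset.map(lambda ex: {"text": format_alpaca(ex)})

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset,
        dataset_text_field="text",
        max_seq_length=2048,
        args=TrainingArguments(
            output_dir="outputs",
            max_steps=60,
            per_device_train_batch_size=2,
        ),
    )
    trainer.train()
```

Calling `train()` requires a GPU with unsloth installed; the prompt formatter can be exercised on its own.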
LLM Name: Mm4.ascii.star
Repository: https://huggingface.co/liminerity/mm4.ascii.star
Base Model(s): liminerity/mm4.star
Model Size: 3B
Required VRAM: 6 GB
Updated: 2025-01-15
Maintainer: liminerity
Model Type: llama
Model Files: 5.0 GB (1 of 2), 1.0 GB (2 of 2)
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.40.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 122753
Torch Data Type: bfloat16
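The listed numbers are internally consistent: bfloat16 weights take 2 bytes per parameter, so a 3B-parameter model needs roughly 2 × 3 × 10⁹ bytes ≈ 6 GB for weights alone, matching the "Required VRAM: 6 GB" entry. A minimal loading sketch with transformers follows; it assumes the published repo loads through `AutoModelForCausalLM` (the listed architecture is LlamaForCausalLM), and `load_model` downloads ~6 GB of shards when actually called.

```python
def estimated_vram_gb(n_params: float, bytes_per_param: int) -> float:
    """Rough weight-memory footprint, ignoring activations and KV cache."""
    return n_params * bytes_per_param / 1e9


def load_model(repo: str = "liminerity/mm4.ascii.star"):
    # Deferred imports so the sketch is importable without torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
        device_map="auto",
    )
    return model, tokenizer


# bfloat16 weights: 2 bytes/param, so ~6 GB for 3B params before
# activations -- matching the "Required VRAM: 6 GB" line above.
print(estimated_vram_gb(3e9, 2))  # 6.0
```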

Best Alternatives to Mm4.ascii.star

Best Alternatives                  Context / RAM     Downloads   Likes
Llama 3.2 3B Instruct              128K / 6.5 GB       1152433     884
Llama 3.2 3B                       128K / 6.5 GB        416524     452
Hermes 3 Llama 3.2 3B              128K / 6.5 GB         22279     124
Orca Mini V9 5 3B Instruct         128K / 6.5 GB           350       6
Dolphin3.0 Llama3.2 3B             128K / 6.5 GB          3796      23
Llama Deepsync 3B                  128K / 6.5 GB           378      15
Calme 3.1 Llamaloi 3B              128K / 10.6 GB         2623       1
Llama 3.2 Korean Bllossom 3B       128K / 6.5 GB         23736     138
Orca Mini V9 6 3B Instruct         128K / 6.5 GB            60       4
Llama 3.2 3B RP DeepThink          128K / 7.2 GB           217       1
Note: a green score (e.g. "73.2") means that the model outperforms liminerity/mm4.ascii.star.



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227