M7 1.58bit 6x70m by liminerity


Tags: autotrain-compatible · endpoints-compatible · frankenmoe · lazymergekit · merge · mergekit · mixtral · moe · region:us · safetensors · sharded · tensorflow
Base model (finetune): liminerity/Bitnet-M7-70m

M7 1.58bit 6x70m Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
M7 1.58bit 6x70m (liminerity/m7-1.58bit-6x70m)

M7 1.58bit 6x70m Parameters and Internals

Model Type: moe, frankenmoe
LLM Name: M7 1.58bit 6x70m
Repository: https://huggingface.co/liminerity/m7-1.58bit-6x70m
Base Model(s): liminerity/Bitnet-M7-70m
Model Size: 134.1m
Required VRAM: 0.5 GB
Updated: 2025-02-22
Maintainer: liminerity
Model Type: mixtral
Model Files: 0.5 GB (1 of 1)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 768
Model Max Length: 768
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float32
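
The figures above are internally consistent: at float32 precision, 134.1M parameters occupy roughly 134.1e6 × 4 bytes ≈ 0.54 GB, which matches the 0.5 GB shard and VRAM requirement. Below is a minimal loading sketch using the standard transformers API; the repo id comes from the Repository field, while the prompt and generation settings are illustrative assumptions, not from the model card.

```python
# Minimal sketch: loading liminerity/m7-1.58bit-6x70m with transformers
# (assumes transformers >= 4.41.2, per the table above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "liminerity/m7-1.58bit-6x70m"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer
model = AutoModelForCausalLM.from_pretrained(        # MixtralForCausalLM
    repo_id,
    torch_dtype=torch.float32,  # Torch Data Type: float32
)

prompt = "The quick brown fox"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,  # keep total length within the 768-token context
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```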

Rank the M7 1.58bit 6x70m Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass, and various public Git repositories.
Release v20241227