Matter 0.1 Slim 7B A by 0-hero

Tags: Autotrain compatible · Conversational · Dataset: 0-hero/matter-0.1-slim... · En · Endpoints compatible · Mistral · PyTorch · Region: us · Safetensors

Matter 0.1 Slim 7B A Parameters and Internals

Model Type: language model, AI assistant
Additional Notes: Supports function calling, with dedicated special tokens marking the start of a function call and of the function response (parsing sketch below).
Supported Languages: en (English)
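This listing does not name the function-calling tokens. The Matter 0.1 model cards describe `<|begin_func|>`/`<|end_func|>` wrappers for calls (and `<|begin_func_response|>`/`<|end_func_response|>` for results), but treat those markers, and the JSON payload shape, as assumptions here rather than confirmed details. A minimal parsing sketch under that assumption:

```python
# Minimal sketch: pull a JSON function call out of a generated reply.
# The <|begin_func|>/<|end_func|> markers and the {"name", "arguments"}
# payload shape are assumptions taken from the Matter 0.1 model cards;
# this listing does not confirm them.
import json
import re

BEGIN, END = "<|begin_func|>", "<|end_func|>"

def extract_function_call(reply: str):
    """Return the parsed function-call dict, or None if the reply has none."""
    match = re.search(re.escape(BEGIN) + r"(.*?)" + re.escape(END), reply, re.DOTALL)
    return json.loads(match.group(1)) if match else None

reply = '<|begin_func|>{"name": "get_weather", "arguments": {"city": "Paris"}}<|end_func|>'
print(extract_function_call(reply))  # {'name': 'get_weather', 'arguments': {'city': 'Paris'}}
```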
Training Details
Data Sources: 0-hero/Matter-0.1-Slim-A (loading sketch below)
Data Volume: ~285k rows, curated from over 35 datasets after analyzing >6B tokens
Methodology: fine-tuning
Training Time: ~15 hours
Hardware Used: 4x A100 (80 GB)
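To inspect the curated training data directly, the dataset named above is published on Hugging Face. A minimal sketch, assuming the `datasets` library is installed and that the dataset exposes a standard `train` split (the split name and row layout are assumptions, not stated in this listing):

```python
# Minimal sketch: peek at the 0-hero/Matter-0.1-Slim-A training data.
# Assumes the `datasets` library is installed and the dataset has a "train"
# split; the exact columns are not described in this listing.
from datasets import load_dataset

ds = load_dataset("0-hero/Matter-0.1-Slim-A", split="train")
print(ds)     # row count (~285k expected) and column names
print(ds[0])  # inspect one curated row
```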
Input Output
Input Format: ChatML prompt format (example below)
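ChatML wraps each turn in `<|im_start|>` and `<|im_end|>` markers, with the role on the first line of the turn. A minimal sketch of building such a prompt by hand; the system message is an illustrative placeholder, not a documented default:

```python
# Minimal sketch of the ChatML prompt layout this model expects.
# The system message below is an illustrative placeholder.
system = "You are a helpful assistant."
user = "Summarize the benefits of fine-tuning a 7B model."

prompt = (
    f"<|im_start|>system\n{system}<|im_end|>\n"
    f"<|im_start|>user\n{user}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)
print(prompt)
```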
LLM Name: Matter 0.1 Slim 7B A
Repository: https://huggingface.co/0-hero/Matter-0.1-Slim-7B-A
Model Size: 7B
Required VRAM: 14.5 GB
Updated: 2024-11-09
Maintainer: 0-hero
Model Type: mistral
Model Files: 14.5 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32006
Torch Data Type: bfloat16
Matter 0.1 Slim 7B A (0-hero/Matter-0.1-Slim-7B-A)
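Putting the specs above together (MistralForCausalLM, ~14.5 GB of bfloat16 weights, 32768-token context, Transformers 4.38.2), here is a minimal loading and generation sketch. It assumes a GPU with roughly 16 GB of memory, `accelerate` installed for `device_map="auto"`, and a ChatML chat template shipped in the tokenizer config; the sampling settings are illustrative, not recommended defaults.

```python
# Minimal sketch: load Matter-0.1-Slim-7B-A in bfloat16 and generate a reply.
# Assumes a GPU that can hold the ~14.5 GB of bfloat16 weights and that the
# tokenizer config defines a ChatML chat template; sampling settings are
# illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0-hero/Matter-0.1-Slim-7B-A"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires the accelerate package
)

messages = [{"role": "user", "content": "What is function calling in an LLM?"}]
# apply_chat_template renders the ChatML prompt defined in the tokenizer config.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

If the tokenizer does not ship a chat template, fall back to the hand-built ChatML prompt sketched earlier and tokenize that string directly.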

Best Alternatives to Matter 0.1 Slim 7B A

| Best Alternatives | Context / RAM | Downloads | Likes |
| --- | --- | --- | --- |
| ...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 2600 | 6 |
| MegaBeam Mistral 7B 512K | 512K / 14.4 GB | 4548 | 41 |
| SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 88 | 1 |
| SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 55 | 1 |
| ...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0 |
| MegaBeam Mistral 7B 300K | 282K / 14.4 GB | 3093 | 15 |
| Hebrew Mistral 7B 200K | 256K / 30 GB | 3162 | 15 |
| Astral 256K 7B | 250K / 14.4 GB | 15 | 0 |
| Astral 256K 7B V2 | 250K / 14.4 GB | 8 | 0 |
| Boptruth Agatha 7B | 128K / 14.4 GB | 652 | 0 |

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241110