QuietStar Project by LeroyDyer


Tags: autotrain compatible, biology, ca, chemistry, climate, code, custom code, en, es, ha, ig, legal, medical, mistral, mistral quiet, mistral star, mixtral, pt, question-answer, region:us, sequence-classification, spydazweb-ai, sw, token-classification, zu


QuietStar Project Parameters and Internals

Model Type 
Question-Answer, Token-Classification, Sequence-Classification, Text Generation Inference
Use Cases 
Areas:
research, commercial applications
Applications:
role play, medical resources, technological development, historical document storage
Primary Use Cases:
Constructing shelters, Developing technology, Medical diagnosis and reporting, Historical data retrieval
Additional Notes 
The model is trained for multi-task operation, using Chain of Thought reasoning, agent generation, Markdown with Mermaid diagrams, and internal preprocessing with RAG systems.
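As a rough illustration of the RAG-style preprocessing and chain-of-thought prompting described above, the following is a minimal sketch. The `retrieve` helper and the prompt wording are hypothetical placeholders, not part of the model card.

```python
def retrieve(query: str, k: int = 3) -> list[str]:
    """Hypothetical retrieval helper: return the k passages most relevant to `query`."""
    return ["passage one", "passage two", "passage three"][:k]


def build_prompt(question: str) -> str:
    """Assemble a retrieval-augmented prompt with an explicit chain-of-thought instruction."""
    context = "\n".join(f"- {p}" for p in retrieve(question))
    return (
        "Use only the retrieved context below. Think step by step, then give the final answer.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )


print(build_prompt("How large is the model's context window?"))
```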
Supported Languages 
en (full), sw (full), ig (full), zu (full), ca (full), es (full), pt (full), ha (full)
Training Details 
Data Sources:
Hugging Face hub, Kaggle
Methodology:
Chain of thought, graph of thoughts, tree of thoughts, dual-agent response generation, agent ranking, function calling, self-guiding methods.
Context Length:
32000
Release Notes 
Version:
v0.1
Notes:
32k context window, Rope-theta = 1e6, No Sliding-Window Attention.
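The release notes translate directly into a model configuration. Below is a minimal sketch assuming the standard Hugging Face `MistralConfig` (the model is tagged Mistral with an AutoModelForCausalLM architecture); the exact configuration class and any value not stated on this page are assumptions.

```python
from transformers import MistralConfig

# Sketch of a configuration matching the release notes above.
# Only values stated on this page are set; everything else is left at
# MistralConfig defaults, so this is illustrative rather than the exact config.
config = MistralConfig(
    max_position_embeddings=32768,  # 32k context window
    rope_theta=1e6,                 # Rope-theta = 1e6
    sliding_window=None,            # no sliding-window attention
    vocab_size=32768,               # vocabulary size listed below
)
print(config)
```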
LLM Name: QuietStar Project
Repository: 🤗 https://huggingface.co/LeroyDyer/QuietStar_Project
Updated: 2025-01-17
Maintainer: LeroyDyer
Supported Languages: en sw ig zu ca es pt ha
Model Architecture: AutoModelForCausalLM
License: mit
Model Max Length: 32768
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32768
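Given the repository and architecture listed above, a minimal loading sketch with the Hugging Face transformers Auto classes might look like the following; the prompt and generation settings are illustrative, not taken from the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LeroyDyer/QuietStar_Project"  # repository listed above

# The "custom code" tag suggests trust_remote_code=True may be required.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

prompt = "List the key steps for constructing an emergency shelter."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)  # generation settings are illustrative
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```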

Best Alternatives to QuietStar Project

Best Alternatives              Context / RAM    Downloads  Likes
Tiny Llama Miniguanaco         2K / 2.2 GB      16         1
Fine Tune Sentimental Llama    0K / 0 GB        7          0
VLM2Vec LoRA                   0K / 0 GB        92         7
Finetuned Llava Lora           0K / 0.1 GB      22         0
Alphace Email                  0K / 0.1 GB      20         0
Qwen7B Haiguitang              0K / 15.3 GB     19         0
Modelv3                        0K / 13.5 GB     17         0
Accel                          0K / 0 GB        17         0
Q Align Cap Iaa Lora Scst      0K / 0.2 GB      61         0
Partis Goodone                 0K / 16.1 GB     15         1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227