**Introducing Asterisk: A Hybrid ASPP-Attention Architecture!**
We're excited to launch **Asterisk**, a new language model from **NoesisLab**, now on **Hugging Face**! Built on top of *SmolLM2-135M-Instruct*, Asterisk integrates **Adjacency-Structured Parallel Propagation (ASPP)** with standard attention to bring structured reasoning power into language modeling.
**Key Highlights:**
- **Hybrid Architecture:** fuses graph-centric ASPP local reasoning with global attention for richer representations.
- **Enhanced Reasoning:** ASPP enables iterative local state evolution that complements standard transformer layers.
- **Efficient Design:** ~171M parameters, supervised fine-tuned on the Capybara dataset.
- **Flexible & Open:** Apache-2.0 licensed and ready to integrate via Hugging Face Transformers.
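Since the model is published on the Hub, a minimal usage sketch with the Transformers library might look like the following. This is an assumption-laden example, not official documentation: the prompt is made up, and because ASPP is a custom architecture, loading may require `trust_remote_code=True`.

```python
# Minimal sketch: load Asterisk from the Hub and generate text.
# Assumptions: the repo exposes a causal-LM head, and the custom ASPP
# architecture may need trust_remote_code=True to load its modeling code.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "NoesisLab/Asterisk"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Hypothetical prompt, purely for illustration.
prompt = "Explain graph-based reasoning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

At ~171M parameters the model should run comfortably on CPU, so no device placement is shown here.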
Asterisk showcases how hybrid operators, inspired by theoretical frameworks like the *Asterisk Operator*, can bring structured reasoning into modern LMs in a scalable way.
Try it out, explore the code, and start building: *huggingface.co/NoesisLab/Asterisk*
#MachineLearning #AI #LanguageModels #HybridAI #HuggingFace #NoesisLab