Show HN: SGR – A Linear-Complexity "Living Cell" Outperforming Transformers https://ift.tt/HxCseSi

Show HN: SGR – A Linear-Complexity "Living Cell" Outperforming Transformers

I am developing an architecture called Sparse Gated Resonance (SGR). It is a sequence modeling approach designed to avoid the quadratic scaling of traditional self-attention. I have been benchmarking a 722k-parameter SGR against a 921k-parameter Transformer on Victor Hugo's "Notre-Dame de Paris" (English).

SGR replaces the attention mechanism with a "Causal Pulse": gated 1D convolutions generate a navigation vector that resonates against a "brain-map" of character embeddings. This allows the model to maintain a "Living Cell" state that updates with linear complexity (a rough sketch of this kind of block appears after the post).

Full source and implementation: https://ift.tt/VhEGSwq

Benchmarking Data ("Notre-Dame de Paris", Step 3900):

ARCH | LOSS   | PPL  | ENT    | TIME
SGR  | 1.4481 | 4.26 | 1.5476 | 19.0 ms
STD  | 2.0275 | 7.59 | 2.1476 | 40.3 ms

(STD = the Transformer baseline.)

Semantic Comparison (generation from "Quasimodo"):

SGR: "Quasimodo. Then minds that the accasteady which which the"
STD: "Quasimododo ng, o uer tre the todo hemo’He wand at tine."

Technical Observations:

- Computational Efficiency: SGR maintains a significant latency advantage, consistently running at ~19 ms versus the Transformer's ~40 ms, which reflects the linear cost of the pulse compared to quadratic attention.
- Convergence Quality: by Step 3700, SGR had reached a perplexity (PPL) of 4.46, whereas the Transformer lagged at 8.36. SGR already produces recognizable English phrases and punctuation, while the Transformer still exhibits "stuttering" artifacts (e.g., "Quasimodododod").
- Entropy Stability: SGR has stabilized at an entropy of ~1.54, which I consider the optimal "Mastery Zone" for English text; the Transformer's higher entropy (~2.14) correlates with its lack of structural coherence.

I am seeking an endorsement to publish a formal paper on this architecture to arXiv (CS.ML). I believe these results demonstrate that "Living Cell" resonance models can outperform attention in parameter-constrained and latency-sensitive environments. If you are a researcher willing to endorse or review the mathematical formalization, please contact me via GitHub.

January 22, 2026 at 02:03AM
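For readers who want a concrete picture before opening the repository, here is a minimal sketch of the general kind of mechanism the post describes: a gated causal 1D convolution that produces a per-step navigation vector and scores it against the character-embedding table. This is not taken from the SGR code; every class name, dimension, and gating choice below is a hypothetical PyTorch illustration of the linear-complexity pattern, not the author's implementation.

```python
# Hypothetical sketch of a gated causal 1D convolution that scores each step's
# "navigation" vector against a character-embedding table. Illustrative only;
# NOT the SGR implementation linked above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedCausalPulse(nn.Module):
    def __init__(self, vocab_size: int, dim: int, kernel_size: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)          # "brain-map" of character embeddings
        self.pad = kernel_size - 1                          # left-pad so the convolution stays causal
        self.conv_value = nn.Conv1d(dim, dim, kernel_size)  # content path
        self.conv_gate = nn.Conv1d(dim, dim, kernel_size)   # gating path

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer character ids
        x = self.embed(tokens).transpose(1, 2)              # (batch, dim, seq_len)
        x = F.pad(x, (self.pad, 0))                         # causal: pad on the left only
        nav = torch.tanh(self.conv_value(x)) * torch.sigmoid(self.conv_gate(x))
        nav = nav.transpose(1, 2)                           # (batch, seq_len, dim) navigation vectors
        # "Resonance": score each navigation vector against every character embedding.
        return nav @ self.embed.weight.t()                  # (batch, seq_len, vocab_size) logits

# Toy usage: next-character logits for a random batch.
model = GatedCausalPulse(vocab_size=96, dim=64)
logits = model(torch.randint(0, 96, (2, 128)))              # shape (2, 128, 96)
```

Because each output position depends only on a fixed window of previous characters, compute grows linearly with sequence length, which is the property the post contrasts with quadratic self-attention.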
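The LOSS, PPL, and ENT columns appear to be related in the standard way for character-level language models: PPL = exp(LOSS) in nats (exp(1.4481) ≈ 4.26 and exp(2.0275) ≈ 7.59, matching the table), and ENT is presumably the mean entropy of the predicted next-character distribution. A minimal sketch under that assumption (the function name and the interpretation of ENT are mine, not the repository's):

```python
# Hypothetical metric computation matching the table columns, assuming
# LOSS = mean cross-entropy (nats), PPL = exp(LOSS), and ENT = mean entropy
# of the predicted next-character distribution.
import torch
import torch.nn.functional as F

def step_metrics(logits: torch.Tensor, targets: torch.Tensor) -> dict:
    # logits: (batch, seq_len, vocab_size); targets: (batch, seq_len) next-character ids
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
    probs = F.softmax(logits, dim=-1)
    ent = -(probs * probs.clamp_min(1e-9).log()).sum(-1).mean()  # mean predictive entropy (nats)
    return {"loss": loss.item(), "ppl": loss.exp().item(), "ent": ent.item()}

# e.g. a mean loss of 1.4481 gives exp(1.4481) ≈ 4.26, the SGR perplexity in the table.
```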


Show HN: Take a Break – a gentle extension to stop autoplay late at night https://ift.tt/Poz1qSi

Show HN: Take a Break – a gentle extension to stop autoplay late at night

Hey HN — I built Take a Break, a Chrome extension that starts a ti...