S0 Tuning: Zero-Overhead Adaptation of Hybrid Recurrent-Attention Models | ScienceToStartup