ZeroS: Zero-Sum Linear Attention for Efficient Transformers | ScienceToStartup