Hybrid Linear Attention Done Right: Efficient Distillation and Effective Architectures for Extremely Long Contexts | ScienceToStartup