Recent advances in assistive robotics focus on enhancing user interaction and personalization, addressing critical challenges in mobility and daily living. Work on gaze-driven neck exoskeletons leverages virtual reality to refine head-movement predictions from eye gaze, paving the way for more intuitive user interfaces. Similarly, exosuits are being equipped with machine-learning modules that adapt to a user's walking mode from minimal sensory input, improving their functionality in real-world settings. Quadrupedal assistance robots are also gaining traction, with shared-autonomy systems that let users remain independent while performing complex tasks. Additionally, efforts to integrate verbal communication into robotic guide dogs aim to enable collaborative navigation for visually impaired users. Together, these innovations signal a shift toward more adaptive, user-centered designs in assistive technology, with the potential to significantly improve quality of life for people with disabilities.