Recent advances in optimization algorithms increasingly address complex real-world challenges across domains. The integration of large language models into combinatorial optimization, particularly for vehicle routing problems, showcases a trend toward leveraging AI to enhance computational efficiency and solution quality. The development of certificate-guided pruning techniques for stochastic optimization highlights a shift toward more principled methods with optimality guarantees, which is crucial for applications in fields like finance and engineering. Hybrid approaches combining genetic algorithms with graph neural networks are also gaining traction, demonstrating significant improvements in scheduling tasks by navigating search spaces more effectively. Furthermore, the introduction of gradient-regularized natural gradients emphasizes stability and generalization in deep learning, indicating a maturation of optimization techniques that prioritize robustness. Collectively, these efforts reflect a concerted push toward more efficient, reliable, and adaptable optimization strategies capable of tackling the increasing complexity of modern computational problems.
In game theory, imperfect-recall decision problems model situations in which an agent forgets information it held before. They encompass games such as the "absentminded driver" and team games with l...
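The absentminded driver admits a short worked example. The sketch below assumes the classic Piccione–Rubinstein payoff convention (0 for exiting at the first intersection, 4 at the second, 1 for continuing past both), which is not stated in the abstract itself; under imperfect recall the driver must use the same continue-probability at both intersections:

```python
# Classic absentminded driver: the driver passes two identical-looking exits
# and cannot remember whether one was already passed, so the same
# continue-probability p must be used at every intersection.
# Payoffs (one common convention): exit first -> 0, exit second -> 4,
# continue past both -> 1.

def expected_payoff(p: float) -> float:
    """Expected payoff when the driver continues with probability p."""
    exit_first = (1 - p) * 0          # exits immediately
    exit_second = p * (1 - p) * 4     # continues once, then exits
    never_exit = p * p * 1            # continues at both intersections
    return exit_first + exit_second + never_exit

# Grid search over the single behavioral parameter p.
best_p = max((i / 1000 for i in range(1001)), key=expected_payoff)
print(best_p, expected_payoff(best_p))  # optimum is p = 2/3 with value 4/3
```

The interesting wrinkle of imperfect recall is visible here: the optimal plan is strictly randomized, which cannot happen in a one-player perfect-recall problem.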
We introduce Sven (Singular Value dEsceNt), a new optimization algorithm for neural networks that exploits the natural decomposition of loss functions into a sum over individual data points, rather th...
Stochastic Multi-Objective Optimization (SMOO) is critical for decision-making that trades off multiple, potentially conflicting objectives in uncertain environments. SMOO aims at identifying the Pareto fr...
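The Pareto-front idea the abstract refers to can be made concrete with a minimal non-dominated filter over a finite sample of objective vectors; this is a generic illustration, not the SMOO method of the paper:

```python
def dominates(a, b):
    """True if a dominates b (minimization): a is no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a finite set of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Two objectives to minimize, e.g. (cost, risk).
sample = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(pareto_front(sample))  # [(1, 5), (2, 3), (4, 1)]
```

Note that (3, 4) is dropped because (2, 3) beats it in both objectives; the surviving points are exactly the rational trade-offs.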
The Capacitated Vehicle Routing Problem (CVRP), a fundamental combinatorial optimization challenge, focuses on optimizing fleet operations under vehicle capacity constraints. While extensively studied...
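For readers unfamiliar with the CVRP, a tiny construction heuristic shows the core constraint structure. This nearest-neighbor sketch is a textbook baseline, not the paper's approach; the instance data and function name are illustrative, and every demand is assumed to fit within one vehicle's capacity:

```python
import math

def nearest_neighbor_cvrp(coords, demands, capacity):
    """Greedy CVRP construction: from the depot (index 0), repeatedly extend
    the current route to the nearest unvisited customer whose demand still
    fits the remaining capacity; otherwise return to the depot and open a
    new route. Assumes each individual demand <= capacity."""
    def dist(i, j):
        return math.dist(coords[i], coords[j])

    unvisited = set(range(1, len(coords)))
    routes = []
    while unvisited:
        route, load, here = [], 0, 0  # each route starts at the depot
        while True:
            feasible = [c for c in unvisited if load + demands[c] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(here, c))
            route.append(nxt)
            load += demands[nxt]
            here = nxt
            unvisited.remove(nxt)
        routes.append(route)
    return routes

coords = [(0, 0), (1, 0), (2, 0), (0, 2), (0, 3)]   # depot + 4 customers
demands = [0, 4, 3, 3, 4]
routes = nearest_neighbor_cvrp(coords, demands, capacity=7)
print(routes)  # [[1, 2], [3, 4]]
```

Modern solvers (and the LLM-guided methods the digest mentions) improve on such constructions with local search, but the capacity-feasibility check above is the defining constraint in all of them.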
We study black-box optimization of Lipschitz functions under noisy evaluations. Existing adaptive discretization methods implicitly avoid suboptimal regions but do not provide explicit certificates of...
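What an "explicit certificate" of suboptimality means can be shown in a one-dimensional, noiseless simplification (the paper's noisy setting would additionally need confidence bounds around each value):

```python
# If f is L-Lipschitz, each sample (x_i, f(x_i)) implies the lower bound
# f(x) >= f(x_i) - L*|x - x_i|. A candidate x is *provably* suboptimal --
# a certificate, not a heuristic -- once its best lower bound exceeds the
# best value observed so far.

def certified_pruned(candidates, samples, values, L, best_seen):
    """Return the candidates that carry a certificate of suboptimality."""
    pruned = []
    for x in candidates:
        lower_bound = max(v - L * abs(x - s) for s, v in zip(samples, values))
        if lower_bound > best_seen:
            pruned.append(x)
    return pruned

f = lambda x: (x - 0.3) ** 2          # true minimizer at x = 0.3
L = 2.0                                # valid Lipschitz constant on [0, 1]
samples = [0.0, 0.25, 0.5, 0.75, 1.0]
values = [f(s) for s in samples]
best_seen = min(values)                # f(0.25) = 0.0025

candidates = [i / 10 for i in range(11)]
print(certified_pruned(candidates, samples, values, L, best_seen))
```

The true minimizer x = 0.3 is never pruned, since its certified lower bound stays below the incumbent; regions far from the optimum are eliminated with a proof rather than by implicit avoidance.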
Bayesian optimization is a data-efficient technique that has been shown to be extremely powerful for optimizing expensive, black-box, and possibly noisy objective functions. Many applications involve opt...
Recent advances in spectral optimization, notably Muon, have demonstrated that constraining update steps to the Stiefel manifold can significantly accelerate training and improve generalization. Howev...
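The "Stiefel manifold" constraint here amounts to orthogonalizing the update matrix, i.e. replacing a gradient G = U S Vᵀ with U Vᵀ. A pure-Python sketch using the cubic Newton–Schulz iteration (a simpler polynomial than the tuned variants used in practice, and shown only for a square matrix) illustrates the operation:

```python
import math

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def orthogonalize(G, steps=20):
    """Approximate the orthogonal polar factor U V^T of G using the cubic
    Newton-Schulz iteration X <- 1.5 X - 0.5 X X^T X. Dividing by the
    Frobenius norm first puts every singular value in (0, 1], where the
    iteration drives them all to 1."""
    fro = math.sqrt(sum(x * x for row in G for x in row))
    X = [[x / fro for x in row] for row in G]
    for _ in range(steps):
        XXtX = matmul(matmul(X, transpose(X)), X)
        X = [[1.5 * x - 0.5 * y for x, y in zip(rx, ry)] for rx, ry in zip(X, XXtX)]
    return X

G = [[3.0, 1.0], [1.0, 2.0]]   # stand-in for a gradient matrix
U = orthogonalize(G)
print(matmul(U, transpose(U)))  # approximately the 2x2 identity
```

Only matrix multiplications are needed, which is why this style of orthogonalization is attractive on accelerators compared with an explicit SVD.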
This paper investigates the impact of hybridizing a multi-modal Genetic Algorithm with a Graph Neural Network for timetabling optimization. The Graph Neural Network is designed to encapsulate general ...
Gradient regularization (GR) has been shown to improve the generalizability of trained models. While Natural Gradient Descent has been shown to accelerate optimization in the initial phase of training...
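Plain gradient regularization, in isolation from the natural-gradient component, is simple to state: minimize the loss plus a penalty on its squared gradient norm, which biases optimization toward flat minima. A one-dimensional finite-difference sketch (illustrative only, not the paper's method):

```python
def fd_grad(f, w, eps=1e-4):
    """Central finite-difference approximation of f'(w)."""
    return (f(w + eps) - f(w - eps)) / (2 * eps)

def gradient_regularized(f, lam):
    """Penalize the squared gradient norm: minimize f(w) + lam * f'(w)^2,
    steering descent toward flat regions of the loss surface."""
    return lambda w: f(w) + lam * fd_grad(f, w) ** 2

loss = lambda w: (w - 1.0) ** 2
objective = gradient_regularized(loss, lam=0.1)

w = 3.0
for _ in range(200):
    w -= 0.1 * fd_grad(objective, w)  # gradient descent on regularized loss
print(w)  # converges to the shared minimizer w = 1
```

On this quadratic the penalty leaves the minimizer unchanged and merely rescales the curvature; the effect GR is valued for (preferring flatter of several minima) only shows up on non-convex losses, but the mechanics are the same.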
Hyperparameter tuning is a critical yet computationally expensive step in training neural networks, particularly when the search space is high dimensional and nonconvex. Metaheuristic optimization alg...
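The simplest metaheuristic baseline for this problem is log-uniform random search; the toy "training run" below (function names and the quadratic objective are illustrative) shows the structure any such tuner shares with more elaborate metaheuristics:

```python
import math, random

def train_loss(lr, steps=50):
    """Toy training run: gradient descent on f(w) = w^2 from w = 5;
    returns the final loss. Stalls or oscillates for badly chosen lr."""
    w = 5.0
    for _ in range(steps):
        w -= lr * 2 * w
    return w * w

def random_search(trials=50, lo=1e-4, hi=1.0, seed=0):
    """Log-uniform random search over the learning rate, a simple
    metaheuristic baseline for hyperparameter tuning."""
    rng = random.Random(seed)
    best_lr, best_loss = None, float("inf")
    for _ in range(trials):
        lr = math.exp(rng.uniform(math.log(lo), math.log(hi)))
        loss = train_loss(lr)
        if loss < best_loss:
            best_lr, best_loss = lr, loss
    return best_lr, best_loss

best_lr, best_loss = random_search()
print(best_lr, best_loss)
```

Sampling in log space matters because learning rates vary over orders of magnitude; population-based metaheuristics (evolutionary strategies, particle swarms) replace the independent draws with search guided by earlier trials, but evaluate the same expensive inner loop.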