The Complexity of Dynamic Matrix Problems

This project aims to advance dynamic data structures for efficient matrix operations and to accelerate optimization algorithms in both convex and non-convex settings, particularly for deep neural networks and AI applications.

Grant
€1,439,413
2022

Project details

Introduction

Modern data analysis and optimization rely on our ability to process rapidly evolving dynamic datasets, often involving matrix operations in very high dimensions. Dynamic data structures enable fast information retrieval on these huge databases by maintaining implicit information on the underlying data. As such, understanding the power and limitations of dynamic (matrix) data structures is a fundamental question in theory and practice.
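
As a toy illustration of this paradigm (our own sketch, not taken from the proposal), consider maintaining the Gram matrix M = A^T A of a data matrix A under row insertions: each insertion is a rank-one update costing O(d^2), after which queries such as ||Ax||^2 are answered from the maintained summary without revisiting the inserted rows.

    import numpy as np

    class DynamicGram:
        """Toy dynamic matrix data structure: maintains M = A^T A
        implicitly under row insertions."""

        def __init__(self, d):
            self.M = np.zeros((d, d))

        def insert_row(self, a):
            # Appending row a to A changes A^T A by the rank-one term a a^T.
            self.M += np.outer(a, a)

        def quadratic_form(self, x):
            # x^T (A^T A) x == ||A x||^2, read off the O(d^2) summary.
            return float(x @ self.M @ x)

    rng = np.random.default_rng(0)
    ds = DynamicGram(d=4)
    rows = rng.standard_normal((100, 4))
    for a in rows:
        ds.insert_row(a)
    x = rng.standard_normal(4)
    assert np.isclose(ds.quadratic_form(x), np.linalg.norm(rows @ x) ** 2)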

Despite decades of research, there are still very basic dynamic problems whose complexity is (exponentially) far from understood. Bridging this gap is one of the centerpieces of this proposal.

Advancing Dynamic Data Structures

The second theme of this proposal is advancing the nascent role of dynamic data structures in continuous optimization. For over a century, the traditional focus of optimization research has been on improving the rate of convergence of local-search methods.

The last ~3 years have witnessed the dramatic potential of dynamic data structures in reducing the cost per iteration of (Newton-type) optimization algorithms. This indicates that the bottleneck to accelerating literally thousands of algorithms is the efficient maintenance of dynamic matrix functions.
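
A classical example of such maintenance (a generic textbook illustration, not the proposal's specific machinery) is keeping a matrix inverse up to date under rank-one changes via the Sherman-Morrison formula: each update costs O(d^2) instead of the O(d^3) of re-inversion, which is exactly the kind of per-iteration saving Newton-type solvers can exploit.

    import numpy as np

    def sherman_morrison_update(A_inv, u, v):
        """Return (A + u v^T)^{-1} given A_inv = A^{-1}, in O(d^2).
        Valid whenever 1 + v^T A^{-1} u != 0."""
        Au = A_inv @ u            # A^{-1} u
        vA = v @ A_inv            # v^T A^{-1}
        denom = 1.0 + v @ Au
        return A_inv - np.outer(Au, vA) / denom

    # Demo: maintain H^{-1} under a rank-one change, as a Newton-type
    # iteration might after a low-rank Hessian update.
    rng = np.random.default_rng(0)
    d = 5
    G = rng.standard_normal((d, d))
    H = G @ G.T + d * np.eye(d)   # well-conditioned SPD matrix
    H_inv = np.linalg.inv(H)
    u, v = rng.standard_normal(d), rng.standard_normal(d)
    fast = sherman_morrison_update(H_inv, u, v)
    slow = np.linalg.inv(H + np.outer(u, v))
    assert np.allclose(fast, slow)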

This new framework is only in its early stages but has already led to breakthroughs on decade-old problems in computer science. This proposal will substantially develop this interdisciplinary theory and identify the mathematical machinery that would lead to ultra-fast first- and second-order convex optimization.

Non-Convex Setting

In the non-convex setting, this proposal demonstrates the game-changing potential of dynamic data structures and algebraic sketching techniques in achieving scalable training and inference of deep neural networks, a major challenge of modern AI.

Our program is based on a novel connection between kernel methods and compressed-sensing techniques for approximate matrix multiplication.
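
To give a flavor of sketching for approximate matrix multiplication (a minimal generic sketch assuming a plain Gaussian sketch, not the kernel/compressed-sensing machinery the proposal alludes to): compress the shared inner dimension with a random matrix S satisfying E[S^T S] = I, so that A B ≈ (A S^T)(S B).

    import numpy as np

    def sketched_matmul(A, B, k, rng):
        """Approximate A @ B by sketching the inner dimension n down
        to k << n: A @ B ~= (A @ S.T) @ (S @ B)."""
        n = A.shape[1]
        S = rng.standard_normal((k, n)) / np.sqrt(k)  # E[S.T @ S] = I
        return (A @ S.T) @ (S @ B)

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 2000))
    B = rng.standard_normal((2000, 50))
    exact = A @ B
    approx = sketched_matmul(A, B, k=400, rng=rng)
    rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
    print(f"relative Frobenius error: {rel_err:.3f}")  # shrinks as k grows

The approximation error decays roughly as 1/sqrt(k), so the sketch size trades accuracy against the cost of the multiplication.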

Financial details & Timeline

Financial details

Grant amount: €1,439,413
Total project budget: €1,439,413

Timeline

Start date: 1 July 2022
End date: 30 June 2027
Grant year: 2022

Partners & Locations

Project partners

  • THE HEBREW UNIVERSITY OF JERUSALEM (lead partner)

Country

Israel

Similar projects within the European Research Council

  • Dynamics-Aware Theory of Deep Learning (ERC Starting Grant, €1,498,410)
    This project aims to create a robust theoretical framework for deep learning, enhancing understanding and practical tools to improve model performance and reduce complexity in various applications.

  • New Frontiers in Projection-Free Methods for Continuous Optimization (ERC Consolidator Grant, €1,785,000)
    This project aims to advance projection-free optimization methods to match the efficiency of projection-based techniques, enhancing continuous optimization for complex, structured constraints in data science and AI.

  • Dynamic Selection and Configuration of Black-box Optimization Algorithms (ERC Consolidator Grant, €1,999,975)
    The dynaBBO project aims to enhance black-box optimization by dynamically selecting and switching algorithms based on problem instances and stages, validated in bio-medicine and computational mechanics.

  • Towards a New Theory of Optimal Dynamic Graph Algorithms (ERC Starting Grant, €1,400,000)
    This project aims to systematically study and establish intrinsically optimal dynamic graph algorithms with constant update times, potentially revolutionizing the field and enhancing real-world applications.

  • Systematic and computer-aided performance certification for numerical optimization (ERC Starting Grant, €1,497,650)
    The project aims to enhance theoretical foundations of numerical optimization to bridge the gap between theory and practice, developing robust algorithms and certification tools for complex applications.