Computational Hardness Of RepresentAtion Learning
CHORAL aims to bridge theory and practice in neural networks by quantifying learning costs through a statistical framework, enhancing representation learning for structured data and multi-layer networks.
Project details
Introduction
Rich internal representations of complex data are crucial to the predictive power of neural networks. Unfortunately, current statistical analyses are restricted to over-simplified networks whose representations (i.e., weight matrices) are either random or project the data into comparatively very high- or very low-dimensional spaces; in many applications the situation is very different. The modelling of realistic data is another open issue. There is an urgent need to reconcile theory and practice.
Framework Overview
Based on a synergy of the mathematical physics of spin glasses, matrix models from physics, and information theory and random matrix theory, CHORAL's statistical framework will delimit computational gaps in learning much more realistic models of neural networks from structured data.
Quantifying Discrepancies
These gaps will quantify the discrepancy between:
- The statistical cost of learning good representations, i.e., the minimal amount of training data required to reach satisfactory predictive performance.
- The cost of efficient learning, i.e., the amount of data needed when learning with tractable algorithms such as approximate message passing and noisy gradient descent.
Comparing these two costs quantifies when learning is computationally hard and when it is not; the toy sketch below illustrates the idea.
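To make the comparison concrete, the following sketch (an illustrative assumption, not one of CHORAL's actual models or algorithms) plants a known signal in synthetic linear data, runs a noisy gradient descent on it, and reports how the overlap between the estimate and the planted signal grows with the number of training samples. The dimensions, step sizes and the linear model itself are chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def planted_data(n, d, x_star, noise=0.5):
    """n noisy linear measurements y = A x* + noise of a planted signal x*."""
    A = rng.normal(size=(n, d))
    y = A @ x_star + noise * rng.normal(size=n)
    return A, y

def noisy_gd(A, y, steps=2000, lr=0.05, temp=1e-3):
    """Noisy (Langevin-like) gradient descent on the mean squared loss."""
    n, d = A.shape
    x = rng.normal(size=d)                        # random initialisation
    for _ in range(steps):
        grad = A.T @ (A @ x - y) / n              # gradient of 0.5 * ||A x - y||^2 / n
        x = x - lr * grad + np.sqrt(2 * lr * temp) * rng.normal(size=d)
    return x

d = 200
x_star = rng.choice([-1.0, 1.0], size=d)          # planted signal to be recovered
for n in (50, 200, 800):                          # increasing amounts of training data
    A, y = planted_data(n, d, x_star)
    x_hat = noisy_gd(A, y)
    overlap = abs(x_hat @ x_star) / (np.linalg.norm(x_hat) * np.sqrt(d))
    print(f"n = {n:4d}   overlap with planted signal = {overlap:.2f}")
```

In an experiment of this kind, the sample size at which a tractable algorithm such as noisy gradient descent starts to succeed can be compared with the information-theoretically minimal sample size; the gap between the two is the object CHORAL sets out to delimit.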
Focus Areas
To achieve this, CHORAL will first focus on dictionary learning, itself an essential task of representation learning, and then move on to multi-layer neural networks, which can be thought of as concatenated dictionary-learning problems (a toy instance is sketched below).
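As a point of reference, dictionary learning asks to recover an unknown dictionary D and sparse codes S from observations Y ≈ DS. The sketch below is a minimal, hypothetical alternating-minimisation recovery on synthetic data, not a description of CHORAL's methods; all dimensions, step sizes and regularisation constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy planted instance: Y ~ D* S* + noise, with an unknown dictionary D*
# (unit-norm columns, the "atoms") and sparse codes S*.
p, k, n, sparsity = 30, 50, 400, 3
D_star = rng.normal(size=(p, k))
D_star /= np.linalg.norm(D_star, axis=0)
S_star = np.zeros((k, n))
for j in range(n):
    support = rng.choice(k, size=sparsity, replace=False)
    S_star[support, j] = rng.normal(size=sparsity)
Y = D_star @ S_star + 0.01 * rng.normal(size=(p, n))

# Crude alternating minimisation: soft-thresholded gradient steps on the codes,
# least-squares updates of the dictionary with column renormalisation.
D = rng.normal(size=(p, k))
D /= np.linalg.norm(D, axis=0)
S = np.zeros((k, n))
lam, step = 0.05, 0.1
for _ in range(200):
    S = S - step * D.T @ (D @ S - Y)                          # gradient step on the codes
    S = np.sign(S) * np.maximum(np.abs(S) - step * lam, 0.0)  # soft-thresholding promotes sparsity
    D = Y @ np.linalg.pinv(S)                                 # least-squares dictionary update
    norms = np.linalg.norm(D, axis=0) + 1e-12
    D /= norms                                                # keep atoms unit norm...
    S *= norms[:, None]                                       # ...and rescale codes to compensate

print("relative reconstruction error:", np.linalg.norm(Y - D @ S) / np.linalg.norm(Y))
print("mean |cosine| between each true atom and its best-matching learned atom:",
      np.abs(D_star.T @ D).max(axis=1).mean())
```

A multi-layer network can then be viewed, loosely, as several such problems stacked on top of each other, with the codes of one layer serving as the data of the next.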
Impact
CHORAL’s ambitious program, by defining benchmarks for algorithms used in virtually all fields of science and technology, will have a direct practical impact. Equally important will be its conceptual impact: the study of information processing systems has become a major source of inspiration for mathematics.
Financial details & Timeline
Financial details
Grant amount | € 1.280.750
Total project budget | € 1.280.750
Timeline
Start date | 1 October 2022
End date | 30 September 2027
Grant year | 2022
Partners & Locations
Project partners
- UNITED NATIONS EDUCATIONAL SCIENTIFIC AND CULTURAL ORGANIZATION (lead partner)
Country(ies)
Similar projects within the European Research Council
Project | Scheme | Amount | Year
---|---|---|---
Beyond two-point correlations: from higher-order data statistics to neural representations | ERC Starting... | € 1.499.999 | 2025
Reconciling Classical and Modern (Deep) Machine Learning for Real-World Applications | ERC Consolid... | € 1.999.375 | 2023
From reconstructions of neuronal circuits to anatomically realistic artificial neural networks | ERC Proof of... | € 150.000 | 2022
Dynamics-Aware Theory of Deep Learning | ERC Starting... | € 1.498.410 | 2022
AI-based Learning for Physical Simulation | ERC Starting... | € 1.315.000 | 2022
Beyond two-point correlations: from higher-order data statistics to neural representations
beyond2 aims to develop a theory on how deep neural networks learn from high-order correlations in non-Gaussian data, enhancing understanding and practical deployment in critical applications.
Reconciling Classical and Modern (Deep) Machine Learning for Real-World Applications
APHELEIA aims to create robust, interpretable, and efficient machine learning models that require less data by integrating classical methods with modern deep learning, fostering interdisciplinary collaboration.
From reconstructions of neuronal circuits to anatomically realistic artificial neural networks
This project aims to enhance artificial neural networks by extracting wiring principles from brain connectomics to improve efficiency and reduce training data needs for deep learning applications.
Dynamics-Aware Theory of Deep Learning
This project aims to create a robust theoretical framework for deep learning, enhancing understanding and practical tools to improve model performance and reduce complexity in various applications.
AI-based Learning for Physical Simulation
This project aims to enhance physical simulations by integrating machine learning with equation-based modeling for improved generalization and intelligibility, applicable across scientific disciplines and engineering.