SciPost Submission Page
Simplified derivations for high-dimensional convex learning problems
by David G. Clark, Haim Sompolinsky
This Submission thread is now published.
Submission summary
Ontological classification
Academic field: Physics
Specialties:
- Condensed Matter Physics - Theory
- Statistical and Soft Matter Physics
Approach: Theoretical
Abstract
Statistical-physics calculations in machine learning and theoretical neuroscience often involve lengthy derivations that obscure physical interpretation. Here, we give concise, non-replica derivations of several key results and highlight their underlying similarities. In particular, using a cavity approach, we analyze three high-dimensional learning problems: perceptron classification of points, perceptron classification of manifolds, and kernel ridge regression. These problems share a common structure, a bipartite system of interacting feature and datum variables, which enables a unified analysis. Furthermore, for perceptron-capacity problems, we identify a symmetry that allows derivation of correct capacities through a naive method.
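For orientation, the classical benchmark that such capacity calculations reproduce is Gardner's result for a spherical perceptron classifying P = αN random points at margin κ; this is standard background, not a result specific to this submission. Writing Dt = dt e^{-t²/2}/√(2π) for the Gaussian measure, a minimal sketch of the formula is
\[
  \alpha_c(\kappa) = \left[ \int_{-\kappa}^{\infty} Dt \, (t+\kappa)^2 \right]^{-1},
  \qquad \alpha_c(0) = 2,
\]
so at zero margin a perceptron with N inputs stores up to P = 2N random points in the large-N limit.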
Author comments upon resubmission
We sincerely thank the two referees for their thoughtful suggestions, which we have implemented.
List of changes
As part of our comments to the referees, we have uploaded a detailed response PDF in which all changes are enumerated.