# 2021 Geometric Deep Learning - Grids, Groups, Graphs, Geodesics, and Gauges

## Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

Metadata

CiteKey:: bronsteinGeometricDeepLearning2021
Type:: journalArticle
Author:: Michael Bronstein, Joan Bruna, Taco Cohen, Petar Veličković
Journal:: arXiv:2104.13478 [cs, stat]
Year:: 2021
Format:: PDF

Abstract

The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods. Indeed, many high-dimensional learning tasks previously thought to be beyond reach – such as computer vision, playing Go, or protein folding – are in fact feasible with appropriate computational scale. Remarkably, the essence of deep learning is built from two simple algorithmic principles: first, the notion of representation or feature learning, whereby adapted, often hierarchical, features capture the appropriate notion of regularity for each task, and second, learning by local gradient-descent type methods, typically implemented as backpropagation. While learning generic functions in high dimensions is a cursed estimation problem, most tasks of interest are not generic, and come with essential pre-defined regularities arising from the underlying low-dimensionality and structure of the physical world. This text is concerned with exposing these regularities through unified geometric principles that can be applied throughout a wide spectrum of applications. Such a ‘geometric unification’ endeavour, in the spirit of Felix Klein’s Erlangen Program, serves a dual purpose: on one hand, it provides a common mathematical framework to study the most successful neural network architectures, such as CNNs, RNNs, GNNs, and Transformers. On the other hand, it gives a constructive procedure to incorporate prior physical knowledge into neural architectures and provides a principled way to build future architectures yet to be invented.
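A minimal sketch of the symmetry principle the abstract refers to: convolution (the building block of CNNs) is *equivariant* to translation, meaning shifting the input and then convolving gives the same result as convolving and then shifting. The `circ_conv` helper below is a hypothetical illustration (not from the paper), using a periodic 1-D convolution so that shifts are exact cyclic rotations.

```python
import numpy as np

def circ_conv(x, k):
    """Circular (periodic) convolution of signal x with filter k."""
    n = len(x)
    return np.array([sum(x[(i - j) % n] * k[j] for j in range(len(k)))
                     for i in range(n)])

x = np.array([1.0, 2.0, 0.0, -1.0, 3.0])
k = np.array([0.5, 0.25, 0.25])

# Translation equivariance: shift-then-convolve == convolve-then-shift.
lhs = circ_conv(np.roll(x, 1), k)
rhs = np.roll(circ_conv(x, k), 1)
assert np.allclose(lhs, rhs)
```

In the book's terms, the translation group acts on the grid domain, and convolution commutes with that group action; the same recipe (choose a symmetry group, build layers that commute with it) yields GNNs for permutations and equivariant networks for rotations.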

Files and Links

Tags and Collections

Keywords:: 📥, Geometric Deep Learning, Graphs, Grids
Collections:: Geometric DL

## Annotations

### Imported: 2022-08-29 2:04 pm

- ["] various geometries known at the time could be defined by an appropriate choice of symmetry transformations, formalized using the language of group theory Page 5
- ["] Euclidean group is a subgroup of the affine group, which in turn is a subgroup of the group of projective transformations Page 5
- ["] The impact of the Erlangen Programme on geometry was very profound. Furthermore, it spilled to other fields, especially physics Page 5
- ["] There is a veritable zoo of neural network architectures for various kinds of data, but few unifying principles. Page 6
- ["] apply the Erlangen Programme mindset to the domain of deep learning Page 6
- ["] propose to derive different inductive biases and network architectures implementing them from first principles of symmetry and invariance Page 6
- ["] “The knowledge of certain principles easily compensates the lack of knowledge of certain facts.” (Helvétius, 1759) Page 6