### Abstract

This study deals with neural networks in the sense of geometric transformations acting on the coordinate representation of the underlying data manifold from which the data is sampled. It forms part of an attempt to construct a formalized general theory of neural networks in the setting of Riemannian geometry. From this perspective, the following theoretical results are developed and proven for feedforward networks. First, it is shown that residual neural networks are finite difference approximations to dynamical systems of first-order differential equations, as opposed to ordinary networks, which are static. This implies that the network is learning systems of differential equations governing the coordinate transformations that represent the data. Second, it is shown that a closed-form solution of the metric tensor on the underlying data manifold can be found by backpropagating the coordinate representations learned by the neural network itself. This is formulated in a formal abstract sense as a sequence of Lie group actions on the metric fibre space in the principal and associated bundles on the data manifold. Toy experiments were run to confirm parts of the proposed theory, as well as to provide intuitions as to how neural networks operate on data.
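The first result, that residual blocks act as forward-Euler discretizations of a first-order dynamical system, can be sketched numerically. The tanh vector field and all names below are illustrative assumptions, not the paper's architecture: a residual block computes x + h·f(x), which is exactly one Euler step of dx/dt = f(x).

```python
import numpy as np

def f(x, W, b):
    # Residual-branch vector field; a single tanh layer is a
    # hypothetical choice of f, not the paper's specific network.
    return np.tanh(W @ x + b)

def residual_forward(x, W, b, n_layers, h):
    # Stack of residual blocks x <- x + h * f(x): each block is one
    # forward-Euler step of the ODE dx/dt = f(x) with step size h.
    for _ in range(n_layers):
        x = x + h * f(x, W, b)
    return x

rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(3, 3))
b = np.zeros(3)
x0 = rng.normal(size=3)

# Same total integration "time" n_layers * h = 1.0; refining the step
# (deeper network, smaller h) converges to the flow of the underlying
# differential equation, illustrating the abstract's first claim.
coarse = residual_forward(x0, W, b, n_layers=10, h=0.1)
fine = residual_forward(x0, W, b, n_layers=1000, h=0.001)
print(np.linalg.norm(coarse - fine))
```

In this reading, increasing depth while shrinking the per-block update does not change what the network computes so much as how finely it discretizes the same continuous coordinate flow.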
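The second result concerns recovering the metric tensor on the data manifold from the learned coordinate maps. One standard way to make such a claim concrete (an assumption here, not a formula quoted from the paper) is to pull back a flat metric on the final-layer chart through the composed coordinate map \(\varphi\):

```latex
% Pullback of the flat output metric \delta_{ab} through the learned
% coordinate map \varphi: the induced metric on the input chart is
g_{\mu\nu}(x) \,=\, \frac{\partial \varphi^{a}}{\partial x^{\mu}}\,
                    \delta_{ab}\,
                    \frac{\partial \varphi^{b}}{\partial x^{\nu}} .
```

Computing the Jacobian factors \(\partial\varphi^{a}/\partial x^{\mu}\) layer by layer is itself a backward pass through the network, which matches the abstract's description of obtaining the metric by backpropagating the coordinate representations.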

| Original language | English (US) |
|---|---|
| Pages (from-to) | 2808-2817 |
| Number of pages | 10 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2017-December |
| State | Published - Jan 1 2017 |
| Event | 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States. Duration: Dec 4 2017 → Dec 9 2017 |

### All Science Journal Classification (ASJC) codes

- Computer Networks and Communications
- Information Systems
- Signal Processing

### Cite this

Hauser, M., & Ray, A. (2017). Principles of Riemannian geometry in neural networks. *Advances in Neural Information Processing Systems*, *2017-December*, 2808-2817.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Principles of Riemannian geometry in neural networks

AU - Hauser, Michael

AU - Ray, Asok

PY - 2017/1/1

Y1 - 2017/1/1


UR - http://www.scopus.com/inward/record.url?scp=85046577806&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85046577806&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85046577806

VL - 2017-December

SP - 2808

EP - 2817

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -