TY - JOUR

T1 - ReLU deep neural networks and linear finite elements

AU - He, Juncai

AU - Li, Lin

AU - Xu, Jinchao

AU - Zheng, Chunyue

N1 - Funding Information:
Acknowledgments. This work is partially supported by the Beijing International Center for Mathematical Research, the Elite Program of Computational and Applied Mathematics for PhD Candidates of Peking University, NSFC Grant 91430215, and NSF Grants DMS-1522615 and DMS-1819157.
Publisher Copyright:
© 2020 Global Science Press. All rights reserved.

PY - 2020

Y1 - 2020

N2 - In this paper, we investigate the relationship between deep neural networks (DNNs) with the rectified linear unit (ReLU) as the activation function and continuous piecewise linear (CPWL) functions, especially CPWL functions arising from the simplicial linear finite element method (FEM). We first consider the special case of FEM. By exploring the DNN representation of its nodal basis functions, we present a ReLU DNN representation of CPWL functions in FEM. We theoretically establish that at least two hidden layers are needed in a ReLU DNN to represent any linear finite element function in Ω ⊆ Rd when d ≥ 2. Consequently, for d = 2, 3, which are often encountered in scientific and engineering computing, a minimum of two hidden layers is necessary and sufficient for any CPWL function to be represented by a ReLU DNN. We then give a detailed account of how a general CPWL function in Rd can be represented by a ReLU DNN with at most ⌈log2(d + 1)⌉ hidden layers, together with an estimate of the number of neurons needed in such a representation. Furthermore, using the relationship between DNNs and FEM, we theoretically argue that a special class of DNN models with low bit-width can still be expected to have adequate representation power in applications. Finally, as a proof of concept, we present numerical results for using ReLU DNNs to solve a two-point boundary value problem, demonstrating the potential of applying DNNs to the numerical solution of partial differential equations.

AB - In this paper, we investigate the relationship between deep neural networks (DNNs) with the rectified linear unit (ReLU) as the activation function and continuous piecewise linear (CPWL) functions, especially CPWL functions arising from the simplicial linear finite element method (FEM). We first consider the special case of FEM. By exploring the DNN representation of its nodal basis functions, we present a ReLU DNN representation of CPWL functions in FEM. We theoretically establish that at least two hidden layers are needed in a ReLU DNN to represent any linear finite element function in Ω ⊆ Rd when d ≥ 2. Consequently, for d = 2, 3, which are often encountered in scientific and engineering computing, a minimum of two hidden layers is necessary and sufficient for any CPWL function to be represented by a ReLU DNN. We then give a detailed account of how a general CPWL function in Rd can be represented by a ReLU DNN with at most ⌈log2(d + 1)⌉ hidden layers, together with an estimate of the number of neurons needed in such a representation. Furthermore, using the relationship between DNNs and FEM, we theoretically argue that a special class of DNN models with low bit-width can still be expected to have adequate representation power in applications. Finally, as a proof of concept, we present numerical results for using ReLU DNNs to solve a two-point boundary value problem, demonstrating the potential of applying DNNs to the numerical solution of partial differential equations.

UR - http://www.scopus.com/inward/record.url?scp=85088300639&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85088300639&partnerID=8YFLogxK

U2 - 10.4208/JCM.1901-M2018-0160

DO - 10.4208/JCM.1901-M2018-0160

M3 - Article

AN - SCOPUS:85088300639

VL - 38

SP - 502

EP - 527

JO - Journal of Computational Mathematics

JF - Journal of Computational Mathematics

SN - 0254-9409

IS - 3

ER -