Functional time series with applications to neural networks
Sunday, Aug 4: 4:05 PM - 4:25 PM
Topic-Contributed Paper Session
Oregon Convention Center
Recent advances have generalized neural networks to learn operators, also referred to as neural operators. Neural operators map between possibly infinite-dimensional function spaces and can be formulated as a composition of linear integral operators and nonlinear activation functions. This talk studies the distribution of such networks with random Gaussian weights and biases, in which the hidden layer widths are proportional to a large constant. The tools used are based on a functional version of the Malliavin-Stein method.
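As a rough illustration of the formulation described above, the following is a minimal sketch of one hidden layer of a neural operator with Gaussian random weights: a discretized linear integral operator plus a pointwise linear term, passed through a nonlinear activation. All names, scalings, and architectural choices here are illustrative assumptions, not the talk's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def neural_operator_layer(v, grid):
    """One hidden layer of a toy random neural operator:
    v -> tanh(W * v + (K v) + b), where K is a discretized integral
    operator with a Gaussian random kernel and W, b are Gaussian
    weight and bias. This is an illustrative sketch only."""
    n = len(grid)
    dx = grid[1] - grid[0]
    # Gaussian random kernel k(x, y); on a grid, the integral operator
    # becomes a matrix-vector product scaled by the grid spacing.
    kernel = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))
    integral_term = kernel @ v * dx   # approximates the integral of k(x, y) v(y) dy
    W = rng.normal(0.0, 1.0)          # pointwise linear weight
    b = rng.normal(0.0, 1.0)          # bias
    return np.tanh(W * v + integral_term + b)

# Apply the layer to an input function sampled on a grid.
grid = np.linspace(0.0, 1.0, 128)
v0 = np.sin(2 * np.pi * grid)
v1 = neural_operator_layer(v0, grid)
print(v1.shape)  # (128,)
```

Stacking such layers, with hidden widths (here, the number of kernel channels, omitted for brevity) taken large, yields the wide-network regime whose limiting Gaussian behavior the Malliavin-Stein approach quantifies.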