===firstname: Alex
===firstname3: Youssef
===affil6:
===lastname3: Marzouk
===email: goroda@mit.edu
===keyword_other2:
===lastname6:
===affil5:
===lastname4:
===lastname7:
===affil7:
===postal: Massachusetts Institute of Technology, 77 Massachusetts Avenue, Room 37-312, Cambridge, MA 02139
===ABSTRACT: The curse of dimensionality often arises in high-dimensional uncertainty quantification (UQ) problems. For example, solving a stochastic partial differential equation (PDE) with tens to hundreds of stochastic variables is typically prohibitively expensive. Recent work on factorizing the high-dimensional arrays arising in such applications has shown great potential for alleviating this curse of dimensionality when the problems exhibit low-rank structure. Such factorizations, termed tensor decompositions, often allow low-rank PDEs to be solved in time that scales linearly with dimension and polynomially with rank. They also enable fast post-processing of various quantities of interest through multilinear algebra methods that likewise scale linearly with dimension and polynomially with rank. In this talk we extend a tensor decomposition, the tensor-train, to the continuous case of multidimensional functions. This extension allows us to easily incorporate discretization adaptivity in our approximation by breaking away from the fixed grids typically associated with tensor decompositions. Furthermore, the continuous tensor decomposition, which we term the "function-train," provides a framework for post-processing UQ-related quantities that are discretized at different levels, thereby increasing the flexibility and applicability of these algorithms. We begin by describing a new cross-approximation algorithm for computing the CUR/skeleton decomposition of bivariate functions. We then extend this decomposition to the multidimensional case of the function-train decomposition.
Computing the decomposition relies on continuous analogues of matrix factorizations, such as QR and LU factorizations of matrix-valued functions. We continue by describing a continuous, or functional, alternating least squares optimization algorithm for fast multilinear algebraic operations such as multiplication of low-rank functions and application of low-rank operators. The computational cost of these operations scales linearly with dimension and polynomially with rank. Finally, we apply our methods to UQ-related problems surrounding the modeling of stochastic elliptic partial differential equations.
===affil3: Massachusetts Institute of Technology
===title: Function-train: a continuous analogue of the tensor-train decomposition
===affil2: Massachusetts Institute of Technology
===lastname2: Karaman
===firstname4:
===keyword1: Surrogate modeling/model reduction
===workshop: no
===lastname: Gorodetsky
===firstname5:
===keyword2: Uncertainty quantification/PDEs with random data
===otherauths:
===affil4:
===competition: no
===firstname7:
===firstname6:
===keyword_other1:
===lastname5:
===affilother:
===firstname2: Sertac