Title: Distributed Memory Subtensor Parallel Tensor Train Cross
Presenter: Dr. Dan Hayes
Abstract: When working with large data sets, the curse of dimensionality places tight restrictions on storage as well as on practical computation. In recent years, significant progress has been made on these issues through various forms of tensor decomposition. In this work, we propose an algorithm for a subtensor-parallel Tensor Train Cross (TT-Cross) decomposition suited to a distributed-memory setting. We will show that this algorithm maintains a low storage requirement as well as a small communication cost throughout its stages. This facilitates the effective use of large computing systems, in which local tensor information is used to construct a global TT-Cross approximation. Numerical tests will be presented showing scaling results and storage requirements for a synthetic benchmark as well as for a tensor that arises in PDE simulations.
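
As background, the Tensor Train (TT) format stores a d-dimensional tensor as a chain of small three-way cores, so storage scales with the mode sizes and TT ranks rather than exponentially in d. The sketch below is a minimal NumPy illustration of this format using the classical TT-SVD on a tiny synthetic tensor; it is not the presenter's distributed TT-Cross algorithm (which avoids forming the full tensor), and the function name tt_svd, the tolerance eps, and the example tensor are illustrative assumptions.

    import numpy as np

    def tt_svd(tensor, eps=1e-10):
        # Classical TT-SVD: sweep over the modes, reshaping and applying a
        # truncated SVD at each step to peel off one three-way TT core.
        shape = tensor.shape
        d = len(shape)
        cores = []
        r_prev = 1
        C = tensor
        for k in range(d - 1):
            C = C.reshape(r_prev * shape[k], -1)
            U, S, Vt = np.linalg.svd(C, full_matrices=False)
            r = max(1, int(np.sum(S > eps * S[0])))  # truncated TT rank
            cores.append(U[:, :r].reshape(r_prev, shape[k], r))
            C = S[:r, None] * Vt[:r]
            r_prev = r
        cores.append(C.reshape(r_prev, shape[-1], 1))
        return cores

    # A separable (TT-rank-1) 8x8x8x8 tensor: 4096 entries in full format,
    # but only 4 * (1*8*1) = 32 entries across the four TT cores.
    x = np.linspace(0.0, 1.0, 8)
    T = np.einsum('i,j,k,l->ijkl', np.sin(x), np.cos(x), np.exp(x), x + 1.0)
    cores = tt_svd(T)
    print([c.shape for c in cores])            # [(1, 8, 1), (1, 8, 1), (1, 8, 1), (1, 8, 1)]
    print(sum(c.size for c in cores), T.size)  # 32 vs. 4096

Cross (sampling-based) methods such as TT-Cross target the same compressed format while reading only selected rows, columns, and fibers of the tensor, which is what makes a distributed-memory, subtensor-parallel construction attractive.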
