Paper Info

Title |
---|
Learning multidimensional Fourier series with tensor trains |

Abstract |
---|
How to learn a function from observations of inputs and noisy outputs is a fundamental problem in machine learning. Often, an approximation of the desired function is found by minimizing a risk functional over some function space. The space of candidate functions should contain good approximations of the true function, but it should also be such that minimizing the risk functional is computationally feasible. In this paper, finite multidimensional Fourier series are used as candidate functions. Their impressive approximation capabilities are illustrated by showing that Gaussian-kernel estimators can be approximated arbitrarily well over any compact set of bandwidths with a fixed number of Fourier coefficients. However, solving the associated risk minimization problem is computationally feasible only if the dimension d of the inputs is small, because the number of required Fourier coefficients grows exponentially with d. This problem is addressed by using the tensor train format to model the tensor of Fourier coefficients under a low-rank constraint. An algorithm for least-squares regression is derived, and the potential of this approach is illustrated in numerical experiments. The computational complexity of the algorithm grows only linearly with both the number of observations N and the input dimension d, making it feasible for large-scale problems as well. |
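The abstract's key point is that a d-dimensional Fourier coefficient tensor has n^d entries, but storing it in tensor-train (TT) format and contracting one core per dimension makes evaluation linear in d. The sketch below illustrates that idea only; it is not the paper's algorithm, and the frequencies, ranks, and core shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 5          # input dimension (illustrative)
n = 7          # Fourier coefficients per dimension, frequencies -3..3
r = 3          # uniform TT rank (assumption; the paper allows general ranks)
freqs = np.arange(-(n // 2), n // 2 + 1)

# TT cores G[k] of shape (r_left, n, r_right); boundary ranks are 1.
# Random cores stand in for coefficients a regression would actually learn.
ranks = [1] + [r] * (d - 1) + [1]
cores = [rng.standard_normal((ranks[k], n, ranks[k + 1])) + 0j
         for k in range(d)]

def evaluate(x):
    """Evaluate the TT-format Fourier series at x in [0,1)^d.

    Cost is O(d * n * r^2): one small contraction per dimension,
    instead of touching all n**d dense coefficients.
    """
    v = np.ones((1, 1), dtype=complex)
    for k in range(d):
        phi = np.exp(2j * np.pi * freqs * x[k])        # Fourier features
        v = v @ np.einsum("ans,n->as", cores[k], phi)  # contract k-th core
    return v[0, 0]

x = rng.random(d)
y = evaluate(x)
```

For these sizes the dense coefficient tensor would hold 7**5 = 16807 complex entries, while the TT cores hold only sum(ranks[k] * n * ranks[k + 1]) of them; this gap is what makes the linear-in-d complexity claimed in the abstract possible.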

Year | DOI | Venue |
---|---|---|
2014 | 10.1109/GlobalSIP.2014.7032146 | IEEE Global Conference on Signal and Information Processing (GlobalSIP) |

Keywords | Field | DocType |
---|---|---|
Fourier series, Gaussian processes, computational complexity, functions, learning (artificial intelligence), least squares approximations, minimisation, regression analysis, tensors, Fourier coefficient, Gaussian-kernel estimator, associated risk minimization problem, candidate function, finite multidimensional Fourier series, function space, impressive approximative capability, learning multidimensional Fourier series, least-squares regression, low-rank constraint, machine learning, noisy output, risk functional, tensor train format, true function, Kernels, Large-Scale Learning, Low-Rank Constraints, Risk Minimization, Tensor Train Format | Kernel (linear algebra), Discrete-time Fourier transform, Applied mathematics, Function space, Mathematical optimization, Fourier analysis, Tensor, Fourier transform, Fourier series, Mathematics, Computational complexity theory | Conference |

Citations | PageRank | References |
---|---|---|
1 | 0.34 | 9 |


Authors (4 rows)


Name | Order | Citations | PageRank |
---|---|---|---|
Sander Wahls | 1 | 58 | 17.32 |
Visa Koivunen | 2 | 1917 | 187.81 |
H. V. Poor | 3 | 25411 | 1951.66 |
Michel Verhaegen | 4 | 1074 | 140.85 |