Necessary Notations of the Proposed Method

  • Bin Shi
  • S. S. Iyengar
Chapter

Abstract

We define necessary notations and review important definitions that will be used later in our analysis. Let \(C^{2}(\mathbb {R}^{n})\) be the vector space of real-valued, twice continuously differentiable functions on \(\mathbb {R}^{n}\). Let ∇ be the gradient operator and ∇² the Hessian operator. Let ∥⋅∥₂ be the Euclidean norm in \(\mathbb {R}^{n}\), and let μ be the Lebesgue measure in \(\mathbb {R}^{n}\).
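As a concrete illustration of these operators, the sketch below approximates ∇f and ∇²f by central finite differences for a simple C² function and evaluates the Euclidean norm ∥⋅∥₂. The example function f, the test point, and the step sizes are hypothetical choices for this sketch, not part of the text; NumPy is assumed to be available.

```python
import numpy as np

def grad(f, x, h=1e-5):
    """Central-difference approximation of the gradient ∇f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian ∇²f at x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # Second-order mixed difference for ∂²f/∂x_i∂x_j.
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

# A hypothetical C² function: f(x) = x₁² + 3x₂².
f = lambda x: x[0] ** 2 + 3.0 * x[1] ** 2
x0 = np.array([1.0, -2.0])

g = grad(f, x0)            # exact gradient is (2x₁, 6x₂) = (2, -12)
H = hessian(f, x0)         # exact Hessian is diag(2, 6)
norm = np.linalg.norm(x0)  # Euclidean norm ∥x0∥₂ = √5
```

For a quadratic function the central differences agree with the exact gradient and Hessian up to floating-point rounding, which makes this a convenient sanity check when experimenting with the notation.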

Keywords

Lipschitz continuity · Local minimizer · Saddle point · Gradient map · Jacobian · Hessian matrix · Multivariate analysis · Lebesgue measure

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Bin Shi (1)
  • S. S. Iyengar (2)
  1. University of California, Berkeley, USA
  2. Florida International University, Miami, USA
