Abstract
In this paper, we discuss a splitting method for group Lasso. Assuming that the sequence of step lengths is bounded below and above by positive constants independent of the given problem data, we prove Q-linear convergence of the distance from the iterates to the solution set. Moreover, we compare this result with the recently analyzed convergence of the proximal gradient method.
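To make the setting concrete, the sketch below shows a generic forward-backward splitting (proximal gradient) iteration for the group-Lasso objective min_x 0.5||Ax - b||^2 + lam * sum_g ||x_g||_2, with a constant step length. This is an illustrative baseline only, not the paper's exact splitting method; all function names and parameters here are assumptions introduced for the example.

```python
import numpy as np

def prox_group_lasso(v, lam, groups):
    """Block soft-thresholding: the proximal map of lam * sum_g ||v_g||_2.

    Each group of coordinates is shrunk toward zero by lam in Euclidean
    norm, and set exactly to zero when its norm is at most lam.
    """
    out = np.zeros_like(v)
    for g in groups:
        ng = np.linalg.norm(v[g])
        if ng > lam:
            out[g] = (1.0 - lam / ng) * v[g]
    return out

def proximal_gradient_group_lasso(A, b, lam, groups, step, n_iter=500):
    """Forward-backward splitting for 0.5||Ax-b||^2 + lam*sum_g||x_g||_2.

    `step` is a constant step length; the linear-convergence analysis in
    the paper assumes the step lengths stay within fixed positive bounds.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # forward (gradient) step
        x = prox_group_lasso(x - step * grad,     # backward (proximal) step
                             step * lam, groups)
    return x
```

With A = I the minimizer is exactly `prox_group_lasso(b, lam, groups)`, which gives a quick sanity check: groups whose norm falls below the threshold are zeroed as a block, which is the structured sparsity that distinguishes group Lasso from the ordinary Lasso.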
Additional information
This research was supported by the National Natural Science Foundation of China (No. 61179033) and by the Collaborative Innovation Center on Beijing Society-Building and Social Governance.
Cite this article
Dong, YD., Zhang, HB. & Gao, H. On Globally Q-Linear Convergence of a Splitting Method for Group Lasso. J. Oper. Res. Soc. China 6, 445–454 (2018). https://doi.org/10.1007/s40305-017-0176-0