Haishan Ye
2020 – today
- 2024
- [j12] Jun Shang, Haishan Ye, Xiangyu Chang: Accelerated Double-Sketching Subspace Newton. Eur. J. Oper. Res. 319(2): 484-493 (2024)
- [c14] Lesi Chen, Haishan Ye, Luo Luo: An Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization. AISTATS 2024: 1990-1998
- [c13] Jun Chen, Haishan Ye, Mengmeng Wang, Tianxin Huang, Guang Dai, Ivor W. Tsang, Yong Liu: Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold. ICLR 2024
- [c12] Hao Di, Haishan Ye, Xiangyu Chang, Guang Dai, Ivor W. Tsang: Double Stochasticity Gazes Faster: Snap-Shot Decentralized Stochastic Gradient Tracking Methods. ICML 2024
- [c11] Hao Di, Haishan Ye, Yueling Zhang, Xiangyu Chang, Guang Dai, Ivor W. Tsang: Double Variance Reduction: A Smoothing Trick for Composite Optimization Problems without First-Order Gradient. ICML 2024
- [c10] Yilong Wang, Haishan Ye, Guang Dai, Ivor W. Tsang: Can Gaussian Sketching Converge Faster on a Preconditioned Landscape? ICML 2024
- [i26] Yanjun Zhao, Sizhe Dang, Haishan Ye, Guang Dai, Yi Qian, Ivor W. Tsang: Second-Order Fine-Tuning without Pain for LLMs: A Hessian Informed Zeroth-Order Optimizer. CoRR abs/2402.15173 (2024)
- [i25] Qihao Zhou, Haishan Ye, Luo Luo: Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity. CoRR abs/2405.16126 (2024)
- [i24] Hao Di, Haishan Ye, Yueling Zhang, Xiangyu Chang, Guang Dai, Ivor W. Tsang: Double Variance Reduction: A Smoothing Trick for Composite Optimization Problems without First-Order Gradient. CoRR abs/2405.17761 (2024)
- 2023
- [j11] Haishan Ye: Intelligent Image Processing Technology for Badminton Robot under Machine Vision of Internet of Things. Int. J. Humanoid Robotics 20(6): 2250018:1-2250018:26 (2023)
- [j10] Haishan Ye, Luo Luo, Ziang Zhou, Tong Zhang: Multi-Consensus Decentralized Accelerated Gradient Descent. J. Mach. Learn. Res. 24: 306:1-306:50 (2023)
- [j9] Haishan Ye, Dachao Lin, Xiangyu Chang, Zhihua Zhang: Towards explicit superlinear convergence rate for SR1. Math. Program. 199(1): 1273-1303 (2023)
- [j8] Haishan Ye, Chaoyang He, Xiangyu Chang: Accelerated Distributed Approximate Newton Method. IEEE Trans. Neural Networks Learn. Syst. 34(11): 8642-8653 (2023)
- [c9] Dachao Lin, Yuze Han, Haishan Ye, Zhihua Zhang: Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis. NeurIPS 2023
- [i23] Dachao Lin, Yuze Han, Haishan Ye, Zhihua Zhang: Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis. CoRR abs/2304.07504 (2023)
- [i22] Haishan Ye: Mirror Natural Evolution Strategies. CoRR abs/2308.00469 (2023)
- [i21] Jun Chen, Haishan Ye, Mengmeng Wang, Tianxin Huang, Guang Dai, Ivor W. Tsang, Yong Liu: Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold. CoRR abs/2308.10547 (2023)
- [i20] Hao Di, Yi Yang, Haishan Ye, Xiangyu Chang: PPFL: A Personalized Federated Learning Framework for Heterogeneous Population. CoRR abs/2310.14337 (2023)
- 2022
- [j7] Dachao Lin, Haishan Ye, Zhihua Zhang: Explicit Convergence Rates of Greedy and Random Quasi-Newton Methods. J. Mach. Learn. Res. 23: 162:1-162:40 (2022)
- [c8] Rui Pan, Haishan Ye, Tong Zhang: Eigencurve: Optimal Learning Rate Schedule for SGD on Quadratic Objectives with Skewed Hessian Spectrums. ICLR 2022
- [i19] Luo Luo, Haishan Ye: Decentralized Stochastic Variance Reduced Extragradient Method. CoRR abs/2202.00509 (2022)
- [i18] Luo Luo, Haishan Ye: An Optimal Stochastic Algorithm for Decentralized Nonconvex Finite-sum Optimization. CoRR abs/2210.13931 (2022)
- [i17] Lesi Chen, Haishan Ye, Luo Luo: A Simple and Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization. CoRR abs/2212.02387 (2022)
- 2021
- [j6] Haishan Ye, Luo Luo, Zhihua Zhang: Approximate Newton Methods. J. Mach. Learn. Res. 22: 66:1-66:41 (2021)
- [j5] Haishan Ye, Tong Zhang: DeEPCA: Decentralized Exact PCA with Linear Convergence Rate. J. Mach. Learn. Res. 22: 238:1-238:27 (2021)
- [j4] Haishan Ye, Luo Luo, Zhihua Zhang: Accelerated Proximal Subsampled Newton Method. IEEE Trans. Neural Networks Learn. Syst. 32(10): 4374-4388 (2021)
- [c7] Luo Luo, Cheng Chen, Guangzeng Xie, Haishan Ye: Revisiting Co-Occurring Directions: Sharper Analysis and Efficient Algorithm for Sparse Matrices. AAAI 2021: 8793-8800
- [c6] Dachao Lin, Haishan Ye, Zhihua Zhang: Greedy and Random Quasi-Newton Methods with Faster Explicit Superlinear Convergence. NeurIPS 2021: 6646-6657
- [i16] Haishan Ye, Tong Zhang: DeEPCA: Decentralized Exact PCA with Linear Convergence Rate. CoRR abs/2102.03990 (2021)
- [i15] Haishan Ye, Dachao Lin, Zhihua Zhang: Greedy and Random Broyden's Methods with Explicit Superlinear Convergence Rates in Nonlinear Equations. CoRR abs/2110.08572 (2021)
- [i14] Rui Pan, Haishan Ye, Tong Zhang: Eigencurve: Optimal Learning Rate Schedule for SGD on Quadratic Objectives with Skewed Hessian Spectrums. CoRR abs/2110.14109 (2021)
- 2020
- [j3] Haishan Ye, Luo Luo, Zhihua Zhang: Nesterov's Acceleration for Approximate Newton. J. Mach. Learn. Res. 21: 142:1-142:37 (2020)
- [c5] Chaoyang He, Haishan Ye, Li Shen, Tong Zhang: MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation. CVPR 2020: 11990-11999
- [c4] Luo Luo, Haishan Ye, Zhichao Huang, Tong Zhang: Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems. NeurIPS 2020
- [c3] Haishan Ye, Ziang Zhou, Luo Luo, Tong Zhang: Decentralized Accelerated Proximal Gradient Descent. NeurIPS 2020
- [i13] Luo Luo, Haishan Ye, Tong Zhang: Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems. CoRR abs/2001.03724 (2020)
- [i12] Chaoyang He, Haishan Ye, Li Shen, Tong Zhang: MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation. CoRR abs/2003.12238 (2020)
- [i11] Haishan Ye, Luo Luo, Ziang Zhou, Tong Zhang: Multi-consensus Decentralized Accelerated Gradient Descent. CoRR abs/2005.00797 (2020)
- [i10] Luo Luo, Cheng Chen, Guangzeng Xie, Haishan Ye: Revisiting Co-Occurring Directions: Sharper Analysis and Efficient Algorithm for Sparse Matrices. CoRR abs/2009.02553 (2020)
- [i9] Haishan Ye, Wei Xiong, Tong Zhang: PMGT-VR: A decentralized proximal-gradient algorithmic framework with variance reduction. CoRR abs/2012.15010 (2020)
2010 – 2019
- 2019
- [j2] Haishan Ye, Guangzeng Xie, Luo Luo, Zhihua Zhang: Fast stochastic second-order method logarithmic in condition number. Pattern Recognit. 88: 629-642 (2019)
- [i8] Haishan Ye, Tong Zhang: Mirror Natural Evolution Strategies. CoRR abs/1910.11490 (2019)
- [i7] Haishan Ye, Shusen Wang, Zhihua Zhang, Tong Zhang: Fast Generalized Matrix Regression with Applications in Machine Learning. CoRR abs/1912.12008 (2019)
- 2018
- [i6] Haishan Ye, Zhichao Huang, Cong Fang, Chris Junchi Li, Tong Zhang: Hessian-Aware Zeroth-Order Optimization for Black-Box Adversarial Attack. CoRR abs/1812.11377 (2018)
- 2017
- [j1] Haishan Ye, Yujun Li, Cheng Chen, Zhihua Zhang: Fast Fisher discriminant analysis with randomized algorithms. Pattern Recognit. 72: 82-92 (2017)
- [c2] Haishan Ye, Luo Luo, Zhihua Zhang: Approximate Newton Methods and Their Local Convergence. ICML 2017: 3931-3939
- [i5] Haishan Ye, Luo Luo, Zhihua Zhang: A Unifying Framework for Convergence Analysis of Approximate Newton Methods. CoRR abs/1702.08124 (2017)
- [i4] Haishan Ye, Zhihua Zhang: Nesterov's Acceleration For Approximate Newton. CoRR abs/1710.08496 (2017)
- 2016
- [c1] Yujun Li, Kaichun Mo, Haishan Ye: Accelerating Random Kaczmarz Algorithm Based on Clustering Information. AAAI 2016: 1823-1829
- [i3] Haishan Ye, Luo Luo, Zhihua Zhang: Revisiting Sub-sampled Newton Methods. CoRR abs/1608.02875 (2016)
- [i2] Haishan Ye, Qiaoming Ye, Zhihua Zhang: Tighter bound of Sketched Generalized Matrix Approximation. CoRR abs/1609.02258 (2016)
- 2015
- [i1] Yujun Li, Kaichun Mo, Haishan Ye: Accelerating Random Kaczmarz Algorithm Based on Clustering Information. CoRR abs/1511.05362 (2015)
last updated on 2024-10-07 02:35 CEST by the dblp team
all metadata released as open data under CC0 1.0 license