Yin Tat Lee
2020 – today
- 2024
- [j12] Manru Zong, Yin Tat Lee, Man-Chung Yue: Short-step methods are not strongly polynomial-time. Math. Program. 207(1): 733-746 (2024)
- [c85] Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin: Differentially Private Synthetic Data via Foundation Model APIs 2: Text. ICML 2024
- [c84] Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang: Convex Minimization with Integer Minima in Õ(n^4) Time. SODA 2024: 3659-3684
- [c83] Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye: Improving the Bit Complexity of Communication for Distributed Convex Optimization. STOC 2024: 1130-1140
- [i93] Chulin Xie, Zinan Lin, Arturs Backurs, Sivakanth Gopi, Da Yu, Huseyin A. Inan, Harsha Nori, Haotian Jiang, Huishuai Zhang, Yin Tat Lee, Bo Li, Sergey Yekhanin: Differentially Private Synthetic Data via Foundation Model APIs 2: Text. CoRR abs/2403.01749 (2024)
- [i92] Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David P. Woodruff, Guanghao Ye: Improving the Bit Complexity of Communication for Distributed Convex Optimization. CoRR abs/2403.19146 (2024)
- [i91] Marah I Abdin, Sam Ade Jacobs, Ammar Ahmad Awan, Jyoti Aneja, Ahmed Awadallah, Hany Awadalla, Nguyen Bach, Amit Bahree, Arash Bakhtiari, Harkirat S. Behl, Alon Benhaim, Misha Bilenko, Johan Bjorck, Sébastien Bubeck, Martin Cai, Caio César Teodoro Mendes, Weizhu Chen, Vishrav Chaudhary, Parul Chopra, Allie Del Giorno, Gustavo de Rosa, Matthew Dixon, Ronen Eldan, Dan Iter, Amit Garg, Abhishek Goswami, Suriya Gunasekar, Emman Haider, Junheng Hao, Russell J. Hewett, Jamie Huynh, Mojan Javaheripi, Xin Jin, Piero Kauffmann, Nikos Karampatziakis, Dongwoo Kim, Mahoud Khademi, Lev Kurilenko, James R. Lee, Yin Tat Lee, Yuanzhi Li, Chen Liang, Weishung Liu, Eric Lin, Zeqi Lin, Piyush Madan, Arindam Mitra, Hardik Modi, Anh Nguyen, Brandon Norick, Barun Patra, Daniel Perez-Becker, Thomas Portet, Reid Pryzant, Heyang Qin, Marko Radmilac, Corby Rosset, Sambudha Roy, Olatunji Ruwase, Olli Saarikivi, Amin Saied, Adil Salim, Michael Santacroce, Shital Shah, Ning Shang, Hiteshi Sharma, Xia Song, Masahiro Tanaka, Xin Wang, Rachel Ward, Guanhua Wang, Philipp Witte, Michael Wyatt, Can Xu, Jiahang Xu, Sonali Yadav, Fan Yang, Ziyi Yang, Donghan Yu, Chengruidong Zhang, Cyril Zhang, Jianwen Zhang, Li Lyna Zhang, Yi Zhang, Yue Zhang, Yunan Zhang, Xiren Zhou: Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone. CoRR abs/2404.14219 (2024)
- 2023
- [c82] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. COLT 2023: 2399-2439
- [c81] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. COLT 2023: 4504-4569
- [c80] Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng: Automatic Prompt Optimization with "Gradient Descent" and Beam Search. EMNLP 2023: 7957-7968
- [c79] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. FOCS 2023: 2031-2058
- [c78] Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian: Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. ICLR 2023
- [c77] Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang: Learning threshold neurons via edge of stability. NeurIPS 2023
- [c76] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Private Convex Optimization in General Norms. SODA 2023: 5068-5089
- [c75] Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang: Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. STOC 2023: 1904-1917
- [i90] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023)
- [i89] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Algorithmic Aspects of the Log-Laplace Transform and a Non-Euclidean Proximal Sampler. CoRR abs/2302.06085 (2023)
- [i88] Yangsibo Huang, Daogao Liu, Zexuan Zhong, Weijia Shi, Yin Tat Lee: kNN-Adapter: Efficient Domain Adaptation for Black-Box Language Models. CoRR abs/2302.10879 (2023)
- [i87] Sébastien Bubeck, Varun Chandrasekaran, Ronen Eldan, Johannes Gehrke, Eric Horvitz, Ece Kamar, Peter Lee, Yin Tat Lee, Yuanzhi Li, Scott M. Lundberg, Harsha Nori, Hamid Palangi, Marco Túlio Ribeiro, Yi Zhang: Sparks of Artificial General Intelligence: Early experiments with GPT-4. CoRR abs/2303.12712 (2023)
- [i86] Haotian Jiang, Yin Tat Lee, Zhao Song, Lichen Zhang: Convex Minimization with Integer Minima in Õ(n^4) Time. CoRR abs/2304.03426 (2023)
- [i85] Reid Pryzant, Dan Iter, Jerry Li, Yin Tat Lee, Chenguang Zhu, Michael Zeng: Automatic Prompt Optimization with "Gradient Descent" and Beam Search. CoRR abs/2305.03495 (2023)
- [i84] Yiran Wu, Feiran Jia, Shaokun Zhang, Hangyu Li, Erkang Zhu, Yue Wang, Yin Tat Lee, Richard Peng, Qingyun Wu, Chi Wang: An Empirical Study on Challenging Math Problem Solving with GPT-4. CoRR abs/2306.01337 (2023)
- [i83] Suriya Gunasekar, Yi Zhang, Jyoti Aneja, Caio César Teodoro Mendes, Allie Del Giorno, Sivakanth Gopi, Mojan Javaheripi, Piero Kauffmann, Gustavo de Rosa, Olli Saarikivi, Adil Salim, Shital Shah, Harkirat Singh Behl, Xin Wang, Sébastien Bubeck, Ronen Eldan, Adam Tauman Kalai, Yin Tat Lee, Yuanzhi Li: Textbooks Are All You Need. CoRR abs/2306.11644 (2023)
- [i82] Yuanzhi Li, Sébastien Bubeck, Ronen Eldan, Allie Del Giorno, Suriya Gunasekar, Yin Tat Lee: Textbooks Are All You Need II: phi-1.5 technical report. CoRR abs/2309.05463 (2023)
- [i81] Ruoqi Shen, Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Yuanzhi Li, Yi Zhang: Positional Description Matters for Transformers Arithmetic. CoRR abs/2311.14737 (2023)
- [i80] Harsha Nori, Yin Tat Lee, Sheng Zhang, Dean Carignan, Richard Edgar, Nicolò Fusi, Nicholas King, Jonathan Larson, Yuanzhi Li, Weishung Liu, Renqian Luo, Scott Mayer McKinney, Robert Osazuwa Ness, Hoifung Poon, Tao Qin, Naoto Usuyama, Chris White, Eric Horvitz: Can Generalist Foundation Models Outcompete Special-Purpose Tuning? Case Study in Medicine. CoRR abs/2311.16452 (2023)
- 2022
- [j11] Yin Tat Lee, Santosh S. Vempala: Geodesic Walks in Polytopes. SIAM J. Comput. 51(2): 17-400 (2022)
- [c74] Sivakanth Gopi, Yin Tat Lee, Daogao Liu: Private Convex Optimization via Exponential Mechanism. COLT 2022: 1948-1989
- [c73] Yin Tat Lee, Santosh S. Vempala: The Manifold Joys of Sampling (Invited Talk). ICALP 2022: 4:1-4:20
- [c72] Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang: Differentially Private Fine-tuning of Language Models. ICLR 2022
- [c71] Damek Davis, Dmitriy Drusvyatskiy, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions. NeurIPS 2022
- [c70] Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. NeurIPS 2022
- [c69] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. NeurIPS 2022
- [c68] Xuechen Li, Daogao Liu, Tatsunori B. Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta: When Does Differentially Private Learning Not Suffer in High Dimensions? NeurIPS 2022
- [c67] Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye: Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. SODA 2022: 124-153
- [c66] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. SODA 2022: 2723-2742
- [c65] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556
- [i79] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Sampling with Riemannian Hamiltonian Monte Carlo in a Constrained Space. CoRR abs/2202.01908 (2022)
- [i78] Sivakanth Gopi, Yin Tat Lee, Daogao Liu: Private Convex Optimization via Exponential Mechanism. CoRR abs/2203.00263 (2022)
- [i77] Sally Dong, Yu Gao, Gramoz Goranci, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Guanghao Ye: Nested Dissection Meets IPMs: Planar Min-Cost Flow in Nearly-Linear Time. CoRR abs/2205.01562 (2022)
- [i76] Xuechen Li, Daogao Liu, Tatsunori Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Yin Tat Lee, Abhradeep Guha Thakurta: When Does Differentially Private Learning Not Suffer in High Dimensions? CoRR abs/2207.00160 (2022)
- [i75] Sivakanth Gopi, Yin Tat Lee, Daogao Liu, Ruoqi Shen, Kevin Tian: Private Convex Optimization in General Norms. CoRR abs/2207.08347 (2022)
- [i74] Sally Dong, Haotian Jiang, Yin Tat Lee, Swati Padmanabhan, Guanghao Ye: Decomposable Non-Smooth Convex Optimization with Nearly-Linear Gradient Oracle Complexity. CoRR abs/2208.03811 (2022)
- [i73] Arun Jambulapati, Yin Tat Lee, Santosh S. Vempala: A Slightly Improved Bound for the KLS Constant. CoRR abs/2208.11644 (2022)
- [i72] Yunbum Kook, Yin Tat Lee, Ruoqi Shen, Santosh S. Vempala: Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators. CoRR abs/2210.07219 (2022)
- [i71] Sophie Huiberts, Yin Tat Lee, Xinzhi Zhang: Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method. CoRR abs/2211.11860 (2022)
- [i70] Jiyan He, Xuechen Li, Da Yu, Huishuai Zhang, Janardhan Kulkarni, Yin Tat Lee, Arturs Backurs, Nenghai Yu, Jiang Bian: Exploring the Limits of Differentially Private Deep Learning with Group-wise Clipping. CoRR abs/2212.01539 (2022)
- [i69] Kwangjun Ahn, Sébastien Bubeck, Sinho Chewi, Yin Tat Lee, Felipe Suarez, Yi Zhang: Learning threshold neurons via the "edge of stability". CoRR abs/2212.07469 (2022)
- 2021
- [j10] Michael B. Cohen, Yin Tat Lee, Zhao Song: Solving Linear Programs in the Current Matrix Multiplication Time. J. ACM 68(1): 3:1-3:39 (2021)
- [j9] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee: Kernel-based Methods for Bandit Convex Optimization. J. ACM 68(4): 25:1-25:35 (2021)
- [j8] Yin Tat Lee, Man-Chung Yue: Universal Barrier Is n-Self-Concordant. Math. Oper. Res. 46(3): 1129-1148 (2021)
- [j7] Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee: Metrical Task Systems on Trees via Mirror Descent and Unfair Gluing. SIAM J. Comput. 50(3): 909-923 (2021)
- [c64] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Structured Logconcave Sampling with a Restricted Gaussian Oracle. COLT 2021: 2993-3050
- [c63] Janardhan Kulkarni, Yin Tat Lee, Daogao Liu: Private Non-smooth ERM and SCO in Subquadratic Steps. NeurIPS 2021: 4053-4064
- [c62] Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz: Numerical Composition of Differential Privacy. NeurIPS 2021: 11631-11642
- [c61] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. NeurIPS 2021: 18812-18824
- [c60] Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat: Fast and Memory Efficient Differentially Private-SGD via JL Projections. NeurIPS 2021: 19680-19691
- [c59] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum cost flows, MDPs, and ℓ_1-regression in nearly linear time for dense instances. STOC 2021: 859-869
- [c58] He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Reducing isotropy and volume to KLS: an o*(n^3ψ^2) volume algorithm. STOC 2021: 961-974
- [c57] Sally Dong, Yin Tat Lee, Guanghao Ye: A nearly-linear time algorithm for linear programs with small treewidth: a multiscale representation of robust central path. STOC 2021: 1784-1797
- [i68] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum Cost Flows, MDPs, and ℓ_1-Regression in Nearly Linear Time for Dense Instances. CoRR abs/2101.05719 (2021)
- [i67] Zhiqi Bu, Sivakanth Gopi, Janardhan Kulkarni, Yin Tat Lee, Judy Hanwen Shen, Uthaipon Tantipongpipat: Fast and Memory Efficient Differentially Private-SGD via JL Projections. CoRR abs/2102.03013 (2021)
- [i66] Janardhan Kulkarni, Yin Tat Lee, Daogao Liu: Private Non-smooth Empirical Risk Minimization and Stochastic Convex Optimization in Subquadratic Steps. CoRR abs/2103.15352 (2021)
- [i65] Sivakanth Gopi, Yin Tat Lee, Lukas Wutschitz: Numerical Composition of Differential Privacy. CoRR abs/2106.02848 (2021)
- [i64] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Lower Bounds on Metropolized Sampling Methods for Well-Conditioned Distributions. CoRR abs/2106.05480 (2021)
- [i63] Yin Tat Lee, Santosh S. Vempala: Tutorial on the Robust Interior Point Method. CoRR abs/2108.04734 (2021)
- [i62] Da Yu, Saurabh Naik, Arturs Backurs, Sivakanth Gopi, Huseyin A. Inan, Gautam Kamath, Janardhan Kulkarni, Yin Tat Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang: Differentially Private Fine-tuning of Language Models. CoRR abs/2110.06500 (2021)
- [i61] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. CoRR abs/2110.15563 (2021)
- [i60] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster Maxflow via Improved Dynamic Spectral Vertex Sparsifiers. CoRR abs/2112.00722 (2021)
- 2020
- [j6] Yin Tat Lee, Marcin Pilipczuk, David P. Woodruff: Introduction to the Special Issue on SODA'18. ACM Trans. Algorithms 16(1): 1:1-1:2 (2020)
- [c56] Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford: Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47
- [c55] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. COLT 2020: 2565-2597
- [c54] Yin Tat Lee, Swati Padmanabhan: An $\widetilde{\mathcal{O}}(m/\varepsilon^{3.5})$-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. COLT 2020: 3069-3119
- [c53] Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song: A Faster Interior Point Method for Semidefinite Programming. FOCS 2020: 910-918
- [c52] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930
- [c51] Yin Tat Lee: Convex Optimization and Dynamic Data Structure (Invited Talk). FSTTCS 2020: 3:1-3:1
- [c50] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer: Network size and size of the weights in memorization with two-layers neural networks. NeurIPS 2020
- [c49] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. NeurIPS 2020
- [c48] Marek Eliás, Michael Kapralov, Janardhan Kulkarni, Yin Tat Lee: Differentially Private Release of Synthetic Graphs. SODA 2020: 560-578
- [c47] Sébastien Bubeck, Bo'az Klartag, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Chasing Nested Convex Bodies Nearly Optimally. SODA 2020: 1496-1508
- [c46] Sally Dong, Yin Tat Lee, Kent Quanrud: Computing Circle Packing Representations of Planar Graphs. SODA 2020: 2860-2875
- [c45] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788
- [c44] Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian: Positive semidefinite programming: mixed, parallel, and width-independent. STOC 2020: 789-802
- [c43] Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong: An improved cutting plane method for convex optimization, convex-concave games, and its applications. STOC 2020: 944-953
- [c42] Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Strong self-concordance and sampling. STOC 2020: 1212-1222
- [i59] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving Tall Dense Linear Programs in Nearly Linear Time. CoRR abs/2002.02304 (2020)
- [i58] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Logsmooth Gradient Concentration and Tighter Runtimes for Metropolized Hamiltonian Monte Carlo. CoRR abs/2002.04121 (2020)
- [i57] Arun Jambulapati, Yin Tat Lee, Jerry Li, Swati Padmanabhan, Kevin Tian: Positive Semidefinite Programming: Mixed, Parallel, and Width-Independent. CoRR abs/2002.04830 (2020)
- [i56] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. CoRR abs/2003.08078 (2020)
- [i55] Haotian Jiang, Yin Tat Lee, Zhao Song, Sam Chiu-wai Wong: An Improved Cutting Plane Method for Convex Optimization, Convex-Concave Games and its Applications. CoRR abs/2004.04250 (2020)
- [i54] Sébastien Bubeck, Ronen Eldan, Yin Tat Lee, Dan Mikulincer: Network size and weights size for memorization with two-layers neural networks. CoRR abs/2006.02855 (2020)
- [i53] Ruoqi Shen, Kevin Tian, Yin Tat Lee: Composite Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2006.05976 (2020)
- [i52] He Jia, Aditi Laddha, Yin Tat Lee, Santosh S. Vempala: Reducing Isotropy and Volume to KLS: An O(n^3ψ^2) Volume Algorithm. CoRR abs/2008.02146 (2020)
- [i51] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. CoRR abs/2009.01802 (2020)
- [i50] Haotian Jiang, Tarun Kathuria, Yin Tat Lee, Swati Padmanabhan, Zhao Song: A Faster Interior Point Method for Semidefinite Programming. CoRR abs/2009.10217 (2020)
- [i49] Yin Tat Lee, Ruoqi Shen, Kevin Tian: Structured Logconcave Sampling with a Restricted Gaussian Oracle. CoRR abs/2010.03106 (2020)
- [i48] Sally Dong, Yin Tat Lee, Guanghao Ye: A Nearly-Linear Time Algorithm for Linear Programs with Small Treewidth: A Multiscale Representation of Robust Central Path. CoRR abs/2011.05365 (2020)
2010 – 2019
- 2019
- [j5] Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié: Optimal Convergence Rates for Convex Distributed Optimization in Networks. J. Mach. Learn. Res. 20: 159:1-159:31 (2019)
- [c41] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near-optimal method for highly smooth convex optimization. COLT 2019: 492-507
- [c40] Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang: A near-optimal algorithm for approximating the John Ellipsoid. COLT 2019: 849-873
- [c39] Alexander V. Gasnikov, Pavel E. Dvurechensky, Eduard Gorbunov, Evgeniya A. Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives. COLT 2019: 1392-1393
- [c38] Yin Tat Lee, Zhao Song, Qiuyi Zhang: Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. COLT 2019: 2140-2157
- [c37] Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong: Faster Matroid Intersection. FOCS 2019: 1146-1168
- [c36] Sébastien Bubeck, Yin Tat Lee, Eric Price, Ilya P. Razenshteyn: Adversarial examples from computational constraints. ICML 2019: 831-840
- [c35] Ruoqi Shen, Yin Tat Lee: The Randomized Midpoint Method for Log-Concave Sampling. NeurIPS 2019: 2098-2109
- [c34] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Complexity of Highly Parallel Non-Smooth Convex Optimization. NeurIPS 2019: 13900-13909
- [c33] Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee: Metrical task systems on trees via mirror descent and unfair gluing. SODA 2019: 89-97
- [c32] C. J. Argue, Sébastien Bubeck, Michael B. Cohen, Anupam Gupta, Yin Tat Lee: A Nearly-Linear Bound for Chasing Nested Convex Bodies. SODA 2019: 117-122
- [c31] Sébastien Bubeck, Yin Tat Lee, Yuanzhi Li, Mark Sellke: Competitively chasing convex bodies. STOC 2019: 861-868
- [c30] Michael B. Cohen, Yin Tat Lee, Zhao Song: Solving linear programs in the current matrix multiplication time. STOC 2019: 938-942
- [i47] Yin Tat Lee, Swati Padmanabhan: An Õ(m/ε^3.5)-Cost Algorithm for Semidefinite Programs with Diagonal Constraints. CoRR abs/1903.01859 (2019)
- [i46] Yin Tat Lee, Zhao Song, Qiuyi Zhang: Solving Empirical Risk Minimization in the Current Matrix Multiplication Time. CoRR abs/1905.04447 (2019)
- [i45] Michael B. Cohen, Ben Cousins, Yin Tat Lee, Xin Yang: A near-optimal algorithm for approximating the John Ellipsoid. CoRR abs/1905.11580 (2019)
- [i44] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Complexity of Highly Parallel Non-Smooth Convex Optimization. CoRR abs/1906.10655 (2019)
- [i43]