Aaron Sidford
Person information
- affiliation: Stanford University, CA, USA
2020 – today
- 2024
- [j11] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory. J. ACM 71(6): 41:1-41:37 (2024)
- [j10] Jose H. Blanchet, Arun Jambulapati, Carson Kent, Aaron Sidford: Towards optimal running times for optimal transport. Oper. Res. Lett. 52: 107054 (2024)
- [j9] Tarun Kathuria, Yang P. Liu, Aaron Sidford: Unit Capacity Maxflow in Almost $m^{4/3}$ Time. SIAM J. Comput. 53(6): S20-175 (2024)
- [c109] Arun Jambulapati, Aaron Sidford, Kevin Tian: Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization. COLT 2024: 2608-2643
- [c108] Yujia Jin, Ishani Karmarkar, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Faster Spectral Density Estimation and Sparsification in the Nuclear Norm (Extended Abstract). COLT 2024: 2722
- [c107] Jan van den Brand, Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: Incremental Approximate Maximum Flow on Undirected Graphs in Subpolynomial Update Time. SODA 2024: 2980-2998
- [c106] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: A Whole New Ball Game: A Primal Accelerated Method for Matrix Games and Minimizing the Maximum of Smooth Functions. SODA 2024: 3685-3723
- [c105] Sayan Bhattacharya, Peter Kiss, Aaron Sidford, David Wajc: Near-Optimal Dynamic Rounding of Fractional Matchings in Bipartite Graphs. STOC 2024: 59-70
- [c104] Arun Jambulapati, James R. Lee, Yang P. Liu, Aaron Sidford: Sparsifying Generalized Linear Models. STOC 2024: 1665-1675
- [i115] Simon Apers, Sander Gribling, Aaron Sidford: On computing approximate Lewis weights. CoRR abs/2404.02881 (2024)
- [i114] Yujia Jin, Ishani Karmarkar, Aaron Sidford, Jiayi Wang: Truncated Variance Reduced Value Iteration. CoRR abs/2405.12952 (2024)
- [i113] Arun Jambulapati, Aaron Sidford, Kevin Tian: Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization. CoRR abs/2406.07373 (2024)
- [i112] Yujia Jin, Ishani Karmarkar, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Faster Spectral Density Estimation and Sparsification in the Nuclear Norm. CoRR abs/2406.07521 (2024)
- [i111] Arun Jambulapati, Sushant Sachdeva, Aaron Sidford, Kevin Tian, Yibin Zhao: Eulerian Graph Sparsification by Effective Resistance Decomposition. CoRR abs/2408.10172 (2024)
- [i110] Aaron Bernstein, Jiale Chen, Aditi Dudeja, Zachary Langley, Aaron Sidford, Ta-Wei Tu: Matching Composition and Efficient Weight Reduction in Dynamic Matching. CoRR abs/2410.18936 (2024)
- [i109] Deeksha Adil, Brian Bullins, Arun Jambulapati, Aaron Sidford: Convex optimization with p-norm oracles. CoRR abs/2410.24158 (2024)
- 2023
- [c103] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Semi-Random Sparse Recovery in Nearly-Linear Time. COLT 2023: 2352-2398
- [c102] Yujia Jin, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Moments, Random Walks, and Limits for Spectrum Approximation. COLT 2023: 5373-5394
- [c101] Jan van den Brand, Li Chen, Richard Peng, Rasmus Kyng, Yang P. Liu, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: A Deterministic Almost-Linear Time Algorithm for Minimum-Cost Flow. FOCS 2023: 503-514
- [c100] AmirMahdi Ahmadinejad, John Peebles, Edward Pyne, Aaron Sidford, Salil P. Vadhan: Singular Value Approximation and Sparsifying Random Walks on Directed Graphs. FOCS 2023: 846-854
- [c99] Arun Jambulapati, James R. Lee, Yang P. Liu, Aaron Sidford: Sparsifying Sums of Norms. FOCS 2023: 1953-1962
- [c98] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. FOCS 2023: 2031-2058
- [c97] Andrei Graur, Haotian Jiang, Aaron Sidford: Sparse Submodular Function Minimization. FOCS 2023: 2071-2080
- [c96] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Matrix Completion in Almost-Verification Time. FOCS 2023: 2102-2128
- [c95] Adam Bouland, Yosheb M. Getachew, Yujia Jin, Aaron Sidford, Kevin Tian: Quantum Speedups for Zero-Sum Games via Improved Dynamic Gibbs Sampling. ICML 2023: 2932-2952
- [c94] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory (Extended Abstract). IJCAI 2023: 6468-6473
- [c93] Yujia Jin, Vidya Muthukumar, Aaron Sidford: The Complexity of Infinite-Horizon General-Sum Stochastic Games. ITCS 2023: 76:1-76:20
- [c92] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Parallel Submodular Function Minimization. NeurIPS 2023
- [c91] Rajat Vadiraj Dwaraknath, Ishani Karmarkar, Aaron Sidford: Towards Optimal Effective Resistance Estimation. NeurIPS 2023
- [c90] Arun Jambulapati, Jerry Li, Christopher Musco, Kirankumar Shiragur, Aaron Sidford, Kevin Tian: Structured Semidefinite Programming for Recovering Structured Preconditioners. NeurIPS 2023
- [c89] Aaron Sidford, Chenyi Zhang: Quantum speedups for stochastic optimization. NeurIPS 2023
- [c88] Avi Kadria, Liam Roditty, Aaron Sidford, Virginia Vassilevska Williams, Uri Zwick: Improved girth approximation in weighted undirected graphs. SODA 2023: 2242-2255
- [c87] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Chaining, Group Leverage Score Overestimates, and Fast Spectral Hypergraph Sparsification. STOC 2023: 196-206
- [c86] Jan van den Brand, Yang P. Liu, Aaron Sidford: Dynamic Maxflow via Dynamic Interior Point Methods. STOC 2023: 1215-1228
- [i108] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023)
- [i107] Adam Bouland, Yosheb Getachew, Yujia Jin, Aaron Sidford, Kevin Tian: Quantum Speedups for Zero-Sum Games via Improved Dynamic Gibbs Sampling. CoRR abs/2301.03763 (2023)
- [i106] AmirMahdi Ahmadinejad, John Peebles, Edward Pyne, Aaron Sidford, Salil P. Vadhan: Singular Value Approximation and Reducing Directed to Undirected Graph Sparsification. CoRR abs/2301.13541 (2023)
- [i105] Arun Jambulapati, James R. Lee, Yang P. Liu, Aaron Sidford: Sparsifying Sums of Norms. CoRR abs/2305.09049 (2023)
- [i104] Sayan Bhattacharya, Peter Kiss, Aaron Sidford, David Wajc: Near-Optimal Dynamic Rounding of Fractional Matchings in Bipartite Graphs. CoRR abs/2306.11828 (2023)
- [i103] Rajat Vadiraj Dwaraknath, Ishani Karmarkar, Aaron Sidford: Towards Optimal Effective Resistance Estimation. CoRR abs/2306.14820 (2023)
- [i102] Yujia Jin, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Moments, Random Walks, and Limits for Spectrum Approximation. CoRR abs/2307.00474 (2023)
- [i101] Aaron Sidford, Chenyi Zhang: Quantum speedups for stochastic optimization. CoRR abs/2308.01582 (2023)
- [i100] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Matrix Completion in Almost-Verification Time. CoRR abs/2308.03661 (2023)
- [i99] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Parallel Submodular Function Minimization. CoRR abs/2309.04643 (2023)
- [i98] Jan van den Brand, Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: A Deterministic Almost-Linear Time Algorithm for Minimum-Cost Flow. CoRR abs/2309.16629 (2023)
- [i97] Andrei Graur, Haotian Jiang, Aaron Sidford: Sparse Submodular Function Minimization. CoRR abs/2309.16632 (2023)
- [i96] Arun Jambulapati, Jerry Li, Christopher Musco, Kirankumar Shiragur, Aaron Sidford, Kevin Tian: Structured Semidefinite Programming for Recovering Structured Preconditioners. CoRR abs/2310.18265 (2023)
- [i95] Jan van den Brand, Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: Incremental Approximate Maximum Flow on Undirected Graphs in Subpolynomial Update Time. CoRR abs/2311.03174 (2023)
- [i94] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: A Whole New Ball Game: A Primal Accelerated Method for Matrix Games and Minimizing the Maximum of Smooth Functions. CoRR abs/2311.10886 (2023)
- [i93] Arun Jambulapati, James R. Lee, Yang P. Liu, Aaron Sidford: Sparsifying generalized linear models. CoRR abs/2311.18145 (2023)
- [i92] Jiale Chen, Aaron Sidford, Ta-Wei Tu: Entropy Regularization and Faster Decremental Matching in General Graphs. CoRR abs/2312.09077 (2023)
- 2022
- [c85] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory. COLT 2022: 2390-2430
- [c84] Jonathan A. Kelner, Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant, Honglin Yuan: Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales. COLT 2022: 2431-2540
- [c83] Yujia Jin, Aaron Sidford, Kevin Tian: Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods. COLT 2022: 4362-4415
- [c82] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Improved Lower Bounds for Submodular Function Minimization. FOCS 2022: 245-254
- [c81] Aaron Bernstein, Jan van den Brand, Maximilian Probst Gutenberg, Danupon Nanongkai, Thatchaphol Saranurak, Aaron Sidford, He Sun: Fully-Dynamic Graph Sparsifiers Against an Adaptive Adversary. ICALP 2022: 20:1-20:20
- [c80] Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Regularized Box-Simplex Games and Dynamic Decremental Bipartite Matching. ICALP 2022: 77:1-77:20
- [c79] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: RECAPP: Crafting a More Efficient Catalyst for Convex Optimization. ICML 2022: 2658-2685
- [c78] Yair Carmon, Danielle Hausler, Arun Jambulapati, Yujia Jin, Aaron Sidford: Optimal and Adaptive Monteiro-Svaiter Acceleration. NeurIPS 2022
- [c77] Moses Charikar, Zhihao Jiang, Kirankumar Shiragur, Aaron Sidford: On the Efficient Implementation of High Accuracy Optimality of Profile Maximum Likelihood. NeurIPS 2022
- [c76] Sepehr Assadi, Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Semi-Streaming Bipartite Matching in Fewer Passes and Optimal Space. SODA 2022: 627-669
- [c75] Avi Kadria, Liam Roditty, Aaron Sidford, Virginia Vassilevska Williams, Uri Zwick: Algorithmic trade-offs for girth approximation in undirected graphs. SODA 2022: 1471-1492
- [c74] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. SODA 2022: 2723-2742
- [c73] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Improved iteration complexities for overconstrained p-norm regression. STOC 2022: 529-542
- [c72] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556
- [i91] Yujia Jin, Aaron Sidford, Kevin Tian: Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods. CoRR abs/2202.04640 (2022)
- [i90] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Semi-Random Sparse Recovery in Nearly-Linear Time. CoRR abs/2203.04002 (2022)
- [i89] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory. CoRR abs/2203.15260 (2022)
- [i88] Yujia Jin, Vidya Muthukumar, Aaron Sidford: The Complexity of Infinite-Horizon General-Sum Stochastic Games. CoRR abs/2204.04186 (2022)
- [i87] Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Regularized Box-Simplex Games and Dynamic Decremental Bipartite Matching. CoRR abs/2204.12721 (2022)
- [i86] Yair Carmon, Danielle Hausler, Arun Jambulapati, Yujia Jin, Aaron Sidford: Optimal and Adaptive Monteiro-Svaiter Acceleration. CoRR abs/2205.15371 (2022)
- [i85] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: RECAPP: Crafting a More Efficient Catalyst for Convex Optimization. CoRR abs/2206.08627 (2022)
- [i84] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Improved Lower Bounds for Submodular Function Minimization. CoRR abs/2207.04342 (2022)
- [i83] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Chaining, Group Leverage Score Overestimates, and Fast Spectral Hypergraph Sparsification. CoRR abs/2209.10539 (2022)
- [i82] Moses Charikar, Zhihao Jiang, Kirankumar Shiragur, Aaron Sidford: On the Efficient Implementation of High Accuracy Optimality of Profile Maximum Likelihood. CoRR abs/2210.06728 (2022)
- [i81] Jan van den Brand, Yang P. Liu, Aaron Sidford: Dynamic Maxflow via Dynamic Interior Point Methods. CoRR abs/2212.06315 (2022)
- 2021
- [j8] Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford: Lower bounds for finding stationary points II: first-order methods. Math. Program. 185(1-2): 315-355 (2021)
- [j7] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Derandomization beyond Connectivity: Undirected Laplacian Systems in Nearly Logarithmic Space. SIAM J. Comput. 50(6): 1892-1922 (2021)
- [j6] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Deterministic Approximation of Random Walks in Small Space. Adv. Math. Commun. 17: 1-35 (2021)
- [c71] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood. COLT 2021: 93-158
- [c70] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss. COLT 2021: 866-882
- [c69] Yujia Jin, Aaron Sidford: Towards Tight Bounds on the Sample Complexity of Average-reward MDPs. ICML 2021: 5055-5064
- [c68] Michael B. Cohen, Aaron Sidford, Kevin Tian: Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration. ITCS 2021: 62:1-62:18
- [c67] Hilal Asi, Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Stochastic Bias-Reduced Gradient Methods. NeurIPS 2021: 10810-10822
- [c66] Arun Jambulapati, Aaron Sidford: Ultrasparse Ultrasparsifiers and Faster Laplacian System Solvers. SODA 2021: 540-559
- [c65] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum cost flows, MDPs, and ℓ1-regression in nearly linear time for dense instances. STOC 2021: 859-869
- [i80] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum Cost Flows, MDPs, and ℓ1-Regression in Nearly Linear Time for Dense Instances. CoRR abs/2101.05719 (2021)
- [i79] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss. CoRR abs/2105.01778 (2021)
- [i78] Yujia Jin, Aaron Sidford: Towards Tight Bounds on the Sample Complexity of Average-reward MDPs. CoRR abs/2106.07046 (2021)
- [i77] Hilal Asi, Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Stochastic Bias-Reduced Gradient Methods. CoRR abs/2106.09481 (2021)
- [i76] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. CoRR abs/2110.15563 (2021)
- [i75] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Improved Iteration Complexities for Overconstrained p-Norm Regression. CoRR abs/2111.01848 (2021)
- [i74] Jonathan A. Kelner, Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant, Honglin Yuan: Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales. CoRR abs/2111.03137 (2021)
- [i73] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster Maxflow via Improved Dynamic Spectral Vertex Sparsifiers. CoRR abs/2112.00722 (2021)
- 2020
- [j5] Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford: Lower bounds for finding stationary points I. Math. Program. 184(1): 71-120 (2020)
- [c64] Aaron Sidford, Mengdi Wang, Lin Yang, Yinyu Ye: Solving Discounted Stochastic Two-Player Games with Near-Optimal Time and Sample Complexity. AISTATS 2020: 2992-3002
- [c63] Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford: Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47
- [c62] Oliver Hinder, Aaron Sidford, Nimit Sharad Sohoni: Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. COLT 2020: 1894-1938
- [c61] Tarun Kathuria, Yang P. Liu, Aaron Sidford: Unit Capacity Maxflow in Almost $O(m^{4/3})$ Time. FOCS 2020: 119-130
- [c60] Yair Carmon, Yujia Jin, Aaron Sidford, Kevin Tian: Coordinate Methods for Matrix Games. FOCS 2020: 283-293
- [c59] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930
- [c58] AmirMahdi Ahmadinejad, Jonathan A. Kelner, Jack Murtagh, John Peebles, Aaron Sidford, Salil P. Vadhan: High-precision Estimation of Random Walks in Small Space. FOCS 2020: 1295-1306
- [c57] Yujia Jin, Aaron Sidford: Efficiently Solving MDPs with Stochastic Mirror Descent. ICML 2020: 4890-4900
- [c56] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: Instance Based Approximations to Profile Maximum Likelihood. NeurIPS 2020
- [c55] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. NeurIPS 2020
- [c54] Daniel Levy, Yair Carmon, John C. Duchi, Aaron Sidford: Large-Scale Methods for Distributionally Robust Optimization. NeurIPS 2020
- [c53] Brian Axelrod, Yang P. Liu, Aaron Sidford: Near-optimal Approximate Discrete and Continuous Submodular Function Minimization. SODA 2020: 837-853
- [c52] Michael Kapralov, Aida Mousavifar, Cameron Musco, Christopher Musco, Navid Nouri, Aaron Sidford, Jakab Tardos: Fast and Space Efficient Spectral Sparsification in Dynamic Streams. SODA 2020: 1814-1833
- [c51] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788
- [c50] Yang P. Liu, Aaron Sidford: Faster energy maximization for faster maximum flow. STOC 2020: 803-814
- [c49] Shiri Chechik, Yang P. Liu, Omer Rotem, Aaron Sidford: Constant girth approximation for directed graphs in subquadratic time. STOC 2020: 1010-1023
- [i72] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving Tall Dense Linear Programs in Nearly Linear Time. CoRR abs/2002.02304 (2020)
- [i71] Moses Charikar, Kirankumar Shiragur, Aaron Sidford: A General Framework for Symmetric Property Estimation. CoRR abs/2003.00844 (2020)
- [i70] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. CoRR abs/2003.08078 (2020)
- [i69] Yang P. Liu, Aaron Sidford: Faster Divergence Maximization for Faster Maximum Flow. CoRR abs/2003.08929 (2020)
- [i68] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood. CoRR abs/2004.02425 (2020)
- [i67] Aaron Bernstein, Jan van den Brand, Maximilian Probst Gutenberg, Danupon Nanongkai, Thatchaphol Saranurak, Aaron Sidford, He Sun: Fully-Dynamic Graph Sparsifiers Against an Adaptive Adversary. CoRR abs/2004.08432 (2020)
- [i66] Jerry Li, Aaron Sidford, Kevin Tian, Huishuai Zhang: Well-Conditioned Methods for Ill-Conditioned Systems: Linear Regression with Semi-Random Noise. CoRR abs/2008.01722 (2020)
- [i65]