


Aaron Sidford
Person information

- affiliation: Stanford University, CA, USA
2020 – today
- 2023
- [c93] Jonathan A. Kelner, Jerry Li, Allen X. Liu, Aaron Sidford, Kevin Tian: Semi-Random Sparse Recovery in Nearly-Linear Time. COLT 2023: 2352-2398
- [c92] Yujia Jin, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Moments, Random Walks, and Limits for Spectrum Approximation. COLT 2023: 5373-5394
- [c91] Adam Bouland, Yosheb M. Getachew, Yujia Jin, Aaron Sidford, Kevin Tian: Quantum Speedups for Zero-Sum Games via Improved Dynamic Gibbs Sampling. ICML 2023: 2932-2952
- [c90] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory (Extended Abstract). IJCAI 2023: 6468-6473
- [c89] Yujia Jin, Vidya Muthukumar, Aaron Sidford: The Complexity of Infinite-Horizon General-Sum Stochastic Games. ITCS 2023: 76:1-76:20
- [c88] Avi Kadria, Liam Roditty, Aaron Sidford, Virginia Vassilevska Williams, Uri Zwick: Improved girth approximation in weighted undirected graphs. SODA 2023: 2242-2255
- [c87] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Chaining, Group Leverage Score Overestimates, and Fast Spectral Hypergraph Sparsification. STOC 2023: 196-206
- [c86] Jan van den Brand, Yang P. Liu, Aaron Sidford: Dynamic Maxflow via Dynamic Interior Point Methods. STOC 2023: 1215-1228
- [i106] Yair Carmon, Arun Jambulapati, Yujia Jin, Yin Tat Lee, Daogao Liu, Aaron Sidford, Kevin Tian: ReSQueing Parallel and Private Stochastic Convex Optimization. CoRR abs/2301.00457 (2023)
- [i105] Adam Bouland, Yosheb Getachew, Yujia Jin, Aaron Sidford, Kevin Tian: Quantum Speedups for Zero-Sum Games via Improved Dynamic Gibbs Sampling. CoRR abs/2301.03763 (2023)
- [i104] AmirMahdi Ahmadinejad, John Peebles, Edward Pyne, Aaron Sidford, Salil P. Vadhan: Singular Value Approximation and Reducing Directed to Undirected Graph Sparsification. CoRR abs/2301.13541 (2023)
- [i103] Arun Jambulapati, James R. Lee, Yang P. Liu, Aaron Sidford: Sparsifying Sums of Norms. CoRR abs/2305.09049 (2023)
- [i102] Sayan Bhattacharya, Peter Kiss, Aaron Sidford, David Wajc: Near-Optimal Dynamic Rounding of Fractional Matchings in Bipartite Graphs. CoRR abs/2306.11828 (2023)
- [i101] Rajat Vadiraj Dwaraknath, Ishani Karmarkar, Aaron Sidford: Towards Optimal Effective Resistance Estimation. CoRR abs/2306.14820 (2023)
- [i100] Yujia Jin, Christopher Musco, Aaron Sidford, Apoorv Vikram Singh: Moments, Random Walks, and Limits for Spectrum Approximation. CoRR abs/2307.00474 (2023)
- [i99] Aaron Sidford, Chenyi Zhang: Quantum speedups for stochastic optimization. CoRR abs/2308.01582 (2023)
- [i98] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Matrix Completion in Almost-Verification Time. CoRR abs/2308.03661 (2023)
- [i97] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Parallel Submodular Function Minimization. CoRR abs/2309.04643 (2023)
- [i96] Jan van den Brand, Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: A Deterministic Almost-Linear Time Algorithm for Minimum-Cost Flow. CoRR abs/2309.16629 (2023)
- [i95] Andrei Graur, Haotian Jiang, Aaron Sidford: Sparse Submodular Function Minimization. CoRR abs/2309.16632 (2023)
- [i94] Arun Jambulapati, Jerry Li, Christopher Musco, Kirankumar Shiragur, Aaron Sidford, Kevin Tian: Structured Semidefinite Programming for Recovering Structured Preconditioners. CoRR abs/2310.18265 (2023)
- [i93] Jan van den Brand, Li Chen, Rasmus Kyng, Yang P. Liu, Richard Peng, Maximilian Probst Gutenberg, Sushant Sachdeva, Aaron Sidford: Incremental Approximate Maximum Flow on Undirected Graphs in Subpolynomial Update Time. CoRR abs/2311.03174 (2023)
- [i92] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: A Whole New Ball Game: A Primal Accelerated Method for Matrix Games and Minimizing the Maximum of Smooth Functions. CoRR abs/2311.10886 (2023)
- 2022
- [c85] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory. COLT 2022: 2390-2430
- [c84] Jonathan A. Kelner, Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant, Honglin Yuan: Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales. COLT 2022: 2431-2540
- [c83] Yujia Jin, Aaron Sidford, Kevin Tian: Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods. COLT 2022: 4362-4415
- [c82] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Improved Lower Bounds for Submodular Function Minimization. FOCS 2022: 245-254
- [c81] Aaron Bernstein, Jan van den Brand, Maximilian Probst Gutenberg, Danupon Nanongkai, Thatchaphol Saranurak, Aaron Sidford, He Sun: Fully-Dynamic Graph Sparsifiers Against an Adaptive Adversary. ICALP 2022: 20:1-20:20
- [c80] Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Regularized Box-Simplex Games and Dynamic Decremental Bipartite Matching. ICALP 2022: 77:1-77:20
- [c79] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: RECAPP: Crafting a More Efficient Catalyst for Convex Optimization. ICML 2022: 2658-2685
- [c78] Yair Carmon, Danielle Hausler, Arun Jambulapati, Yujia Jin, Aaron Sidford: Optimal and Adaptive Monteiro-Svaiter Acceleration. NeurIPS 2022
- [c77] Moses Charikar, Zhihao Jiang, Kirankumar Shiragur, Aaron Sidford: On the Efficient Implementation of High Accuracy Optimality of Profile Maximum Likelihood. NeurIPS 2022
- [c76] Sepehr Assadi, Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Semi-Streaming Bipartite Matching in Fewer Passes and Optimal Space. SODA 2022: 627-669
- [c75] Avi Kadria, Liam Roditty, Aaron Sidford, Virginia Vassilevska Williams, Uri Zwick: Algorithmic trade-offs for girth approximation in undirected graphs. SODA 2022: 1471-1492
- [c74] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. SODA 2022: 2723-2742
- [c73] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Improved iteration complexities for overconstrained p-norm regression. STOC 2022: 529-542
- [c72] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster maxflow via improved dynamic spectral vertex sparsifiers. STOC 2022: 543-556
- [i91] Yujia Jin, Aaron Sidford, Kevin Tian: Sharper Rates for Separable Minimax and Finite Sum Optimization via Primal-Dual Extragradient Methods. CoRR abs/2202.04640 (2022)
- [i90] Jonathan A. Kelner, Jerry Li, Allen Liu, Aaron Sidford, Kevin Tian: Semi-Random Sparse Recovery in Nearly-Linear Time. CoRR abs/2203.04002 (2022)
- [i89] Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant: Efficient Convex Optimization Requires Superlinear Memory. CoRR abs/2203.15260 (2022)
- [i88] Yujia Jin, Vidya Muthukumar, Aaron Sidford: The Complexity of Infinite-Horizon General-Sum Stochastic Games. CoRR abs/2204.04186 (2022)
- [i87] Arun Jambulapati, Yujia Jin, Aaron Sidford, Kevin Tian: Regularized Box-Simplex Games and Dynamic Decremental Bipartite Matching. CoRR abs/2204.12721 (2022)
- [i86] Yair Carmon, Danielle Hausler, Arun Jambulapati, Yujia Jin, Aaron Sidford: Optimal and Adaptive Monteiro-Svaiter Acceleration. CoRR abs/2205.15371 (2022)
- [i85] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: RECAPP: Crafting a More Efficient Catalyst for Convex Optimization. CoRR abs/2206.08627 (2022)
- [i84] Deeparnab Chakrabarty, Andrei Graur, Haotian Jiang, Aaron Sidford: Improved Lower Bounds for Submodular Function Minimization. CoRR abs/2207.04342 (2022)
- [i83] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Chaining, Group Leverage Score Overestimates, and Fast Spectral Hypergraph Sparsification. CoRR abs/2209.10539 (2022)
- [i82] Moses Charikar, Zhihao Jiang, Kirankumar Shiragur, Aaron Sidford: On the Efficient Implementation of High Accuracy Optimality of Profile Maximum Likelihood. CoRR abs/2210.06728 (2022)
- [i81] Jan van den Brand, Yang P. Liu, Aaron Sidford: Dynamic Maxflow via Dynamic Interior Point Methods. CoRR abs/2212.06315 (2022)
- 2021
- [j8] Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford: Lower bounds for finding stationary points II: first-order methods. Math. Program. 185(1-2): 315-355 (2021)
- [j7] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Derandomization beyond Connectivity: Undirected Laplacian Systems in Nearly Logarithmic Space. SIAM J. Comput. 50(6): 1892-1922 (2021)
- [j6] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Deterministic Approximation of Random Walks in Small Space. Theory Comput. 17: 1-35 (2021)
- [c71] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood. COLT 2021: 93-158
- [c70] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss. COLT 2021: 866-882
- [c69] Yujia Jin, Aaron Sidford: Towards Tight Bounds on the Sample Complexity of Average-reward MDPs. ICML 2021: 5055-5064
- [c68] Michael B. Cohen, Aaron Sidford, Kevin Tian: Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration. ITCS 2021: 62:1-62:18
- [c67] Hilal Asi, Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Stochastic Bias-Reduced Gradient Methods. NeurIPS 2021: 10810-10822
- [c66] Arun Jambulapati, Aaron Sidford: Ultrasparse Ultrasparsifiers and Faster Laplacian System Solvers. SODA 2021: 540-559
- [c65] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum cost flows, MDPs, and ℓ1-regression in nearly linear time for dense instances. STOC 2021: 859-869
- [i80] Jan van den Brand, Yin Tat Lee, Yang P. Liu, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Minimum Cost Flows, MDPs, and ℓ1-Regression in Nearly Linear Time for Dense Instances. CoRR abs/2101.05719 (2021)
- [i79] Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Thinking Inside the Ball: Near-Optimal Minimization of the Maximal Loss. CoRR abs/2105.01778 (2021)
- [i78] Yujia Jin, Aaron Sidford: Towards Tight Bounds on the Sample Complexity of Average-reward MDPs. CoRR abs/2106.07046 (2021)
- [i77] Hilal Asi, Yair Carmon, Arun Jambulapati, Yujia Jin, Aaron Sidford: Stochastic Bias-Reduced Gradient Methods. CoRR abs/2106.09481 (2021)
- [i76] Maryam Fazel, Yin Tat Lee, Swati Padmanabhan, Aaron Sidford: Computing Lewis Weights to High Precision. CoRR abs/2110.15563 (2021)
- [i75] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Improved Iteration Complexities for Overconstrained p-Norm Regression. CoRR abs/2111.01848 (2021)
- [i74] Jonathan A. Kelner, Annie Marsden, Vatsal Sharan, Aaron Sidford, Gregory Valiant, Honglin Yuan: Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales. CoRR abs/2111.03137 (2021)
- [i73] Jan van den Brand, Yu Gao, Arun Jambulapati, Yin Tat Lee, Yang P. Liu, Richard Peng, Aaron Sidford: Faster Maxflow via Improved Dynamic Spectral Vertex Sparsifiers. CoRR abs/2112.00722 (2021)
- 2020
- [j5] Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford: Lower bounds for finding stationary points I. Math. Program. 184(1): 71-120 (2020)
- [c64] Aaron Sidford, Mengdi Wang, Lin Yang, Yinyu Ye: Solving Discounted Stochastic Two-Player Games with Near-Optimal Time and Sample Complexity. AISTATS 2020: 2992-3002
- [c63] Naman Agarwal, Sham M. Kakade, Rahul Kidambi, Yin Tat Lee, Praneeth Netrapalli, Aaron Sidford: Leverage Score Sampling for Faster Accelerated Regression and ERM. ALT 2020: 22-47
- [c62] Oliver Hinder, Aaron Sidford, Nimit Sharad Sohoni: Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. COLT 2020: 1894-1938
- [c61] Tarun Kathuria, Yang P. Liu, Aaron Sidford: Unit Capacity Maxflow in Almost $O(m^{4/3})$ Time. FOCS 2020: 119-130
- [c60] Yair Carmon, Yujia Jin, Aaron Sidford, Kevin Tian: Coordinate Methods for Matrix Games. FOCS 2020: 283-293
- [c59] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. FOCS 2020: 919-930
- [c58] AmirMahdi Ahmadinejad, Jonathan A. Kelner, Jack Murtagh, John Peebles, Aaron Sidford, Salil P. Vadhan: High-precision Estimation of Random Walks in Small Space. FOCS 2020: 1295-1306
- [c57] Yujia Jin, Aaron Sidford: Efficiently Solving MDPs with Stochastic Mirror Descent. ICML 2020: 4890-4900
- [c56] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: Instance Based Approximations to Profile Maximum Likelihood. NeurIPS 2020
- [c55] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. NeurIPS 2020
- [c54] Daniel Levy, Yair Carmon, John C. Duchi, Aaron Sidford: Large-Scale Methods for Distributionally Robust Optimization. NeurIPS 2020
- [c53] Brian Axelrod, Yang P. Liu, Aaron Sidford: Near-optimal Approximate Discrete and Continuous Submodular Function Minimization. SODA 2020: 837-853
- [c52] Michael Kapralov, Aida Mousavifar, Cameron Musco, Christopher Musco, Navid Nouri, Aaron Sidford, Jakab Tardos: Fast and Space Efficient Spectral Sparsification in Dynamic Streams. SODA 2020: 1814-1833
- [c51] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving tall dense linear programs in nearly linear time. STOC 2020: 775-788
- [c50] Yang P. Liu, Aaron Sidford: Faster energy maximization for faster maximum flow. STOC 2020: 803-814
- [c49] Shiri Chechik, Yang P. Liu, Omer Rotem, Aaron Sidford: Constant girth approximation for directed graphs in subquadratic time. STOC 2020: 1010-1023
- [i72] Jan van den Brand, Yin Tat Lee, Aaron Sidford, Zhao Song: Solving Tall Dense Linear Programs in Nearly Linear Time. CoRR abs/2002.02304 (2020)
- [i71] Moses Charikar, Kirankumar Shiragur, Aaron Sidford: A General Framework for Symmetric Property Estimation. CoRR abs/2003.00844 (2020)
- [i70] Yair Carmon, Arun Jambulapati, Qijia Jiang, Yujia Jin, Yin Tat Lee, Aaron Sidford, Kevin Tian: Acceleration with a Ball Optimization Oracle. CoRR abs/2003.08078 (2020)
- [i69] Yang P. Liu, Aaron Sidford: Faster Divergence Maximization for Faster Maximum Flow. CoRR abs/2003.08929 (2020)
- [i68] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: The Bethe and Sinkhorn Permanents of Low Rank Matrices and Implications for Profile Maximum Likelihood. CoRR abs/2004.02425 (2020)
- [i67] Aaron Bernstein, Jan van den Brand, Maximilian Probst Gutenberg, Danupon Nanongkai, Thatchaphol Saranurak, Aaron Sidford, He Sun: Fully-Dynamic Graph Sparsifiers Against an Adaptive Adversary. CoRR abs/2004.08432 (2020)
- [i66] Jerry Li, Aaron Sidford, Kevin Tian, Huishuai Zhang: Well-Conditioned Methods for Ill-Conditioned Systems: Linear Regression with Semi-Random Noise. CoRR abs/2008.01722 (2020)
- [i65] Yujia Jin, Aaron Sidford: Efficiently Solving MDPs with Stochastic Mirror Descent. CoRR abs/2008.12776 (2020)
- [i64] Jan van den Brand, Yin Tat Lee, Danupon Nanongkai, Richard Peng, Thatchaphol Saranurak, Aaron Sidford, Zhao Song, Di Wang: Bipartite Matching in Nearly-linear Time on Moderately Dense Graphs. CoRR abs/2009.01802 (2020)
- [i63] Yair Carmon, Yujia Jin, Aaron Sidford, Kevin Tian: Coordinate Methods for Matrix Games. CoRR abs/2009.08447 (2020)
- [i62] Daniel Levy, Yair Carmon, John C. Duchi, Aaron Sidford: Large-Scale Methods for Distributionally Robust Optimization. CoRR abs/2010.05893 (2020)
- [i61] Nima Anari, Moses Charikar, Kirankumar Shiragur, Aaron Sidford: Instance Based Approximations to Profile Maximum Likelihood. CoRR abs/2011.02761 (2020)
- [i60] Yujia Jin, Aaron Sidford, Kevin Tian: Semi-Streaming Bipartite Matching in Fewer Passes and Less Space. CoRR abs/2011.03495 (2020)
- [i59] Michael B. Cohen, Aaron Sidford, Kevin Tian: Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration. CoRR abs/2011.06572 (2020)
- [i58] Arun Jambulapati, Aaron Sidford: Ultrasparse Ultrasparsifiers and Faster Laplacian System Solvers. CoRR abs/2011.08806 (2020)
2010 – 2019
- 2019
- [c48] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Deterministic Approximation of Random Walks in Small Space. APPROX-RANDOM 2019: 42:1-42:22
- [c47] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near-optimal method for highly smooth convex optimization. COLT 2019: 492-507
- [c46] Yair Carmon, John C. Duchi, Aaron Sidford, Kevin Tian: A Rank-1 Sketch for Matrix Multiplicative Weights. COLT 2019: 589-623
- [c45] Alexander V. Gasnikov, Pavel E. Dvurechensky, Eduard Gorbunov, Evgeniya A. Vorontsova, Daniil Selikhanovych, César A. Uribe, Bo Jiang, Haoyue Wang, Shuzhong Zhang, Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives. COLT 2019: 1392-1393
- [c44] Deeparnab Chakrabarty, Yin Tat Lee, Aaron Sidford, Sahil Singla, Sam Chiu-wai Wong: Faster Matroid Intersection. FOCS 2019: 1146-1168
- [c43] Yang P. Liu, Arun Jambulapati, Aaron Sidford: Parallel Reachability in Almost Linear Work and Square Root Depth. FOCS 2019: 1664-1686
- [c42] Yujia Jin, Aaron Sidford: Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG. NeurIPS 2019: 3863-3873
- [c41] Arun Jambulapati, Aaron Sidford, Kevin Tian: A Direct Õ(1/ε) Iteration Parallel Algorithm for Optimal Transport. NeurIPS 2019: 11355-11366
- [c40] Yair Carmon, Yujia Jin, Aaron Sidford, Kevin Tian: Variance Reduction for Matrix Games. NeurIPS 2019: 11377-11388
- [c39] Moses Charikar, Kirankumar Shiragur, Aaron Sidford: A General Framework for Symmetric Property Estimation. NeurIPS 2019: 12426-12436
- [c38] Sébastien Bubeck, Qijia Jiang, Yin Tat Lee, Yuanzhi Li, Aaron Sidford: Complexity of Highly Parallel Non-Smooth Convex Optimization. NeurIPS 2019: 13900-13909
- [c37] AmirMahdi Ahmadinejad, Arun Jambulapati, Amin Saberi, Aaron Sidford: Perron-Frobenius Theory in Nearly Linear Time: Positive Eigenvectors, M-matrices, Graph Kernels, and Other Applications. SODA 2019: 1387-1404
- [c36] Moses Charikar, Kirankumar Shiragur, Aaron Sidford: Efficient profile maximum likelihood for universal symmetric property estimation. STOC 2019: 780-791
- [c35] Vatsal Sharan, Aaron Sidford, Gregory Valiant: Memory-sample tradeoffs for linear regression with small error. STOC 2019: 890-901
- [i57] Yair Carmon, John C. Duchi, Aaron Sidford, Kevin Tian: A Rank-1 Sketch for Matrix Multiplicative Weights. CoRR abs/1903.02675 (2019)
- [i56] Jack Murtagh, Omer Reingold, Aaron Sidford, Salil P. Vadhan: Deterministic Approximation of Random Walks in Small Space. CoRR abs/1903.06361 (2019)
- [i55] Michael Kapralov, Navid Nouri, Aaron Sidford, Jakab Tardos: Dynamic Streaming Spectral Sparsification in Nearly Linear Time and Space. CoRR abs/1903.12150 (2019)
- [i54] Vatsal Sharan, Aaron Sidford, Gregory Valiant: Memory-Sample Tradeoffs for Linear Regression with Small Error. CoRR abs/1904.08544 (2019)
- [i53] Moses Charikar, Kirankumar Shiragur, Aaron Sidford: Efficient Profile Maximum Likelihood for Universal Symmetric Property Estimation. CoRR abs/1905.08448 (2019)
- [i52] Arun Jambulapati, Yang P. Liu, Aaron Sidford: Parallel Reachability in Almost Linear Work and Square Root Depth. CoRR abs/1905.08841 (2019)
- [i51]