Francis R. Bach
Person information

- affiliation: École Normale Supérieure, Computer Science Department
2020 – today
- 2023
- [j68] Belinda Tzen, Anant Raj, Maxim Raginsky, Francis R. Bach: Variational Principles for Mirror Descent and Mirror Langevin Dynamics. IEEE Control. Syst. Lett. 7: 1542-1547 (2023)
- [j67] Marc Lambert, Silvère Bonnabel, Francis R. Bach: The limited-memory recursive variational Gaussian approximation (L-RVGA). Stat. Comput. 33(3): 70 (2023)
- [j66] Francis R. Bach: Information Theory With Kernel Methods. IEEE Trans. Inf. Theory 69(2): 752-775 (2023)
- [i179] Francis R. Bach: On the relationship between multivariate splines and infinitely-wide neural networks. CoRR abs/2302.03459 (2023)
- [i178] Blake E. Woodworth, Konstantin Mishchenko, Francis R. Bach: Two Losses Are Better Than One: Faster Optimization Using a Cheaper Proxy. CoRR abs/2302.03542 (2023)
- [i177] Loucas Pillaud-Vivien, Francis R. Bach: Kernelized Diffusion maps. CoRR abs/2302.06757 (2023)
- [i176] Francis R. Bach: High-dimensional analysis of double descent for linear regression with random projections. CoRR abs/2303.01372 (2023)
- [i175] David Holzmüller, Francis R. Bach: Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation. CoRR abs/2303.03237 (2023)
- [i174] Belinda Tzen, Anant Raj, Maxim Raginsky, Francis R. Bach: Variational Principles for Mirror Descent and Mirror Langevin Dynamics. CoRR abs/2303.09532 (2023)
- [i173] Saeed Saremi, Rupesh Kumar Srivastava, Francis R. Bach: Universal Smoothed Score Functions for Generative Modeling. CoRR abs/2303.11669 (2023)
- [i172] Marc Lambert, Silvère Bonnabel, Francis R. Bach: The limited-memory recursive variational Gaussian approximation (L-RVGA). CoRR abs/2303.14195 (2023)
- 2022
- [j65] Mathieu Barré, Adrien B. Taylor, Francis R. Bach: A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives. Open J. Math. Optim. 3: 1-15 (2022)
- [j64] Yifan Sun, Francis R. Bach: Screening for a Reweighted Penalized Conditional Gradient Method. Open J. Math. Optim. 3: 1-35 (2022)
- [j63] Théo Ryffel, Pierre Tholoniat, David Pointcheval, Francis R. Bach: AriaNN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing. Proc. Priv. Enhancing Technol. 2022(1): 291-316 (2022)
- [j62] Marc Lambert, Silvère Bonnabel, Francis R. Bach: The recursive variational Gaussian approximation (R-VGA). Stat. Comput. 32(1): 10 (2022)
- [j61] Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier: A Simple Convergence Proof of Adam and Adagrad. Trans. Mach. Learn. Res. 2022 (2022)
- [c184] Ulysse Marteau-Ferey, Francis R. Bach, Alessandro Rudi: Sampling from Arbitrary Functions via PSD Models. AISTATS 2022: 2823-2861
- [c183] Alex Nowak, Alessandro Rudi, Francis R. Bach: On the Consistency of Max-Margin Losses. AISTATS 2022: 4612-4633
- [c182] Eloïse Berthier, Justin Carpentier, Alessandro Rudi, Francis R. Bach: Infinite-Dimensional Sums-of-Squares for Optimal Control. CDC 2022: 577-582
- [c181] Marc Lambert, Silvère Bonnabel, Francis R. Bach: The continuous-discrete variational Kalman filter (CD-VKF). CDC 2022: 6632-6639
- [c180] Blake E. Woodworth, Francis R. Bach, Alessandro Rudi: Non-Convex Optimization with Certificates and Fast Rates Through Kernel Sums of Squares. COLT 2022: 4620-4642
- [c179] Antonio Orvieto, Hans Kersting, Frank Proske, Francis R. Bach, Aurélien Lucchi: Anticorrelated Noise Injection for Improved Generalization. ICML 2022: 17094-17116
- [c178] Anant Raj, Francis R. Bach: Convergence of Uncertainty Sampling for Active Learning. ICML 2022: 18310-18331
- [c177] Eloïse Berthier, Ziad Kobeissi, Francis R. Bach: A Non-asymptotic Analysis of Non-parametric Temporal-Difference Learning. NeurIPS 2022
- [c176] Vivien Cabannes, Francis R. Bach, Vianney Perchet, Alessandro Rudi: Active Labeling: Streaming Stochastic Gradients. NeurIPS 2022
- [c175] Benjamin Dubois-Taine, Francis R. Bach, Quentin Berthet, Adrien B. Taylor: Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization. NeurIPS 2022
- [c174] Marc Lambert, Sinho Chewi, Francis R. Bach, Silvère Bonnabel, Philippe Rigollet: Variational inference via Wasserstein gradient flows. NeurIPS 2022
- [c173] Aurélien Lucchi, Frank Proske, Antonio Orvieto, Francis R. Bach, Hans Kersting: On the Theoretical Properties of Noise Correlation in Stochastic Optimization. NeurIPS 2022
- [c172] Konstantin Mishchenko, Francis R. Bach, Mathieu Even, Blake E. Woodworth: Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. NeurIPS 2022
- [i171] Théo Ryffel, Francis R. Bach, David Pointcheval: Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics. CoRR abs/2201.11980 (2022)
- [i170] Antonio Orvieto, Hans Kersting, Frank Proske, Francis R. Bach, Aurélien Lucchi: Anticorrelated Noise Injection for Improved Generalization. CoRR abs/2202.02831 (2022)
- [i169] Ziad Kobeissi, Francis R. Bach: On a Variance Reduction Correction of the Temporal Difference for Policy Evaluation in the Stochastic Continuous Setting. CoRR abs/2202.07960 (2022)
- [i168] Francis R. Bach: Information Theory with Kernel Methods. CoRR abs/2202.08545 (2022)
- [i167] Blake E. Woodworth, Francis R. Bach, Alessandro Rudi: Non-Convex Optimization with Certificates and Fast Rates Through Kernel Sums of Squares. CoRR abs/2204.04970 (2022)
- [i166] Hadi Daneshmand, Francis R. Bach: Polynomial-time sparse measure recovery. CoRR abs/2204.07879 (2022)
- [i165] Benjamin Dubois-Taine, Francis R. Bach, Quentin Berthet, Adrien B. Taylor: Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization. CoRR abs/2205.12751 (2022)
- [i164] Céline Moucer, Adrien B. Taylor, Francis R. Bach: A systematic approach to Lyapunov analyses of continuous-time models in convex optimization. CoRR abs/2205.12772 (2022)
- [i163] Amir Joudaki, Hadi Daneshmand, Francis R. Bach: Entropy Maximization with Depth: A Variational Principle for Random Neural Networks. CoRR abs/2205.13076 (2022)
- [i162] Vivien Cabannes, Francis R. Bach, Vianney Perchet, Alessandro Rudi: Active Labeling: Streaming Stochastic Gradients. CoRR abs/2205.13255 (2022)
- [i161] Marc Lambert, Sinho Chewi, Francis R. Bach, Silvère Bonnabel, Philippe Rigollet: Variational inference via Wasserstein gradient flows. CoRR abs/2205.15902 (2022)
- [i160] Antonio Orvieto, Anant Raj, Hans Kersting, Francis R. Bach: Explicit Regularization in Overparametrized Models via Noise Injection. CoRR abs/2206.04613 (2022)
- [i159] Konstantin Mishchenko, Francis R. Bach, Mathieu Even, Blake E. Woodworth: Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. CoRR abs/2206.07638 (2022)
- [i158] Francis R. Bach: Sum-of-Squares Relaxations for Information Theory and Variational Inference. CoRR abs/2206.13285 (2022)
- [i157] Aurélien Lucchi, Frank Proske, Antonio Orvieto, Francis R. Bach, Hans Kersting: On the Theoretical Properties of Noise Correlation in Stochastic Optimization. CoRR abs/2209.09162 (2022)
- [i156] Lawrence Stewart, Francis R. Bach, Quentin Berthet, Jean-Philippe Vert: Regression as Classification: Influence of Task Formulation on Neural Network Features. CoRR abs/2211.05641 (2022)
- 2021
- [j60] Robert M. Gower, Peter Richtárik, Francis R. Bach: Stochastic quasi-gradient methods: variance reduction via Jacobian sketching. Math. Program. 188(1): 135-192 (2021)
- [j59] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: An Optimal Algorithm for Decentralized Finite-Sum Optimization. SIAM J. Optim. 31(4): 2753-2783 (2021)
- [j58] Francis R. Bach: On the Effectiveness of Richardson Extrapolation in Data Science. SIAM J. Math. Data Sci. 3(4): 1251-1277 (2021)
- [c171] Anant Raj, Francis R. Bach: Explicit Regularization of Stochastic Gradient Methods through Duality. AISTATS 2021: 1882-1890
- [c170] Vivien A. Cabannes, Francis R. Bach, Alessandro Rudi: Fast Rates for Structured Prediction. COLT 2021: 823-865
- [c169] Adrien Vacher, Boris Muzellec, Alessandro Rudi, Francis R. Bach, François-Xavier Vialard: A Dimension-free Computational Upper-bound for Smooth Optimal Transport Estimation. COLT 2021: 4143-4173
- [c168] Eloïse Berthier, Justin Carpentier, Francis R. Bach: Fast and Robust Stability Region Estimation for Nonlinear Dynamical Systems. ECC 2021: 1412-1419
- [c167] Alberto Bietti, Francis R. Bach: Deep Equals Shallow for ReLU Networks in Kernel Regimes. ICLR 2021
- [c166] Vivien A. Cabannes, Francis R. Bach, Alessandro Rudi: Disambiguation of Weak Supervision leading to Exponential Convergence rates. ICML 2021: 1147-1157
- [c165] Hadi Daneshmand, Amir Joudaki, Francis R. Bach: Batch Normalization Orthogonalizes Representations in Deep Random Networks. NeurIPS 2021: 4896-4906
- [c164] Mathieu Even, Raphaël Berthier, Francis R. Bach, Nicolas Flammarion, Hadrien Hendrikx, Pierre Gaillard, Laurent Massoulié, Adrien B. Taylor: Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms. NeurIPS 2021: 28054-28066
- [c163] Vivien Cabannes, Loucas Pillaud-Vivien, Francis R. Bach, Alessandro Rudi: Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning. NeurIPS 2021: 30439-30451
- [i155] Vivien Cabannes, Alessandro Rudi, Francis R. Bach: Fast rates in structured prediction. CoRR abs/2102.00760 (2021)
- [i154] Vivien Cabannes, Francis R. Bach, Alessandro Rudi: Disambiguation of weak supervision with exponential convergence rates. CoRR abs/2102.02789 (2021)
- [i153] Raphaël Berthier, Francis R. Bach, Nicolas Flammarion, Pierre Gaillard, Adrien B. Taylor: A Continuized View on Nesterov Acceleration. CoRR abs/2102.06035 (2021)
- [i152] Alex Nowak-Vila, Alessandro Rudi, Francis R. Bach: Max-Margin is Dead, Long Live Max-Margin! CoRR abs/2105.15069 (2021)
- [i151] Hadi Daneshmand, Amir Joudaki, Francis R. Bach: Batch Normalization Orthogonalizes Representations in Deep Random Networks. CoRR abs/2106.03970 (2021)
- [i150] Mathieu Even, Raphaël Berthier, Francis R. Bach, Nicolas Flammarion, Pierre Gaillard, Hadrien Hendrikx, Laurent Massoulié, Adrien B. Taylor: A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. CoRR abs/2106.07644 (2021)
- [i149] Boris Muzellec, Francis R. Bach, Alessandro Rudi: A Note on Optimizing Distributions using Kernel Mean Embeddings. CoRR abs/2106.09994 (2021)
- [i148] Yifan Sun, Francis R. Bach: Screening for a Reweighted Penalized Conditional Gradient Method. CoRR abs/2107.01106 (2021)
- [i147] Francis R. Bach, Lenaïc Chizat: Gradient Descent on Infinitely Wide Neural Networks: Global Convergence and Generalization. CoRR abs/2110.08084 (2021)
- [i146] Ulysse Marteau-Ferey, Francis R. Bach, Alessandro Rudi: Sampling from Arbitrary Functions via PSD Models. CoRR abs/2110.10527 (2021)
- [i145] Anant Raj, Francis R. Bach: Convergence of Uncertainty Sampling for Active Learning. CoRR abs/2110.15784 (2021)
- [i144] Boris Muzellec, Francis R. Bach, Alessandro Rudi: Learning PSD-valued functions using kernel sums-of-squares. CoRR abs/2111.11306 (2021)
- [i143] Boris Muzellec, Adrien Vacher, Francis R. Bach, François-Xavier Vialard, Alessandro Rudi: Near-optimal estimation of smooth transport maps with kernel sums-of-squares. CoRR abs/2112.01907 (2021)
- 2020
- [j57] Eloïse Berthier, Francis R. Bach: Max-Plus Linear Approximations for Deterministic Continuous-State Markov Decision Processes. IEEE Control. Syst. Lett. 4(3): 767-772 (2020)
- [j56] Damien Scieur, Alexandre d'Aspremont, Francis R. Bach: Regularized nonlinear acceleration. Math. Program. 179(1): 47-83 (2020)
- [j55] Robert M. Gower, Mark Schmidt, Francis R. Bach, Peter Richtárik: Variance-Reduced Methods for Machine Learning. Proc. IEEE 108(11): 1968-1983 (2020)
- [j54] Raphaël Berthier, Francis R. Bach, Pierre Gaillard: Accelerated Gossip in Networks of Given Dimension Using Jacobi Polynomial Iterations. SIAM J. Math. Data Sci. 2(1): 24-47 (2020)
- [c162] Lénaïc Chizat, Francis R. Bach: Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss. COLT 2020: 1305-1338
- [c161] Marin Ballu, Quentin Berthet, Francis R. Bach: Stochastic Optimization for Regularized Wasserstein Estimators. ICML 2020: 602-612
- [c160] Vivien Cabannes, Alessandro Rudi, Francis R. Bach: Structured Prediction with Partial Labelling through the Infimum Loss. ICML 2020: 1230-1239
- [c159] Hadrien Hendrikx, Lin Xiao, Sébastien Bubeck, Francis R. Bach, Laurent Massoulié: Statistically Preconditioned Accelerated Gradient Method for Distributed Optimization. ICML 2020: 4203-4227
- [c158] Alex Nowak, Francis R. Bach, Alessandro Rudi: Consistent Structured Prediction with Max-Min Margin Markov Networks. ICML 2020: 7381-7391
- [c157] Raman Sankaran, Francis R. Bach, Chiranjib Bhattacharyya: Learning With Subquadratic Regularization: A Primal-Dual Approach. IJCAI 2020: 1963-1969
- [c156] Quentin Berthet, Mathieu Blondel, Olivier Teboul, Marco Cuturi, Jean-Philippe Vert, Francis R. Bach: Learning with Differentiable Perturbed Optimizers. NeurIPS 2020
- [c155] Raphaël Berthier, Francis R. Bach, Pierre Gaillard: Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model. NeurIPS 2020
- [c154] Hadi Daneshmand, Jonas Moritz Kohler, Francis R. Bach, Thomas Hofmann, Aurélien Lucchi: Batch normalization provably avoids ranks collapse for randomly initialised deep networks. NeurIPS 2020
- [c153] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: Dual-Free Stochastic Decentralized Optimization with Variance Reduction. NeurIPS 2020
- [c152] Ulysse Marteau-Ferey, Francis R. Bach, Alessandro Rudi: Non-parametric Models for Non-negative Functions. NeurIPS 2020
- [i142] Francis R. Bach: On the Effectiveness of Richardson Extrapolation in Machine Learning. CoRR abs/2002.02835 (2020)
- [i141] Lénaïc Chizat, Francis R. Bach: Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss. CoRR abs/2002.04486 (2020)
- [i140] Quentin Berthet, Mathieu Blondel, Olivier Teboul, Marco Cuturi, Jean-Philippe Vert, Francis R. Bach: Learning with Differentiable Perturbed Optimizers. CoRR abs/2002.08676 (2020)
- [i139] Marin Ballu, Quentin Berthet, Francis R. Bach: Stochastic Optimization for Regularized Wasserstein Estimators. CoRR abs/2002.08695 (2020)
- [i138] Yifan Sun, Francis R. Bach: Safe Screening for the Generalized Conditional Gradient Method. CoRR abs/2002.09718 (2020)
- [i137] Hadrien Hendrikx, Lin Xiao, Sébastien Bubeck, Francis R. Bach, Laurent Massoulié: Statistically Preconditioned Accelerated Gradient Method for Distributed Optimization. CoRR abs/2002.10726 (2020)
- [i136] Vivien Cabannes, Alessandro Rudi, Francis R. Bach: Structured Prediction with Partial Labelling through the Infimum Loss. CoRR abs/2003.00920 (2020)
- [i135] Hadi Daneshmand, Jonas Moritz Kohler, Francis R. Bach, Thomas Hofmann, Aurélien Lucchi: Theoretical Understanding of Batch-normalization: A Markov Chain Perspective. CoRR abs/2003.01652 (2020)
- [i134] Alexandre Défossez, Léon Bottou, Francis R. Bach, Nicolas Usunier: On the Convergence of Adam and Adagrad. CoRR abs/2003.02395 (2020)
- [i133] Anant Raj, Francis R. Bach: Explicit Regularization of Stochastic Gradient Methods through Duality. CoRR abs/2003.13807 (2020)
- [i132] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: An Optimal Algorithm for Decentralized Finite Sum Optimization. CoRR abs/2005.10675 (2020)
- [i131] Théo Ryffel, David Pointcheval, Francis R. Bach: ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing. CoRR abs/2006.04593 (2020)
- [i130] Mathieu Barré, Adrien B. Taylor, Francis R. Bach: Principled Analyses and Design of First-Order Methods with Inexact Proximal Operators. CoRR abs/2006.06041 (2020)
- [i129] Raphaël Berthier, Francis R. Bach, Pierre Gaillard: Tight Nonparametric Convergence Rates for Stochastic Gradient Descent under the Noiseless Linear Model. CoRR abs/2006.08212 (2020)
- [i128] Thomas Eboli, Alex Nowak-Vila, Jian Sun, Francis R. Bach, Jean Ponce, Alessandro Rudi: Structured and Localized Image Restoration. CoRR abs/2006.09261 (2020)
- [i127] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: Dual-Free Stochastic Decentralized Optimization with Variance Reduction. CoRR abs/2006.14384 (2020)
- [i126] Alex Nowak-Vila, Francis R. Bach, Alessandro Rudi: Consistent Structured Prediction with Max-Min Margin Markov Networks. CoRR abs/2007.01012 (2020)
- [i125] Ulysse Marteau-Ferey, Francis R. Bach, Alessandro Rudi: Non-parametric Models for Non-negative Functions. CoRR abs/2007.03926 (2020)
- [i124] Alberto Bietti, Francis R. Bach: Deep Equals Shallow for ReLU Networks in Kernel Regimes. CoRR abs/2009.14397 (2020)
- [i123] Robert M. Gower, Mark Schmidt, Francis R. Bach, Peter Richtárik: Variance-Reduced Methods for Machine Learning. CoRR abs/2010.00892 (2020)
- [i122] Alessandro Rudi, Ulysse Marteau-Ferey, Francis R. Bach: Finding Global Minima via Kernel Approximations. CoRR abs/2012.11978 (2020)
2010 – 2019
- 2019
- [j53] Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié: Optimal Convergence Rates for Convex Distributed Optimization in Networks. J. Mach. Learn. Res. 20: 159:1-159:31 (2019)
- [j52] Francis R. Bach: Submodular functions: from discrete to continuous domains. Math. Program. 175(1-2): 419-459 (2019)
- [j51] Lucas Rencker, Francis R. Bach, Wenwu Wang, Mark D. Plumbley: Sparse Recovery and Dictionary Learning From Nonlinear Compressive Measurements. IEEE Trans. Signal Process. 67(21): 5659-5670 (2019)
- [c151] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives. AISTATS 2019: 897-906
- [c150] Sharan Vaswani, Francis R. Bach, Mark Schmidt: Fast and Faster Convergence of SGD for Over-Parameterized Models and an Accelerated Perceptron. AISTATS 2019: 1195-1204
- [c149] Pierre Ablin, Alexandre Gramfort, Jean-François Cardoso, Francis R. Bach: Stochastic algorithms with descent guarantees for ICA. AISTATS 2019: 1564-1573
- [c148] Aude Genevay, Lénaïc Chizat, Francis R. Bach, Marco Cuturi, Gabriel Peyré: Sample Complexity of Sinkhorn Divergences. AISTATS 2019: 1574-1583
- [c147] Alex Nowak-Vila, Francis R. Bach, Alessandro Rudi: Sharp Analysis of Learning with Discrete Losses. AISTATS 2019: 1920-1929
- [c146] Anastasia Podosinnikova, Amelia Perry, Alexander S. Wein, Francis R. Bach, Alexandre d'Aspremont, David A. Sontag: Overcomplete Independent Component Analysis via SDP. AISTATS 2019: 2583-2592
- [c145] Francis R. Bach, Kfir Y. Levy: A Universal Algorithm for Variational Inequalities Adaptive to Smoothness and Noise. COLT 2019: 164-194
- [c144] Ulysse Marteau-Ferey, Dmitrii Ostrovskii, Francis R. Bach, Alessandro Rudi: Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance. COLT 2019: 2294-2340
- [c143] Adrien B. Taylor, Francis R. Bach: Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions. COLT 2019: 2934-2992
- [c142] Huy V. Vo, Francis R. Bach, Minsu Cho, Kai Han, Yann LeCun, Patrick Pérez, Jean Ponce: Unsupervised Image Matching and Object Discovery as Optimization. CVPR 2019: 8287-8296
- [c141] Tatiana Shpakova, Francis R. Bach, Mike E. Davies: Hyper-parameter Learning for Sparse Structured Probabilistic Models. ICASSP 2019: 3347-3351
- [c140] Othmane Sebbouh, Nidham Gazagnadou, Samy Jelassi, Francis R. Bach, Robert M. Gower: Towards closing the gap between the theory and practice of SVRG. NeurIPS 2019: 646-656
- [c139] Hadrien Hendrikx, Francis R. Bach, Laurent Massoulié: An Accelerated Decentralized Stochastic Proximal Algorithm for Finite Sums. NeurIPS 2019: 952-962
- [c138] Lénaïc Chizat, Edouard Oyallon, Francis R. Bach: On Lazy Training in Differentiable Programming. NeurIPS 2019: 2933-2943
On Lazy Training in Differentiable Programming. NeurIPS 2019: 2933-2943 - [c137]