


Samuel S. Schoenholz

2020 – today
- 2022
- [j1] Sean Mann, Eric Fadel, Samuel S. Schoenholz, Ekin D. Cubuk, Steven G. Johnson, Giuseppe Romano: ∂PV: An end-to-end differentiable solar-cell simulator. Comput. Phys. Commun. 272: 108232 (2022)
- [c23] Atish Agarwala, Samuel S. Schoenholz: Deep equilibrium networks are sensitive to initialization statistics. ICML 2022: 136-160
- [c22] Roman Novak, Jascha Sohl-Dickstein, Samuel S. Schoenholz: Fast Finite Width Neural Tangent Kernel. ICML 2022: 17018-17044
- [i27] Roman Novak, Jascha Sohl-Dickstein, Samuel S. Schoenholz: Fast Finite Width Neural Tangent Kernel. CoRR abs/2206.08720 (2022)
- [i26] Atish Agarwala, Samuel S. Schoenholz: Deep equilibrium networks are sensitive to initialization statistics. CoRR abs/2207.09432 (2022)
- [i25] Stanislav Fort, Ekin Dogus Cubuk, Surya Ganguli, Samuel S. Schoenholz: What does a deep neural network confidently perceive? The effective dimension of high certainty class manifolds and their low confidence boundaries. CoRR abs/2210.05546 (2022)
- 2021
- [c21] Amil Merchant, Luke Metz, Samuel S. Schoenholz, Ekin D. Cubuk: Learn2Hop: Learned Optimization on Rough Landscapes. ICML 2021: 7643-7653
- [c20] Miguel Ruiz-Garcia, Ge Zhang, Samuel S. Schoenholz, Andrea J. Liu: Tilting the playing field: Dynamical loss functions for machine learning. ICML 2021: 9157-9167
- [c19] Neha S. Wadia, Daniel Duckworth, Samuel S. Schoenholz, Ethan Dyer, Jascha Sohl-Dickstein: Whitening and Second Order Optimization Both Make Information in the Dataset Unusable During Training, and Can Reduce or Prevent Generalization. ICML 2021: 10617-10629
- [i24] Miguel Ruiz-Garcia, Ge Zhang, Samuel S. Schoenholz, Andrea J. Liu: Tilting the playing field: Dynamical loss functions for machine learning. CoRR abs/2102.03793 (2021)
- [i23] Amil Merchant, Luke Metz, Samuel S. Schoenholz, Ekin Dogus Cubuk: Learn2Hop: Learned Optimization on Rough Landscapes. CoRR abs/2107.09661 (2021)
- [i22] James Martens, Andy Ballard, Guillaume Desjardins, Grzegorz Swirszcz, Valentin Dalibard, Jascha Sohl-Dickstein, Samuel S. Schoenholz: Rapid training of deep neural networks without skip connections or normalization layers using Deep Kernel Shaping. CoRR abs/2110.01765 (2021)
- [i21] Luke Metz, C. Daniel Freeman, Samuel S. Schoenholz, Tal Kachman: Gradients are Not All You Need. CoRR abs/2111.05803 (2021)
- 2020
- [c18] Roman Novak, Lechao Xiao, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Sohl-Dickstein, Samuel S. Schoenholz: Neural Tangents: Fast and Easy Infinite Neural Networks in Python. ICLR 2020
- [c17] Jaehoon Lee, Samuel S. Schoenholz, Jeffrey Pennington, Ben Adlam, Lechao Xiao, Roman Novak, Jascha Sohl-Dickstein: Finite Versus Infinite Neural Networks: an Empirical Study. NeurIPS 2020
- [c16] Samuel S. Schoenholz, Ekin Dogus Cubuk: JAX MD: A Framework for Differentiable Physics. NeurIPS 2020
- [i20] Jascha Sohl-Dickstein, Roman Novak, Samuel S. Schoenholz, Jaehoon Lee: On the infinite width limit of neural networks with a standard parameterization. CoRR abs/2001.07301 (2020)
- [i19] Jaehoon Lee, Samuel S. Schoenholz, Jeffrey Pennington, Ben Adlam, Lechao Xiao, Roman Novak, Jascha Sohl-Dickstein: Finite Versus Infinite Neural Networks: an Empirical Study. CoRR abs/2007.15801 (2020)
- [i18] Neha S. Wadia, Daniel Duckworth, Samuel S. Schoenholz, Ethan Dyer, Jascha Sohl-Dickstein: Whitening and second order optimization both destroy information about the dataset, and can make generalization impossible. CoRR abs/2008.07545 (2020)
- [i17] Atish Agarwala, Jeffrey Pennington, Yann N. Dauphin, Samuel S. Schoenholz: Temperature check: theory and practice for training models with softmax-cross-entropy losses. CoRR abs/2010.07344 (2020)
2010 – 2019
- 2019
- [c15] Greg Yang, Jeffrey Pennington, Vinay Rao, Jascha Sohl-Dickstein, Samuel S. Schoenholz: A Mean Field Theory of Batch Normalization. ICLR (Poster) 2019
- [c14] Jaehoon Lee, Lechao Xiao, Samuel S. Schoenholz, Yasaman Bahri, Roman Novak, Jascha Sohl-Dickstein, Jeffrey Pennington: Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent. NeurIPS 2019: 8570-8581
- [c13] Yann N. Dauphin, Samuel S. Schoenholz: MetaInit: Initializing learning by learning to initialize. NeurIPS 2019: 12624-12636
- [i16] Dar Gilboa, Bo Chang, Minmin Chen, Greg Yang, Samuel S. Schoenholz, Ed H. Chi, Jeffrey Pennington: Dynamical Isometry and a Mean Field Theory of LSTMs and GRUs. CoRR abs/1901.08987 (2019)
- [i15] Jaehoon Lee, Lechao Xiao, Samuel S. Schoenholz, Yasaman Bahri, Jascha Sohl-Dickstein, Jeffrey Pennington: Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent. CoRR abs/1902.06720 (2019)
- [i14] Greg Yang, Jeffrey Pennington, Vinay Rao, Jascha Sohl-Dickstein, Samuel S. Schoenholz: A Mean Field Theory of Batch Normalization. CoRR abs/1902.08129 (2019)
- [i13] Roman Novak, Lechao Xiao, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Sohl-Dickstein, Samuel S. Schoenholz: Neural Tangents: Fast and Easy Infinite Neural Networks in Python. CoRR abs/1912.02803 (2019)
- [i12] Lechao Xiao, Jeffrey Pennington, Samuel S. Schoenholz: Disentangling trainability and generalization in deep learning. CoRR abs/1912.13053 (2019)
- 2018
- [c12] Jeffrey Pennington, Samuel S. Schoenholz, Surya Ganguli: The emergence of spectral universality in deep networks. AISTATS 2018: 1924-1932
- [c11] Ekin Dogus Cubuk, Barret Zoph, Samuel S. Schoenholz, Quoc V. Le: Intriguing Properties of Adversarial Examples. ICLR (Workshop) 2018
- [c10] Justin Gilmer, Luke Metz, Fartash Faghri, Samuel S. Schoenholz, Maithra Raghu, Martin Wattenberg, Ian J. Goodfellow: Adversarial Spheres. ICLR (Workshop) 2018
- [c9] Jaehoon Lee, Yasaman Bahri, Roman Novak, Samuel S. Schoenholz, Jeffrey Pennington, Jascha Sohl-Dickstein: Deep Neural Networks as Gaussian Processes. ICLR (Poster) 2018
- [c8] Minmin Chen, Jeffrey Pennington, Samuel S. Schoenholz: Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks. ICML 2018: 872-881
- [c7] Lechao Xiao, Yasaman Bahri, Jascha Sohl-Dickstein, Samuel S. Schoenholz, Jeffrey Pennington: Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks. ICML 2018: 5389-5398
- [i11] Justin Gilmer, Luke Metz, Fartash Faghri, Samuel S. Schoenholz, Maithra Raghu, Martin Wattenberg, Ian J. Goodfellow: Adversarial Spheres. CoRR abs/1801.02774 (2018)
- [i10] Jeffrey Pennington, Samuel S. Schoenholz, Surya Ganguli: The Emergence of Spectral Universality in Deep Networks. CoRR abs/1802.09979 (2018)
- [i9] Lechao Xiao, Yasaman Bahri, Jascha Sohl-Dickstein, Samuel S. Schoenholz, Jeffrey Pennington: Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks. CoRR abs/1806.05393 (2018)
- [i8] Minmin Chen, Jeffrey Pennington, Samuel S. Schoenholz: Dynamical Isometry and a Mean Field Theory of RNNs: Gating Enables Signal Propagation in Recurrent Neural Networks. CoRR abs/1806.05394 (2018)
- 2017
- [c6] Justin Gilmer, Colin Raffel, Samuel S. Schoenholz, Maithra Raghu, Jascha Sohl-Dickstein: Explaining the Learning Dynamics of Direct Feedback Alignment. ICLR (Workshop) 2017
- [c5] Samuel S. Schoenholz, Justin Gilmer, Surya Ganguli, Jascha Sohl-Dickstein: Deep Information Propagation. ICLR (Poster) 2017
- [c4] Justin Gilmer, Samuel S. Schoenholz, Patrick F. Riley, Oriol Vinyals, George E. Dahl: Neural Message Passing for Quantum Chemistry. ICML 2017: 1263-1272
- [c3] Jeffrey Pennington, Samuel S. Schoenholz, Surya Ganguli: Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice. NIPS 2017: 4785-4795
- [c2] Greg Yang, Samuel S. Schoenholz: Mean Field Residual Networks: On the Edge of Chaos. NIPS 2017: 7103-7114
- [i7] Justin Gilmer, Samuel S. Schoenholz, Patrick F. Riley, Oriol Vinyals, George E. Dahl: Neural Message Passing for Quantum Chemistry. CoRR abs/1704.01212 (2017)
- [i6] Samuel S. Schoenholz, Jeffrey Pennington, Jascha Sohl-Dickstein: A Correspondence Between Random Neural Networks and Statistical Field Theory. CoRR abs/1710.06570 (2017)
- [i5] Jaehoon Lee, Yasaman Bahri, Roman Novak, Samuel S. Schoenholz, Jeffrey Pennington, Jascha Sohl-Dickstein: Deep Neural Networks as Gaussian Processes. CoRR abs/1711.00165 (2017)
- [i4] Ekin Dogus Cubuk, Barret Zoph, Samuel S. Schoenholz, Quoc V. Le: Intriguing Properties of Adversarial Examples. CoRR abs/1711.02846 (2017)
- [i3] Jeffrey Pennington, Samuel S. Schoenholz, Surya Ganguli: Resurrecting the sigmoid in deep learning through dynamical isometry: theory and practice. CoRR abs/1711.04735 (2017)
- [i2] Greg Yang, Samuel S. Schoenholz: Mean Field Residual Networks: On the Edge of Chaos. CoRR abs/1712.08969 (2017)
- 2016
- [i1] Samuel S. Schoenholz, Justin Gilmer, Surya Ganguli, Jascha Sohl-Dickstein: Deep Information Propagation. CoRR abs/1611.01232 (2016)
2000 – 2009
- 2009
- [c1] M. Ani Hsieh, Ádám M. Halász, Ekin Dogus Cubuk, Samuel S. Schoenholz, Alcherio Martinoli: Specialization as an optimal strategy under varying external conditions. ICRA 2009: 1941-1946

last updated on 2022-12-08 23:21 CET by the dblp team
all metadata released as open data under CC0 1.0 license