
Daniel Soudry
2020 – today
- 2020
- [j12] Zhihui Zhu, Daniel Soudry, Yonina C. Eldar, Michael B. Wakin: The Global Optimization Geometry of Shallow Linear Neural Networks. J. Math. Imaging Vis. 62(3): 279-292 (2020)
- [c25] Blake E. Woodworth, Suriya Gunasekar, Jason D. Lee, Edward Moroshko, Pedro Savarese, Itay Golan, Daniel Soudry, Nathan Srebro: Kernel and Rich Regimes in Overparametrized Models. COLT 2020: 3635-3673
- [c24] Elad Hoffer, Tal Ben-Nun, Itay Hubara, Niv Giladi, Torsten Hoefler, Daniel Soudry: Augment Your Batch: Improving Generalization Through Instance Repetition. CVPR 2020: 8126-8135
- [c23] Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry: The Knowledge Within: Methods for Data-Free Model Compression. CVPR 2020: 8491-8499
- [c22] Niv Giladi, Mor Shpigel Nacson, Elad Hoffer, Daniel Soudry: At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks? ICLR 2020
- [c21] Greg Ongie, Rebecca Willett, Daniel Soudry, Nathan Srebro: A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case. ICLR 2020
- [c20] Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry: Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization? ICML 2020: 960-969
- [c19] Edward Moroshko, Blake E. Woodworth, Suriya Gunasekar, Jason D. Lee, Nati Srebro, Daniel Soudry: Implicit Bias in Deep Linear Classification: Initialization Scale vs Training Accuracy. NeurIPS 2020
- [i31] Blake E. Woodworth, Suriya Gunasekar, Jason D. Lee, Edward Moroshko, Pedro Savarese, Itay Golan, Daniel Soudry, Nathan Srebro: Kernel and Rich Regimes in Overparametrized Models. CoRR abs/2002.09277 (2020)
- [i30] Brian Chmiel, Liad Ben-Uri, Moran Shkolnik, Elad Hoffer, Ron Banner, Daniel Soudry: Neural gradients are lognormally distributed: understanding sparse and quantized training. CoRR abs/2006.08173 (2020)
- [i29] Itay Hubara, Yury Nahshan, Yair Hanani, Ron Banner, Daniel Soudry: Improving Post Training Neural Quantization: Layer-wise Calibration and Integer Programming. CoRR abs/2006.10518 (2020)
- [i28] Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry: Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization? CoRR abs/2007.01038 (2020)
- [i27] Edward Moroshko, Suriya Gunasekar, Blake E. Woodworth, Jason D. Lee, Nathan Srebro, Daniel Soudry: Implicit Bias in Deep Linear Classification: Initialization Scale vs Training Accuracy. CoRR abs/2007.06738 (2020)
- [i26] Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry: Task Agnostic Continual Learning Using Online Variational Bayes with Fixed-Point Updates. CoRR abs/2010.00373 (2020)
2010 – 2019
- 2019
- [c18] Mor Shpigel Nacson, Nathan Srebro, Daniel Soudry: Stochastic Gradient Descent on Separable Data: Exact Convergence with a Fixed Learning Rate. AISTATS 2019: 3051-3059
- [c17] Mor Shpigel Nacson, Jason D. Lee, Suriya Gunasekar, Pedro Henrique Pamplona Savarese, Nathan Srebro, Daniel Soudry: Convergence of Gradient Descent on Separable Data. AISTATS 2019: 3420-3428
- [c16] Pedro Savarese, Itay Evron, Daniel Soudry, Nathan Srebro: How do infinite width bounded norm networks look in function space? COLT 2019: 2667-2690
- [c15] Mor Shpigel Nacson, Suriya Gunasekar, Jason D. Lee, Nathan Srebro, Daniel Soudry: Lexicographic and Depth-Sensitive Margins in Homogeneous and Non-Homogeneous Deep Models. ICML 2019: 4683-4692
- [c14] Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry: A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off. NeurIPS 2019: 7036-7046
- [c13] Ron Banner, Yury Nahshan, Daniel Soudry: Post training 4-bit quantization of convolutional networks for rapid-deployment. NeurIPS 2019: 7948-7956
- [i25] Elad Hoffer, Tal Ben-Nun, Itay Hubara, Niv Giladi, Torsten Hoefler, Daniel Soudry: Augment your batch: better training with larger batches. CoRR abs/1901.09335 (2019)
- [i24] Pedro Savarese, Itay Evron, Daniel Soudry, Nathan Srebro: How do infinite width bounded norm networks look in function space? CoRR abs/1902.05040 (2019)
- [i23] Mor Shpigel Nacson, Suriya Gunasekar, Jason D. Lee, Nathan Srebro, Daniel Soudry: Lexicographic and Depth-Sensitive Margins in Homogeneous and Non-Homogeneous Deep Models. CoRR abs/1905.07325 (2019)
- [i22] Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry: A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off. CoRR abs/1906.00771 (2019)
- [i21] Elad Hoffer, Berry Weinstein, Itay Hubara, Tal Ben-Nun, Torsten Hoefler, Daniel Soudry: Mix & Match: training convnets with mixed image sizes for improved accuracy, speed and scale resiliency. CoRR abs/1908.08986 (2019)
- [i20] Niv Giladi, Mor Shpigel Nacson, Elad Hoffer, Daniel Soudry: At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks? CoRR abs/1909.12340 (2019)
- [i19] Greg Ongie, Rebecca Willett, Daniel Soudry, Nathan Srebro: A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case. CoRR abs/1910.01635 (2019)
- [i18] Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry: The Knowledge Within: Methods for Data-Free Model Compression. CoRR abs/1912.01274 (2019)
- [i17] Tzofnat Greenberg-Toledo, Ben Perach, Daniel Soudry, Shahar Kvatinsky: MTJ-Based Hardware Synapse Design for Quantized Deep Neural Networks. CoRR abs/1912.12636 (2019)
- 2018
- [j11] Daniel Soudry, Elad Hoffer, Mor Shpigel Nacson, Suriya Gunasekar, Nathan Srebro: The Implicit Bias of Gradient Descent on Separable Data. J. Mach. Learn. Res. 19: 70:1-70:57 (2018)
- [j10] Philippa J. Karoly, Levin Kuhlmann, Daniel Soudry, David B. Grayden, Mark J. Cook, Dean R. Freestone: Seizure pathways: A model-based investigation. PLoS Comput. Biol. 14(10) (2018)
- [c12] Elad Hoffer, Itay Hubara, Daniel Soudry: Fix your classifier: the marginal value of training the last weight layer. ICLR (Poster) 2018
- [c11] Daniel Soudry, Elad Hoffer: Exponentially vanishing sub-optimal local minima in multilayer neural networks. ICLR (Workshop) 2018
- [c10] Daniel Soudry, Elad Hoffer, Mor Shpigel Nacson, Nathan Srebro: The Implicit Bias of Gradient Descent on Separable Data. ICLR (Poster) 2018
- [c9] Suriya Gunasekar, Jason D. Lee, Daniel Soudry, Nathan Srebro: Characterizing Implicit Bias in Terms of Optimization Geometry. ICML 2018: 1827-1836
- [c8] Elad Hoffer, Ron Banner, Itay Golan, Daniel Soudry: Norm matters: efficient and accurate normalization schemes in deep networks. NeurIPS 2018: 2164-2174
- [c7] Ron Banner, Itay Hubara, Elad Hoffer, Daniel Soudry: Scalable methods for 8-bit training of neural networks. NeurIPS 2018: 5151-5159
- [c6] Suriya Gunasekar, Jason D. Lee, Daniel Soudry, Nati Srebro: Implicit Bias of Gradient Descent on Linear Convolutional Networks. NeurIPS 2018: 9482-9491
- [i16] Elad Hoffer, Itay Hubara, Daniel Soudry: Fix your classifier: the marginal value of training the last weight layer. CoRR abs/1801.04540 (2018)
- [i15] Elad Hoffer, Shai Fine, Daniel Soudry: On the Blindspots of Convolutional Networks. CoRR abs/1802.05187 (2018)
- [i14] Suriya Gunasekar, Jason D. Lee, Daniel Soudry, Nathan Srebro: Characterizing Implicit Bias in Terms of Optimization Geometry. CoRR abs/1802.08246 (2018)
- [i13] Elad Hoffer, Ron Banner, Itay Golan, Daniel Soudry: Norm matters: efficient and accurate normalization schemes in deep networks. CoRR abs/1803.01814 (2018)
- [i12] Mor Shpigel Nacson, Jason D. Lee, Suriya Gunasekar, Nathan Srebro, Daniel Soudry: Convergence of Gradient Descent on Separable Data. CoRR abs/1803.01905 (2018)
- [i11] Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry: Bayesian Gradient Descent: Online Variational Bayes Learning with Increased Robustness to Catastrophic Forgetting and Weight Pruning. CoRR abs/1803.10123 (2018)
- [i10] Zhihui Zhu, Daniel Soudry, Yonina C. Eldar, Michael B. Wakin: The Global Optimization Geometry of Shallow Linear Neural Networks. CoRR abs/1805.04938 (2018)
- [i9] Ron Banner, Itay Hubara, Elad Hoffer, Daniel Soudry: Scalable Methods for 8-bit Training of Neural Networks. CoRR abs/1805.11046 (2018)
- [i8] Suriya Gunasekar, Jason D. Lee, Daniel Soudry, Nathan Srebro: Implicit Bias of Gradient Descent on Linear Convolutional Networks. CoRR abs/1806.00468 (2018)
- [i7] Mor Shpigel Nacson, Nathan Srebro, Daniel Soudry: Stochastic Gradient Descent on Separable Data: Exact Convergence with a Fixed Learning Rate. CoRR abs/1806.01796 (2018)
- [i6] Ron Banner, Yury Nahshan, Elad Hoffer, Daniel Soudry: ACIQ: Analytical Clipping for Integer Quantization of neural networks. CoRR abs/1810.05723 (2018)
- 2017
- [j9] Itay Hubara, Matthieu Courbariaux, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio: Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations. J. Mach. Learn. Res. 18: 187:1-187:30 (2017)
- [j8] Johannes Friedrich, Weijian Yang, Daniel Soudry, Yu Mu, Misha Ahrens, Rafael Yuste, Darcy S. Peterka, Liam Paninski: Multi-scale approaches for high-speed imaging and analysis of large neural populations. PLoS Comput. Biol. 13(8) (2017)
- [c5] Elad Hoffer, Itay Hubara, Daniel Soudry: Train longer, generalize better: closing the generalization gap in large batch training of neural networks. NIPS 2017: 1731-1741
- [i5] Elad Hoffer, Itay Hubara, Daniel Soudry: Train longer, generalize better: closing the generalization gap in large batch training of neural networks. CoRR abs/1705.08741 (2017)
- [i4] Daniel Soudry, Elad Hoffer, Nathan Srebro: The Implicit Bias of Gradient Descent on Separable Data. CoRR abs/1710.10345 (2017)
- 2016
- [c4] Eyal Rosenthal, Sergey Greshnikov, Daniel Soudry, Shahar Kvatinsky: A fully analog memristor-based neural network with online gradient training. ISCAS 2016: 1394-1397
- [c3] Itay Hubara, Matthieu Courbariaux, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio: Binarized Neural Networks. NIPS 2016: 4107-4115
- [i3] Daniel Soudry, Yair Carmon: No bad local minima: Data independent training error guarantees for multilayer neural networks. CoRR abs/1605.08361 (2016)
- [i2] Itay Hubara, Matthieu Courbariaux, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio: Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations. CoRR abs/1609.07061 (2016)
- 2015
- [j7] Daniel Soudry, Suraj Keshri, Patrick Stinson, Min-hwan Oh, Garud Iyengar, Liam Paninski: Efficient "Shotgun" Inference of Neural Connectivity from Highly Sub-sampled Activity Data. PLoS Comput. Biol. 11(10) (2015)
- [j6] Daniel Soudry, Dotan Di Castro, Asaf Gal, Avinoam Kolodny, Shahar Kvatinsky: Memristor-Based Multilayer Neural Networks With Online Gradient Descent Training. IEEE Trans. Neural Networks Learn. Syst. 26(10): 2408-2421 (2015)
- [i1] Zhiyong Cheng, Daniel Soudry, Zexi Mao, Zhen-zhong Lan: Training Binary Multilayer Neural Networks for Image Classification using Expectation Backpropagation. CoRR abs/1503.03562 (2015)
- 2014
- [j5] Daniel Soudry, Ron Meir: The neuronal response at extended timescales: a linearized spiking input-output relation. Frontiers Comput. Neurosci. 8: 29 (2014)
- [j4] Daniel Soudry, Ron Meir: The neuronal response at extended timescales: long-term correlations without long-term memory. Frontiers Comput. Neurosci. 8: 35 (2014)
- [j3] Danilo Pezo, Daniel Soudry, Patricio Orio: Diffusion approximation-based simulation of stochastic ion channels: which method to use? Frontiers Comput. Neurosci. 8: 139 (2014)
- [c2] Daniel Soudry, Itay Hubara, Ron Meir: Expectation Backpropagation: Parameter-Free Training of Multilayer Neural Networks with Continuous or Discrete Weights. NIPS 2014: 963-971
- 2012
- [j2] Daniel Soudry, Ron Meir: Conductance-Based Neuron Models and the Slow Dynamics of Excitability. Frontiers Comput. Neurosci. 6: 4 (2012)
- [c1] Dmitri B. Chklovskii, Daniel Soudry: "Neuronal spike generation mechanism as an oversampling, noise-shaping A-to-D converter". NIPS 2012: 512-520
- 2010
- [j1] Daniel Soudry, Ron Meir: History-Dependent Dynamics in a Generic Model of Ion Channels - An Analytic Study. Frontiers Comput. Neurosci. 4: 3 (2010)
last updated on 2021-01-23 00:51 CET by the dblp team
all metadata released as open data under CC0 1.0 license