Andrew Gordon Wilson
Person information
- affiliation: New York University, New York, NY, USA
- affiliation (former): Cornell University, Ithaca, NY, USA
- affiliation (former): Carnegie Mellon University, Machine Learning Department, Pittsburgh, PA, USA
- affiliation (former): University of Cambridge, Department of Engineering, UK
2020 – today
- 2024
- [j3]Gianluca Detommaso, Alberto Gasparin, Michele Donini, Matthias W. Seeger, Andrew Gordon Wilson, Cédric Archambeau:
Fortuna: A Library for Uncertainty Quantification in Deep Learning. J. Mach. Learn. Res. 25: 238:1-238:7 (2024) - [c106]Tim G. J. Rudner, Ya Shi Zhang, Andrew Gordon Wilson, Julia Kempe:
Mind the GAP: Improving Robustness to Subpopulation Shifts with Group-Aware Priors. AISTATS 2024: 127-135 - [c105]Nate Gruver, Anuroop Sriram, Andrea Madotto, Andrew Gordon Wilson, C. Lawrence Zitnick, Zachary W. Ulissi:
Fine-Tuned Language Models Generate Stable Inorganic Materials as Text. ICLR 2024 - [c104]Yucen Lily Li, Tim G. J. Rudner, Andrew Gordon Wilson:
A Study of Bayesian Neural Network Surrogates for Bayesian Optimization. ICLR 2024 - [c103]Alan Nawzad Amin, Andrew Gordon Wilson:
Scalable and Flexible Causal Discovery with an Efficient Test for Adjacency. ICML 2024 - [c102]Micah Goldblum, Marc Anton Finzi, Keefer Rowan, Andrew Gordon Wilson:
Position: The No Free Lunch Theorem, Kolmogorov Complexity, and the Role of Inductive Biases in Machine Learning. ICML 2024 - [c101]Samuel Lavoie, Polina Kirichenko, Mark Ibrahim, Mido Assran, Andrew Gordon Wilson, Aaron C. Courville, Nicolas Ballas:
Modeling Caption Diversity in Contrastive Vision-Language Pretraining. ICML 2024 - [c100]Sanae Lotfi, Marc Anton Finzi, Yilun Kuang, Tim G. J. Rudner, Micah Goldblum, Andrew Gordon Wilson:
Non-Vacuous Generalization Bounds for Large Language Models. ICML 2024 - [c99]Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David B. Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, José Miguel Hernández-Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A. Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang:
Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI. ICML 2024 - [c98]Hoang Phan, Andrew Gordon Wilson, Qi Lei:
Controllable Prompt Tuning For Balancing Group Distributional Robustness. ICML 2024 - [c97]Shikai Qiu, Boran Han, Danielle C. Maddix, Shuai Zhang, Bernie Wang, Andrew Gordon Wilson:
Transferring Knowledge From Large Foundation Models to Small Downstream Models. ICML 2024 - [c96]Shikai Qiu, Andres Potapczynski, Marc Anton Finzi, Micah Goldblum, Andrew Gordon Wilson:
Compute Better Spent: Replacing Dense Layers with Structured Matrices. ICML 2024 - [i115]Polina Kirichenko, Mark Ibrahim, Randall Balestriero, Diane Bouchacourt, Ramakrishna Vedantam, Hamed Firooz, Andrew Gordon Wilson:
Understanding the Detrimental Class-level Effects of Data Augmentation. CoRR abs/2401.01764 (2024) - [i114]Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David B. Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, José Miguel Hernández-Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A. Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang:
Position Paper: Bayesian Deep Learning in the Age of Large-Scale AI. CoRR abs/2402.00809 (2024) - [i113]Nate Gruver, Anuroop Sriram, Andrea Madotto, Andrew Gordon Wilson, C. Lawrence Zitnick, Zachary W. Ulissi:
Fine-Tuned Language Models Generate Stable Inorganic Materials as Text. CoRR abs/2402.04379 (2024) - [i112]Hoang Phan, Andrew Gordon Wilson, Qi Lei:
Controllable Prompt Tuning For Balancing Group Distributional Robustness. CoRR abs/2403.02695 (2024) - [i111]Abdul Fatir Ansari, Lorenzo Stella, Ali Caner Türkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda-Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang:
Chronos: Learning the Language of Time Series. CoRR abs/2403.07815 (2024) - [i110]Tim G. J. Rudner, Ya Shi Zhang, Andrew Gordon Wilson, Julia Kempe:
Mind the GAP: Improving Robustness to Subpopulation Shifts with Group-Aware Priors. CoRR abs/2403.09869 (2024) - [i109]Hossein Souri, Arpit Bansal, Hamid Kazemi, Liam Fowl, Aniruddha Saha, Jonas Geiping, Andrew Gordon Wilson, Rama Chellappa, Tom Goldstein, Micah Goldblum:
Generating Potent Poisons and Backdoors from Scratch with Guided Diffusion. CoRR abs/2403.16365 (2024) - [i108]Samuel Lavoie, Polina Kirichenko, Mark Ibrahim, Mahmoud Assran, Andrew Gordon Wilson, Aaron C. Courville, Nicolas Ballas:
Modeling Caption Diversity in Contrastive Vision-Language Pretraining. CoRR abs/2405.00740 (2024) - [i107]Shikai Qiu, Andres Potapczynski, Marc Finzi, Micah Goldblum, Andrew Gordon Wilson:
Compute Better Spent: Replacing Dense Layers with Structured Matrices. CoRR abs/2406.06248 (2024) - [i106]Shikai Qiu, Boran Han, Danielle C. Maddix, Shuai Zhang, Yuyang Wang, Andrew Gordon Wilson:
Transferring Knowledge from Large Foundation Models to Small Downstream Models. CoRR abs/2406.07337 (2024) - [i105]Sanyam Kapoor, Nate Gruver, Manley Roberts, Katherine M. Collins, Arka Pal, Umang Bhatt, Adrian Weller, Samuel Dooley, Micah Goldblum, Andrew Gordon Wilson:
Large Language Models Must Be Taught to Know What They Don't Know. CoRR abs/2406.08391 (2024) - [i104]Alan Nawzad Amin, Andrew Gordon Wilson:
Scalable and Flexible Causal Discovery with an Efficient Test for Adjacency. CoRR abs/2406.09177 (2024) - [i103]Ravid Shwartz-Ziv, Micah Goldblum, Arpit Bansal, C. Bayan Bruss, Yann LeCun, Andrew Gordon Wilson:
Just How Flexible are Neural Networks in Practice? CoRR abs/2406.11463 (2024) - [i102]Sanae Lotfi, Yilun Kuang, Brandon Amos, Micah Goldblum, Marc Finzi, Andrew Gordon Wilson:
Unlocking Tokens as Data Points for Generalization Bounds on Larger Language Models. CoRR abs/2407.18158 (2024) - [i101]Andres Potapczynski, Shikai Qiu, Marc Finzi, Christopher Ferri, Zixi Chen, Micah Goldblum, C. Bayan Bruss, Christopher De Sa, Andrew Gordon Wilson:
Searching for Efficient Linear Layers over a Continuous Space of Structured Matrices. CoRR abs/2410.02117 (2024)
- 2023
- [c95]Samuel Stanton, Wesley J. Maddox, Andrew Gordon Wilson:
Bayesian Optimization with Conformal Prediction Sets. AISTATS 2023: 959-986 - [c94]Rami Aly, Xingjian Shi, Kaixiang Lin, Aston Zhang, Andrew Gordon Wilson:
Automated Few-Shot Classification with Instruction-Finetuned Language Models. EMNLP (Findings) 2023: 2414-2432 - [c93]Marc Anton Finzi, Andres Potapczynski, Matthew Choptuik, Andrew Gordon Wilson:
A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks. ICLR 2023 - [c92]Jonas Geiping, Micah Goldblum, Gowthami Somepalli, Ravid Shwartz-Ziv, Tom Goldstein, Andrew Gordon Wilson:
How Much Data Are Augmentations Worth? An Investigation into Scaling Laws, Invariance, and Implicit Regularization. ICLR 2023 - [c91]Nate Gruver, Marc Anton Finzi, Micah Goldblum, Andrew Gordon Wilson:
The Lie Derivative for Measuring Learned Equivariance. ICLR 2023 - [c90]Polina Kirichenko, Pavel Izmailov, Andrew Gordon Wilson:
Last Layer Re-Training is Sufficient for Robustness to Spurious Correlations. ICLR 2023 - [c89]Roman Levin, Valeriia Cherepanova, Avi Schwarzschild, Arpit Bansal, C. Bayan Bruss, Tom Goldstein, Andrew Gordon Wilson, Micah Goldblum:
Transfer Learning with Deep Tabular Models. ICLR 2023 - [c88]Zichang Liu, Zhiqiang Tang, Xingjian Shi, Aston Zhang, Mu Li, Anshumali Shrivastava, Andrew Gordon Wilson:
Learning Multimodal Data Augmentation in Feature Space. ICLR 2023 - [c87]Marc Anton Finzi, Anudhyan Boral, Andrew Gordon Wilson, Fei Sha, Leonardo Zepeda-Núñez:
User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems. ICML 2023: 10136-10152 - [c86]Shikai Qiu, Andres Potapczynski, Pavel Izmailov, Andrew Gordon Wilson:
Simple and Fast Group Robustness by Automatic Feature Reweighting. ICML 2023: 28448-28467 - [c85]Tim G. J. Rudner, Sanyam Kapoor, Shikai Qiu, Andrew Gordon Wilson:
Function-Space Regularization in Neural Networks: A Probabilistic Perspective. ICML 2023: 29275-29290 - [c84]Valeriia Cherepanova, Roman Levin, Gowthami Somepalli, Jonas Geiping, C. Bayan Bruss, Andrew Gordon Wilson, Tom Goldstein, Micah Goldblum:
A Performance-Driven Benchmark for Feature Selection in Tabular Deep Learning. NeurIPS 2023 - [c83]Micah Goldblum, Hossein Souri, Renkun Ni, Manli Shu, Viraj Prabhu, Gowthami Somepalli, Prithvijit Chattopadhyay, Mark Ibrahim, Adrien Bardes, Judy Hoffman, Rama Chellappa, Andrew Gordon Wilson, Tom Goldstein:
Battle of the Backbones: A Large-Scale Comparison of Pretrained Models across Computer Vision Tasks. NeurIPS 2023 - [c82]Nate Gruver, Marc Finzi, Shikai Qiu, Andrew Gordon Wilson:
Large Language Models Are Zero-Shot Time Series Forecasters. NeurIPS 2023 - [c81]Nate Gruver, Samuel Stanton, Nathan C. Frey, Tim G. J. Rudner, Isidro Hötzel, Julien Lafrance-Vanasse, Arvind Rajpal, Kyunghyun Cho, Andrew Gordon Wilson:
Protein Design with Guided Discrete Diffusion. NeurIPS 2023 - [c80]Polina Kirichenko, Mark Ibrahim, Randall Balestriero, Diane Bouchacourt, Shanmukha Ramakrishna Vedantam, Hamed Firooz, Andrew Gordon Wilson:
Understanding the detrimental class-level effects of data augmentation. NeurIPS 2023 - [c79]Andres Potapczynski, Marc Finzi, Geoff Pleiss, Andrew Gordon Wilson:
CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra. NeurIPS 2023 - [c78]Shikai Qiu, Tim G. J. Rudner, Sanyam Kapoor, Andrew Gordon Wilson:
Should We Learn Most Likely Functions or Parameters? NeurIPS 2023 - [c77]Ravid Shwartz-Ziv, Micah Goldblum, Yucen Lily Li, C. Bayan Bruss, Andrew Gordon Wilson:
Simplifying Neural Network Training Under Class Imbalance. NeurIPS 2023 - [c76]Ying Wang, Tim G. J. Rudner, Andrew Gordon Wilson:
Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution. NeurIPS 2023 - [i100]Gianluca Detommaso, Alberto Gasparin, Michele Donini, Matthias W. Seeger, Andrew Gordon Wilson, Cédric Archambeau:
Fortuna: A Library for Uncertainty Quantification in Deep Learning. CoRR abs/2302.04019 (2023) - [i99]Micah Goldblum, Marc Finzi, Keefer Rowan, Andrew Gordon Wilson:
The No Free Lunch Theorem, Kolmogorov Complexity, and the Role of Inductive Biases in Machine Learning. CoRR abs/2304.05366 (2023) - [i98]Randall Balestriero, Mark Ibrahim, Vlad Sobal, Ari Morcos, Shashank Shekhar, Tom Goldstein, Florian Bordes, Adrien Bardes, Grégoire Mialon, Yuandong Tian, Avi Schwarzschild, Andrew Gordon Wilson, Jonas Geiping, Quentin Garrido, Pierre Fernandez, Amir Bar, Hamed Pirsiavash, Yann LeCun, Micah Goldblum:
A Cookbook of Self-Supervised Learning. CoRR abs/2304.12210 (2023) - [i97]Marc Finzi, Andres Potapczynski, Matthew Choptuik, Andrew Gordon Wilson:
A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks. CoRR abs/2304.14994 (2023) - [i96]Rami Aly, Xingjian Shi, Kaixiang Lin, Aston Zhang, Andrew Gordon Wilson:
Automated Few-shot Classification with Instruction-Finetuned Language Models. CoRR abs/2305.12576 (2023) - [i95]Nate Gruver, Samuel Stanton, Nathan C. Frey, Tim G. J. Rudner, Isidro Hötzel, Julien Lafrance-Vanasse, Arvind Rajpal, Kyunghyun Cho, Andrew Gordon Wilson:
Protein Design with Guided Discrete Diffusion. CoRR abs/2305.20009 (2023) - [i94]Yucen Lily Li, Tim G. J. Rudner, Andrew Gordon Wilson:
A Study of Bayesian Neural Network Surrogates for Bayesian Optimization. CoRR abs/2305.20028 (2023) - [i93]Marc Finzi, Anudhyan Boral, Andrew Gordon Wilson, Fei Sha, Leonardo Zepeda-Núñez:
User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems. CoRR abs/2306.07526 (2023) - [i92]Shikai Qiu, Andres Potapczynski, Pavel Izmailov, Andrew Gordon Wilson:
Simple and Fast Group Robustness by Automatic Feature Reweighting. CoRR abs/2306.11074 (2023) - [i91]Andres Potapczynski, Marc Finzi, Geoff Pleiss, Andrew Gordon Wilson:
CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra. CoRR abs/2309.03060 (2023) - [i90]Nate Gruver, Marc Finzi, Shikai Qiu, Andrew Gordon Wilson:
Large Language Models Are Zero-Shot Time Series Forecasters. CoRR abs/2310.07820 (2023) - [i89]Micah Goldblum, Hossein Souri, Renkun Ni, Manli Shu, Viraj Prabhu, Gowthami Somepalli, Prithvijit Chattopadhyay, Mark Ibrahim, Adrien Bardes, Judy Hoffman, Rama Chellappa, Andrew Gordon Wilson, Tom Goldstein:
Battle of the Backbones: A Large-Scale Comparison of Pretrained Models across Computer Vision Tasks. CoRR abs/2310.19909 (2023) - [i88]Valeriia Cherepanova, Roman Levin, Gowthami Somepalli, Jonas Geiping, C. Bayan Bruss, Andrew Gordon Wilson, Tom Goldstein, Micah Goldblum:
A Performance-Driven Benchmark for Feature Selection in Tabular Deep Learning. CoRR abs/2311.05877 (2023) - [i87]Shikai Qiu, Tim G. J. Rudner, Sanyam Kapoor, Andrew Gordon Wilson:
Should We Learn Most Likely Functions or Parameters? CoRR abs/2311.15990 (2023) - [i86]Ravid Shwartz-Ziv, Micah Goldblum, Yucen Lily Li, C. Bayan Bruss, Andrew Gordon Wilson:
Simplifying Neural Network Training Under Class Imbalance. CoRR abs/2312.02517 (2023) - [i85]Yanjun Liu, Milena Jovanovic, Krishnanand Mallayya, Wesley J. Maddox, Andrew Gordon Wilson, Sebastian Klemenz, Leslie M. Schoop, Eun-Ah Kim:
Materials Expert-Artificial Intelligence for Materials Discovery. CoRR abs/2312.02796 (2023) - [i84]Micah Goldblum, Anima Anandkumar, Richard G. Baraniuk, Tom Goldstein, Kyunghyun Cho, Zachary C. Lipton, Melanie Mitchell, Preetum Nakkiran, Max Welling, Andrew Gordon Wilson:
Perspectives on the State and Future of Deep Learning - 2023. CoRR abs/2312.09323 (2023) - [i83]Tim G. J. Rudner, Sanyam Kapoor, Shikai Qiu, Andrew Gordon Wilson:
Function-Space Regularization in Neural Networks: A Probabilistic Perspective. CoRR abs/2312.17162 (2023) - [i82]Sanae Lotfi, Marc Finzi, Yilun Kuang, Tim G. J. Rudner, Micah Goldblum, Andrew Gordon Wilson:
Non-Vacuous Generalization Bounds for Large Language Models. CoRR abs/2312.17173 (2023) - [i81]Ying Wang, Tim G. J. Rudner, Andrew Gordon Wilson:
Visual Explanations of Image-Text Representations via Multi-Modal Information Bottleneck Attribution. CoRR abs/2312.17174 (2023)
- 2022
- [c75]Nate Gruver, Marc Anton Finzi, Samuel Don Stanton, Andrew Gordon Wilson:
Deconstructing the Inductive Biases of Hamiltonian Neural Networks. ICLR 2022 - [c74]Gregory W. Benton, Wesley J. Maddox, Andrew Gordon Wilson:
Volatility Based Kernels and Moving Average Means for Accurate Forecasting with Gaussian Processes. ICML 2022: 1798-1816 - [c73]Sanae Lotfi, Pavel Izmailov, Gregory W. Benton, Micah Goldblum, Andrew Gordon Wilson:
Bayesian Model Selection, the Marginal Likelihood, and Generalization. ICML 2022: 14223-14247 - [c72]Samuel Stanton, Wesley J. Maddox, Nate Gruver, Phillip M. Maffettone, Emily Delaney, Peyton Greenside, Andrew Gordon Wilson:
Accelerating Bayesian Optimization for Biological Sequence Design with Denoising Autoencoders. ICML 2022: 20459-20478 - [c71]Ruqi Zhang, Andrew Gordon Wilson, Christopher De Sa:
Low-Precision Stochastic Gradient Langevin Dynamics. ICML 2022: 26624-26644 - [c70]Pavel Izmailov, Polina Kirichenko, Nate Gruver, Andrew Gordon Wilson:
On Feature Learning in the Presence of Spurious Correlations. NeurIPS 2022 - [c69]Sanyam Kapoor, Wesley J. Maddox, Pavel Izmailov, Andrew Gordon Wilson:
On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification. NeurIPS 2022 - [c68]Sanae Lotfi, Marc Finzi, Sanyam Kapoor, Andres Potapczynski, Micah Goldblum, Andrew Gordon Wilson:
PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization. NeurIPS 2022 - [c67]Ravid Shwartz-Ziv, Micah Goldblum, Hossein Souri, Sanyam Kapoor, Chen Zhu, Yann LeCun, Andrew Gordon Wilson:
Pre-Train Your Loss: Easy Bayesian Transfer Learning with Informative Priors. NeurIPS 2022 - [c66]Wanqian Yang, Polina Kirichenko, Micah Goldblum, Andrew Gordon Wilson:
Chroma-VAE: Mitigating Shortcut Learning with Generative Classifiers. NeurIPS 2022 - [c65]Wesley J. Maddox, Andres Potapczynski, Andrew Gordon Wilson:
Low-precision arithmetic for fast Gaussian processes. UAI 2022: 1306-1316 - [i80]Nate Gruver, Marc Finzi, Samuel Stanton, Andrew Gordon Wilson:
Deconstructing the Inductive Biases of Hamiltonian Neural Networks. CoRR abs/2202.04836 (2022) - [i79]Sanae Lotfi, Pavel Izmailov, Gregory W. Benton, Micah Goldblum, Andrew Gordon Wilson:
Bayesian Model Selection, the Marginal Likelihood, and Generalization. CoRR abs/2202.11678 (2022) - [i78]Samuel Stanton, Wesley J. Maddox, Nate Gruver, Phillip M. Maffettone, Emily Delaney, Peyton Greenside, Andrew Gordon Wilson:
Accelerating Bayesian Optimization for Biological Sequence Design with Denoising Autoencoders. CoRR abs/2203.12742 (2022) - [i77]Sanyam Kapoor, Wesley J. Maddox, Pavel Izmailov, Andrew Gordon Wilson:
On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification. CoRR abs/2203.16481 (2022) - [i76]Polina Kirichenko, Pavel Izmailov, Andrew Gordon Wilson:
Last Layer Re-Training is Sufficient for Robustness to Spurious Correlations. CoRR abs/2204.02937 (2022) - [i75]Ravid Shwartz-Ziv, Micah Goldblum, Hossein Souri, Sanyam Kapoor, Chen Zhu, Yann LeCun, Andrew Gordon Wilson:
Pre-Train Your Loss: Easy Bayesian Transfer Learning with Informative Priors. CoRR abs/2205.10279 (2022) - [i74]Ruqi Zhang, Andrew Gordon Wilson, Christopher De Sa:
Low-Precision Stochastic Gradient Langevin Dynamics. CoRR abs/2206.09909 (2022) - [i73]Roman Levin, Valeriia Cherepanova, Avi Schwarzschild, Arpit Bansal, C. Bayan Bruss, Tom Goldstein, Andrew Gordon Wilson, Micah Goldblum:
Transfer Learning with Deep Tabular Models. CoRR abs/2206.15306 (2022) - [i72]Gregory W. Benton, Wesley J. Maddox, Andrew Gordon Wilson:
Volatility Based Kernels and Moving Average Means for Accurate Forecasting with Gaussian Processes. CoRR abs/2207.06544 (2022) - [i71]Wesley J. Maddox, Andres Potapczynski, Andrew Gordon Wilson:
Low-Precision Arithmetic for Fast Gaussian Processes. CoRR abs/2207.06856 (2022) - [i70]Nate Gruver, Marc Finzi, Micah Goldblum, Andrew Gordon Wilson:
The Lie Derivative for Measuring Learned Equivariance. CoRR abs/2210.02984 (2022) - [i69]Jonas Geiping, Micah Goldblum, Gowthami Somepalli, Ravid Shwartz-Ziv, Tom Goldstein, Andrew Gordon Wilson:
How Much Data Are Augmentations Worth? An Investigation into Scaling Laws, Invariance, and Implicit Regularization. CoRR abs/2210.06441 (2022) - [i68]Pavel Izmailov, Polina Kirichenko, Nate Gruver, Andrew Gordon Wilson:
On Feature Learning in the Presence of Spurious Correlations. CoRR abs/2210.11369 (2022) - [i67]Samuel Stanton, Wesley J. Maddox, Andrew Gordon Wilson:
Bayesian Optimization with Conformal Coverage Guarantees. CoRR abs/2210.12496 (2022) - [i66]Renkun Ni, Ping-yeh Chiang, Jonas Geiping, Micah Goldblum, Andrew Gordon Wilson, Tom Goldstein:
K-SAM: Sharpness-Aware Minimization at the Speed of SGD. CoRR abs/2210.12864 (2022) - [i65]Sanae Lotfi, Marc Finzi, Sanyam Kapoor, Andres Potapczynski, Micah Goldblum, Andrew Gordon Wilson:
PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization. CoRR abs/2211.13609 (2022) - [i64]Wanqian Yang, Polina Kirichenko, Micah Goldblum, Andrew Gordon Wilson:
Chroma-VAE: Mitigating Shortcut Learning with Generative Classifiers. CoRR abs/2211.15231 (2022) - [i63]Amin Ghiasi, Hamid Kazemi, Eitan Borgnia, Steven Reich, Manli Shu, Micah Goldblum, Andrew Gordon Wilson, Tom Goldstein:
What do Vision Transformers Learn? A Visual Exploration. CoRR abs/2212.06727 (2022) - [i62]Zichang Liu, Zhiqiang Tang, Xingjian Shi, Aston Zhang, Mu Li, Anshumali Shrivastava, Andrew Gordon Wilson:
Learning Multimodal Data Augmentation in Feature Space. CoRR abs/2212.14453 (2022) - 2021
- [c64]Wesley J. Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas C. Damianou:
Fast Adaptation with Linearized Neural Networks. AISTATS 2021: 2737-2745 - [c63]Samuel Stanton, Wesley J. Maddox, Ian A. Delbridge, Andrew Gordon Wilson:
Kernel Interpolation for Scalable Online Gaussian Processes. AISTATS 2021: 3133-3141 - [c62]Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson:
Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling. ICML 2021: 769-779 - [c61]Marc Finzi, Max Welling, Andrew Gordon Wilson:
A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups. ICML 2021: 3318-3328 - [c60]Pavel Izmailov, Sharad Vikram, Matthew D. Hoffman, Andrew Gordon Wilson:
What Are Bayesian Neural Network Posteriors Really Like? ICML 2021: 4629-4640 - [c59]Sanyam Kapoor, Marc Finzi, Ke Alexander Wang, Andrew Gordon Wilson:
SKIing on Simplices: Kernel Interpolation on the Permutohedral Lattice for Scalable Gaussian Processes. ICML 2021: 5279-5289 - [c58]Shengyang Sun, Jiaxin Shi, Andrew Gordon Wilson, Roger B. Grosse:
Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition. ICML 2021: 9955-9965 - [c57]Brandon Amos, Samuel Stanton, Denis Yarats, Andrew Gordon Wilson:
On the model-based stochastic value gradient for continuous reinforcement learning. L4DC 2021: 6-20 - [c56]Andrew Gordon Wilson, Pavel Izmailov, Matthew D. Hoffman, Yarin Gal, Yingzhen Li, Melanie F. Pradier, Sharad Vikram, Andrew Y. K. Foong, Sanae Lotfi, Sebastian Farquhar:
Evaluating Approximate Inference in Bayesian Deep Learning. NeurIPS (Competition and Demos) 2021: 113-124 - [c55]Pavel Izmailov, Patrick Nicholson, Sanae Lotfi, Andrew Gordon Wilson:
Dangers of Bayesian Model Averaging under Covariate Shift. NeurIPS 2021: 3309-3322 - [c54]Wesley J. Maddox, Samuel Stanton, Andrew Gordon Wilson:
Conditioning Sparse Variational Gaussian Processes for Online Decision-making. NeurIPS 2021: 6365-6379 - [c53]Samuel Stanton, Pavel Izmailov, Polina Kirichenko, Alexander A. Alemi, Andrew Gordon Wilson:
Does Knowledge Distillation Really Work? NeurIPS 2021: 6906-6919 - [c52]Wesley J. Maddox, Maximilian Balandat, Andrew Gordon Wilson, Eytan Bakshy:
Bayesian Optimization with High-Dimensional Outputs. NeurIPS 2021: 19274-19287 - [c51]Marc Finzi, Greg Benton, Andrew Gordon Wilson:
Residual Pathway Priors for Soft Equivariance Constraints. NeurIPS 2021: 30037-30049 - [i61]Gregory W. Benton, Wesley J. Maddox, Sanae Lotfi, Andrew Gordon Wilson:
Loss Surface Simplexes for Mode Connecting Volumes and Fast Ensembling. CoRR abs/2102.13042 (2021) - [i60]Wesley J. Maddox, Shuai Tang, Pablo Garcia Moreno, Andrew Gordon Wilson, Andreas C. Damianou:
Fast Adaptation with Linearized Neural Networks. CoRR abs/2103.01439 (2021) - [i59]Samuel Stanton, Wesley J. Maddox, Ian A. Delbridge, Andrew Gordon Wilson:
Kernel Interpolation for Scalable Online Gaussian Processes. CoRR abs/2103.01454 (2021) - [i58]