Selected Publications
Denoising Diffusion Models
- W. Deng, W. Luo, Y. Tan, M. Biloš, Y. Chen, Y. Nevmyvaka, T. Q. Chen. Variational Schrödinger Diffusion Models. ICML 2024.
- W. Deng, Y. Chen, T. Yang, H. Du, Q. Feng, T. Q. Chen. Reflected Schrödinger Bridge for Constrained Generative Modeling. UAI 2024 (Oral, acceptance rate 3.8%).
- Y. Chenα, W. Dengα#, S. Fangα, F. Liα, T. Yang, Y. Zhang, K. Rasul, S. Zhe, A. Schneider, Y. Nevmyvaka. Provably Convergent Schrödinger Bridge with Applications to Probabilistic Time Series Imputation. ICML 2023 (α: alphabetical order, #: correspondence).
Monte Carlo Methods
- H. Zheng, H. Du, Q. Feng, W. Deng#, G. Lin. Constrained Exploration via Reflected Replica Exchange Stochastic Gradient Langevin Dynamics. ICML 2024.
- J. Liang, Q. Zhang, W. Deng, Q. Song, G. Lin. Bayesian Federated Learning with Hamiltonian Monte Carlo: Algorithm and Theory. Journal of Computational and Graphical Statistics, 2024.
- W. Deng, Q. Zhang, Y.-A. Ma, Z. Song, G. Lin. On Convergence of Federated Averaging Langevin Dynamics. UAI 2024.
- W. Deng, Q. Zhang, Q. Feng, F. Liang, G. Lin. Non-reversible Parallel Tempering for Deep Posterior Approximation. AAAI 2023 (Oral).
- W. Deng, G. Lin, F. Liang. An Adaptively Weighted Stochastic Gradient MCMC Algorithm for Monte Carlo Simulation and Global Optimization. Statistics and Computing, 32:58, 2022. [code]
- W. Deng, S. Liang, B. Hao, G. Lin, F. Liang. Interacting Contour Stochastic Gradient Langevin Dynamics. ICLR 2022. [code] [video]
- W. Deng*, Q. Feng, G. Karagiannis, G. Lin, F. Liang. Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction. ICLR 2021. [code] [video]
- W. Deng, Q. Feng, L. Gao, F. Liang, G. Lin. Non-convex Learning via Replica Exchange Stochastic Gradient MCMC. ICML 2020. [code] [slides]
- W. Deng, G. Lin, F. Liang. A Contour Stochastic Gradient Langevin Dynamics Algorithm for Simulations of Multi-modal Distributions. NeurIPS 2020. [code] [blog] [slides] [poster] [video] [Zhihu]
Thompson Sampling
- B. Hao, T. Lattimore, W. Deng. Information Directed Sampling for Sparse Linear Bandits. NeurIPS 2021 (Spotlight, acceptance rate 3%).
- H. Zheng, W. Deng, C. Moya, G. Lin. Accelerating Approximate Thompson Sampling with Underdamped Langevin Monte Carlo. AISTATS 2024.
Sparse Deep Learning and Applications
- W. Deng, X. Zhang, F. Liang, G. Lin. An Adaptive Empirical Bayesian Method for Sparse Deep Learning. NeurIPS 2019. [code]
- W. Deng, J. Pan, T. Zhou, D. Kong, A. Flores, G. Lin. DeepLight: Deep Lightweight Feature Interactions for Accelerating CTR Predictions in Ad Serving. WSDM 2021. [code]
- Y. Wang, W. Deng, G. Lin. An Adaptive Hessian Approximated Stochastic Gradient MCMC Method. Journal of Computational Physics, 432, 2021.
(*): equal contribution. (α): alphabetical order. (#): correspondence.