Portfolio item number 1
Short description of portfolio item number 1
Portfolio item number 2
Short description of portfolio item number 2
Published in Mathematical Programming, 2021
Novel convex-optimization bounds for the maximum-entropy sampling problem.
Recommended citation: Zhongzhu Chen, Marcia Fampa, Amélie Lambert, and Jon Lee. (2021). "Mixing Convex-Optimization Bounds for Maximum-Entropy Sampling." Mathematical Programming.
Download Paper
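For context, the maximum-entropy sampling problem (MESP) underlying this line of work can be stated as follows: given an order-n covariance matrix C and an integer s, select the size-s principal submatrix with maximum log-determinant,

```latex
z(C, s) \;=\; \max \left\{ \log\det C[S, S] \;:\; S \subseteq \{1, \dots, n\},\ |S| = s \right\},
```

where C[S, S] denotes the principal submatrix of C indexed by S. The convex-optimization bounds in these papers are upper bounds on z(C, s) used within branch-and-bound.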
Published in Operations Research, 2022
Improved entropy bounds through masking techniques on existing relaxations.
Recommended citation: Zhongzhu Chen, Marcia Fampa, and Jon Lee. (2022). "Technical Note: Masking Anstreicher Linx Bound for Improved Entropy Bounds." Operations Research.
Download Paper
Published in The Eleventh International Conference on Learning Representations (ICLR 2023), 2023
State-of-the-art diffusion-based defense against adversarial attacks achieving certified robustness.
Recommended citation: Chaowei Xiao*, Zhongzhu Chen*, Kun Jin*, Jiongxiao Wang*, Weili Nie, Mingyan Liu, Anima Anandkumar, Bo Li, and Dawn Song. (2023). "DensePure: Understanding Diffusion Models for Adversarial Robustness." ICLR 2023.
Download Paper
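DensePure builds on randomized smoothing, where a robust prediction is the majority vote over many Gaussian-perturbed copies of the input. As a rough illustration only (not the paper's method, which purifies each copy with a diffusion model before classifying), here is a minimal majority-vote smoothing sketch; the toy `classifier` stands in for the diffusion-purified model:

```python
import numpy as np

def smoothed_predict(classifier, x, sigma=0.25, n_samples=100, seed=0):
    """Majority-vote prediction over Gaussian-perturbed copies of x.

    `classifier` maps a 1-D input vector to an integer class label.
    """
    rng = np.random.default_rng(seed)
    votes = {}
    for _ in range(n_samples):
        noisy = x + rng.normal(scale=sigma, size=x.shape)  # add isotropic noise
        label = classifier(noisy)
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)  # most frequent label wins

# Toy stand-in classifier: sign of the mean coordinate.
clf = lambda v: int(v.mean() > 0)
print(smoothed_predict(clf, np.ones(8)))  # prints 1: noise rarely flips a clearly positive input
```

In the certified-robustness literature, the vote margin additionally yields a provable l2-radius around x within which the smoothed prediction cannot change.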
Published in INFORMS Journal on Computing, 35(2): 368-385, 2023
Computational study of convex relaxations for maximum-entropy sampling.
Recommended citation: Zhongzhu Chen, Marcia Fampa, and Jon Lee. (2023). "On Computing with Some Convex Relaxations for the Maximum-Entropy Sampling Problem." INFORMS Journal on Computing, 35(2), 368-385.
Download Paper
Published in 32nd USENIX Security Symposium (USENIX Security 23), 2023
Combining diffusion models with local smoothing for certified robustness guarantees.
Recommended citation: Jiawei Zhang*, Zhongzhu Chen*, Huan Zhang, Chaowei Xiao, and Bo Li. (2023). "DiffSmooth: Certifiably Robust Learning via Diffusion Models and Local Smoothing." USENIX Security 23, pp. 4787-4804.
Download Paper
Published in Mathematical Programming, 2024
Generalized scaling framework achieving 10× speedup over prior methods.
Recommended citation: Zhongzhu Chen, Marcia Fampa, and Jon Lee. (2024). "Generalized Scaling for the Constrained Maximum-Entropy Sampling Problem." Mathematical Programming.
Download Paper
Published in Proceedings of the AAAI Conference on Artificial Intelligence, 38 (AAAI 2024), 2024
A performative prediction framework addressing model-dependent and heterogeneous distribution shifts in federated learning.
Recommended citation: Kun Jin*, Tongxin Yin*, Zhongzhu Chen*, Zeyu Sun, Xueru Zhang, Yang Liu, and Mingyan Liu. (2024). "Performative Federated Learning: A Solution to Model-Dependent and Heterogeneous Distribution Shift." AAAI 2024.
Download Paper
Ph.D. Dissertation, University of Michigan - Ann Arbor, 2024
Ph.D. dissertation on algorithmic advances for maximum-entropy sampling.
Recommended citation: Zhongzhu Chen. (2024). "On Algorithmic Advances for Maximum-Entropy Sampling." Ph.D. Dissertation, University of Michigan - Ann Arbor.
Download Paper
Published in The Thirty-Eighth Annual Conference on Neural Information Processing Systems (NeurIPS 2024), 2024
Diffusion-purification method balancing effectiveness and efficiency for certified robustness.
Recommended citation: Yiquan Li*, Zhongzhu Chen*, Kun Jin*, Jiongxiao Wang*, Jiachen Lei, Bo Li, and Chaowei Xiao. (2024). "Consistency Purification: Effective and Efficient Diffusion Purification towards Certified Robustness." NeurIPS 2024.
Download Paper
Published in The Thirteenth International Conference on Learning Representations (ICLR 2025), 2025
Novel approach combining contrastive learning with denoising for robust representations.
Recommended citation: Jiachen Lei, Julius Berner, Jiongxiao Wang, Zhongzhu Chen, Zhongjia Ba, Kui Ren, Jun Zhu, and Anima Anandkumar. (2025). "Robust Representation Consistency Model via Contrastive Denoising." ICLR 2025.
Download Paper