Youngseog Chung

I am a Ph.D. student in the Machine Learning Department, within the School of Computer Science at Carnegie Mellon University (CMU). I am advised by Jeff Schneider.

Email  /  CV  /  Google Scholar  /  Twitter  /  Github

profile photo
Research Interests

My current research focuses on methods for reasoning about uncertainty from data and their applications in decision making and control. I am interested in understanding why models make wrong predictions with high confidence, and in developing methods that utilize predictive uncertainty for better decision making. This spans topics in uncertainty quantification, Bayesian machine learning, probabilistic learning and inference, decision making under uncertainty, and reinforcement learning.

Software/Projects
Uncertainty Toolbox
Developed by: Youngseog Chung, Willie Neiswanger, Ian Char, Han Guo

Uncertainty Toolbox is a Python toolbox for evaluating and visualizing predictive uncertainty quantification. It includes a suite of evaluation metrics (accuracy, average calibration, adversarial group calibration, sharpness, proper scoring rules), plots for visualizing confidence bands, prediction intervals, and calibration, and recalibration functions. It also includes a glossary and a list of relevant papers in uncertainty quantification.
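
To make the evaluated quantities concrete, here is a small standalone sketch of two of the metrics mentioned above, average calibration error and sharpness, for Gaussian predictive distributions. This is my own NumPy/SciPy illustration of what these metrics measure, not the toolbox's API:

import numpy as np
from scipy import stats

def mean_absolute_calibration_error(pred_mean, pred_std, y, num_levels=99):
    # Expected coverage levels p, and the observed fraction of targets that
    # fall below the predicted p-quantile; a calibrated model matches the two.
    exp_props = np.linspace(0.01, 0.99, num_levels)
    quantiles = stats.norm.ppf(
        exp_props[None, :], loc=pred_mean[:, None], scale=pred_std[:, None]
    )
    obs_props = (y[:, None] <= quantiles).mean(axis=0)
    return np.abs(exp_props - obs_props).mean()

def sharpness(pred_std):
    # Sharpness: root mean predictive variance (smaller means sharper predictions).
    return np.sqrt(np.mean(pred_std ** 2))

# Toy usage on a synthetic heteroscedastic regression problem.
rng = np.random.default_rng(0)
x = np.linspace(0, 5, 200)
noise_std = 0.1 + 0.2 * x
y = np.sin(x) + rng.normal(scale=noise_std)
print(mean_absolute_calibration_error(np.sin(x), noise_std, y), sharpness(noise_std))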

Publications and Preprints
Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification
Youngseog Chung, Willie Neiswanger, Ian Char, Jeff Schneider
NeurIPS, 2021

We propose two algorithms for learning conditional quantiles from data for predictive uncertainty quantification. One algorithm utilizes consistent estimators of the conditional densities; for the second, we propose a loss function that directly optimizes calibration and sharpness.
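
For background, a minimal NumPy sketch (my illustration, not code from the paper) of the standard pinball loss that these quantile methods move beyond:

import numpy as np

def pinball_loss(y, pred, tau):
    # Quantile (pinball) loss at level tau: its minimizer over a constant
    # prediction is the empirical tau-quantile of y.
    diff = y - pred
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

# Check: a grid search over constant predictions recovers the 0.9-quantile.
y = np.random.default_rng(1).normal(size=10_000)
candidates = np.linspace(-3, 3, 601)
losses = [pinball_loss(y, c, 0.9) for c in candidates]
print(candidates[int(np.argmin(losses))], np.quantile(y, 0.9))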

Neural Dynamical Systems: Balancing Structure and Flexibility in Physical Prediction
Viraj Mehta, Ian Char, Willie Neiswanger, Youngseog Chung, Andrew Oakleigh Nelson, Mark D Boyer, Egemen Kolemen, Jeff Schneider
IEEE Conference on Decision and Control (CDC), 2021

We introduce an algorithm for modeling dynamical systems that utilizes neural ordinary differential equations (ODEs). By incorporating ODE structure, we empirically show significant improvements in sample efficiency and robustness to parameter shift when learning the dynamics model from data.
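
A minimal sketch of the underlying idea, parameterizing the continuous-time dynamics with a neural network and integrating it to predict future states. This is an illustrative PyTorch example with a fixed-step Euler solver, not the paper's implementation:

import torch
import torch.nn as nn

class NeuralODEDynamics(nn.Module):
    # Dynamics model ds/dt = f_theta(s, a), with f_theta a small neural network.
    def __init__(self, state_dim, action_dim, hidden_dim=64):
        super().__init__()
        self.vector_field = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, state_dim),
        )

    def forward(self, state, action, dt=0.05, num_steps=10):
        # Fixed-step Euler integration of the learned ODE over a horizon dt.
        h = dt / num_steps
        for _ in range(num_steps):
            state = state + h * self.vector_field(torch.cat([state, action], dim=-1))
        return state

# Usage: predict next states for a batch of (state, action) pairs.
model = NeuralODEDynamics(state_dim=4, action_dim=2)
next_state = model(torch.randn(32, 4), torch.randn(32, 2))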

Offline Contextual Bayesian Optimization
Ian Char, Youngseog Chung, Willie Neiswanger, Kirthevasan Kandasamy, Andrew Oakleigh Nelson, Mark D Boyer, Egemen Kolemen, Jeff Schneider
NeurIPS, 2019

We propose a contextual Bayesian optimization algorithm based on Thompson sampling for the offline setting. In the offline setting, the user can actively choose which contexts to query, as opposed to the online setting where contexts are chosen by nature.
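
A minimal sketch of Thompson sampling in this setting, where the learner chooses both the context and the action by maximizing a single posterior sample. This is an illustration using scikit-learn's Gaussian process and a made-up objective, not the paper's implementation:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(context, action):
    # Hypothetical black-box reward; in practice an expensive experiment or simulation.
    return -((context - 0.3) ** 2 + (action - 0.7) ** 2)

rng = np.random.default_rng(0)
grid = np.array([[c, a] for c in np.linspace(0, 1, 25) for a in np.linspace(0, 1, 25)])
X, y = [], []
for t in range(30):
    if t < 3:
        query = grid[rng.integers(len(grid))]      # random initial queries
    else:
        gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-3)
        gp.fit(np.array(X), np.array(y))
        sample = gp.sample_y(grid, random_state=int(rng.integers(1 << 31))).ravel()
        query = grid[int(np.argmax(sample))]       # maximize one posterior sample
    X.append(query)                                # the learner picks context AND action
    y.append(objective(*query) + 0.01 * rng.normal())
print("best query found:", X[int(np.argmax(y))])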

Post-nonlinear Causal Model with Deep Neural Networks
Youngseog Chung, Joon Kim, Tom Yan, Helen Zhou
Preprint, 2019

We propose an end-to-end learning procedure with deep neural networks for causal discovery. The algorithm is designed to identify the causal directions between multiple variables that are related by a post-nonlinear causal model.
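
For reference, the post-nonlinear model assumes each effect is an invertible distortion of a nonlinear function of its cause plus independent noise; discovery methods look for the direction in which such an independent-noise decomposition exists. A tiny generative sketch (my illustration, with arbitrary choices of f1, f2, and noise):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=1000)           # cause
e = rng.normal(scale=0.3, size=x.shape)     # noise, independent of the cause
f1 = lambda v: v ** 3                       # nonlinear inner mechanism (arbitrary choice)
f2 = np.tanh                                # invertible outer distortion (arbitrary choice)
y = f2(f1(x) + e)                           # effect under the post-nonlinear model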

Workshop Papers
Neural Dynamical Systems
Viraj Mehta, Ian Char, Willie Neiswanger, Youngseog Chung, Andrew Oakleigh Nelson, Mark D Boyer, Egemen Kolemen, Jeff Schneider
ICLR 2020 Integration of Deep Neural Models and Differential Equations Workshop, 2020

Offline Contextual Bayesian Optimization for Nuclear Fusion
Youngseog Chung*, Ian Char*, Willie Neiswanger, Kirthevasan Kandasamy, Andrew Oakleigh Nelson, Mark D Boyer, Egemen Kolemen, Jeff Schneider
NeurIPS 2019 Workshop on Machine Learning and the Physical Sciences, 2019

Website template taken from here