Publications

AITRICS' innovative research takes the lead in advancements in medical artificial intelligence.

83. Meta-GMVAE: Mixture of Gaussian VAE for Unsupervised Meta-Learning (ICLR 2021)
Dong Bok Lee, Dongchan Min, Seanie Lee and Sung Ju Hwang
Unsupervised learning aims to learn meaningful representations from unlabeled data which can capture its intrinsic structure, that can be transf...

82. Learning to Sample with Local and Global Contexts in Experience Replay Buffers (ICLR 2021)
Youngmin Oh, Kimin Lee, Jinwoo Shin, Eunho Yang, Sung Ju Hwang
Experience replay, which enables the agents to remember and reuse experience from the past, has played a significant role in th...

81. Learning to Perturb Word Embeddings for Out-of-distribution QA (ACL 2021)
Seanie Lee, Minki Kang, Juho Lee and Sung Ju Hwang
QA models based on pretrained language models have achieved remarkable performance on various benchmark datasets. However, QA models do not generalize well ...

80. Learning to Generate Noise for Multi-Attack Robustness (ICML 2021)
Divyam Madaan, Jinwoo Shin and Sung Ju Hwang
Adversarial learning has emerged as one of the successful techniques to circumvent the susceptibility of existing methods against adversarial perturbations. However, the...

79. Learning How Long to Wait: Adaptively-Constrained Monotonic Multihead Attention for Streaming ASR (IEEE 2021)
Jaeyun Song, Hajin Shim, Eunho Yang
Monotonic Multihead Attention, which allows multiple heads to learn their own alignments, shows great performance on simultane...

78. Large-Scale Meta-Learning with Continual Trajectory Shifting (ICML 2021)
JaeWoong Shin, Hae Beom Lee, Boqing Gong and Sung Ju Hwang
Meta-learning of shared initialization parameters has been shown to be highly effective in solving few-shot learning tasks. However, extending the framewo...

77. Hardware-Adaptive Efficient Latency Predictor for NAS via Meta-Learning (NeurIPS 2021)
Hayeon Lee, Sewoong Lee, Song Chong and Sung Ju Hwang
For deployment, neural architecture search should be hardware-aware, in order to satisfy the device-specific constraints (e.g., memory usage...

76. GTA: Graph Truncated Attention for Retrosynthesis (AAAI 2021)
Seung-Woo Seo, You Young Song, June Yong Yang, Seohui Bae, Hankook Lee, Jinwoo Shin, Sung Ju Hwang, and Eunho Yang
Retrosynthesis is the task of predicting reactant molecules from a given product molecule and is, import...
75. Hit and Lead Discovery with Explorative RL and Fragment-based Molecule Generation (NeurIPS 2021)
Soojung Yang, Doyeong Hwang, Seul Lee, Seongok Ryu and Sung Ju Hwang
Recently, utilizing reinforcement learning (RL) to generate molecules with desired properties has been highlighted ...

74. FedMix: Approximation of Mixup under Mean Augmented Federated Learning (ICLR 2021)
Tehrim Yoon, Sumin Shin, Sung Ju Hwang and Eunho Yang
Federated learning (FL) allows edge devices to collectively learn a model without directly sharing data within each device, thus preserving priv...