Welcome to my humble abode!!

Me at Ahu Tongariki, Easter Island

News & Updates



Who I am


Education


Professional Experience

Research and development at Knowledge Mining Research Team, Electronics and Telecommunications Research Institute (ETRI), South Korea, February 2010 - April 2014

My first task was to build a Hadoop and HBase cluster to process big data (news, blogs, tweets). I then taught myself to write MapReduce code for text analysis. This changed the way my team processed text data: from handling tens of thousands of documents on a few separate machines to systematically analyzing millions of documents per day on a cluster and saving the results to a distributed database. I then worked on the Named Entity Recognition and Event Extraction modules while studying machine learning techniques. I also developed solvers for binary SVM, structural SVM, one-class SVM, and ranking SVM in C++ and Java.
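
As a rough illustration of the MapReduce pattern behind that pipeline, the sketch below counts tokens over a tiny in-memory corpus. The file name, function names, and sample documents are hypothetical, and the shuffle step is simulated in plain Python rather than run on an actual Hadoop cluster; this is not the ETRI code.

    # word_count_mr.py: a toy map/shuffle/reduce pipeline for token counting.
    # Everything here is illustrative; it is not the actual ETRI text-analysis module.
    from collections import defaultdict

    def map_phase(doc_id, text):
        # Emit a (token, 1) pair for every token in a document.
        for token in text.lower().split():
            yield token, 1

    def reduce_phase(token, counts):
        # Sum the partial counts collected for a single token.
        return token, sum(counts)

    def run_local(documents):
        # Simulate the shuffle step locally: group map output by key, then reduce.
        grouped = defaultdict(list)
        for doc_id, text in documents.items():
            for token, count in map_phase(doc_id, text):
                grouped[token].append(count)
        return dict(reduce_phase(t, c) for t, c in grouped.items())

    if __name__ == "__main__":
        corpus = {"doc1": "big data text analysis",
                  "doc2": "text analysis on a hadoop cluster"}
        print(run_local(corpus))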

Project Participation Internship at Research, Development and Dissemination (RD&D), Sutter Health, California, May 2015 - August 2015

During the internship, I explored the potential of applying deep learning methods to healthcare problems, specifically predicting future heart failure diagnoses. Applying stacked denoising autoencoders to heart failure prediction enabled a more sophisticated analysis of the relation between patient features and a heart failure diagnosis. Furthermore, by combining word embedding techniques with recurrent neural networks, I improved the heart failure prediction performance from 0.81 AUC to 0.86 AUC. This work was published in JAMIA.
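
To make the autoencoder approach concrete, here is a minimal sketch of a single-layer denoising autoencoder written in NumPy. The layer sizes, corruption rate, and synthetic multi-hot data are assumptions for illustration only; they do not reflect the actual Sutter Health model or data.

    # dae_sketch.py: single-layer denoising autoencoder on synthetic patient records.
    import numpy as np

    rng = np.random.default_rng(0)
    n_patients, n_features, n_hidden = 256, 100, 32
    X = (rng.random((n_patients, n_features)) < 0.1).astype(float)  # sparse binary records

    W_enc = rng.normal(0, 0.1, (n_features, n_hidden))
    W_dec = rng.normal(0, 0.1, (n_hidden, n_features))
    b_h, b_o = np.zeros(n_hidden), np.zeros(n_features)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr, corruption = 0.5, 0.3
    for epoch in range(200):
        # Corrupt the input by randomly zeroing features, then reconstruct the clean input.
        X_noisy = X * (rng.random(X.shape) > corruption)
        H = sigmoid(X_noisy @ W_enc + b_h)      # encoder
        X_hat = sigmoid(H @ W_dec + b_o)        # decoder
        # Gradients of the cross-entropy reconstruction loss.
        d_out = (X_hat - X) / n_patients
        d_hidden = (d_out @ W_dec.T) * H * (1 - H)
        W_dec -= lr * (H.T @ d_out)
        b_o -= lr * d_out.sum(axis=0)
        W_enc -= lr * (X_noisy.T @ d_hidden)
        b_h -= lr * d_hidden.sum(axis=0)

    # The hidden activations serve as dense patient representations for a downstream classifier.
    patient_repr = sigmoid(X @ W_enc + b_h)
    print(patient_repr.shape)  # (256, 32)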

Internship at Research, Development and Dissemination (RD&D), Sutter Health, California, May 2016 - August 2016

In my second internship at Sutter Health, I focused on developing interpretable deep learning models for predictive healthcare. Specifically, using the neural attention mechanism combined with an RNN and an MLP, I designed a sequence prediction model that achieved an AUC similar to that of an RNN while remaining completely interpretable; the model allowed precise calculation of how much each diagnosis, medication, and procedure in past visits contributed to the final prediction. This work, named RETAIN, was presented at NIPS 2016.
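
The sketch below illustrates the core interpretability idea: once visit-level (alpha) and coordinate-level (beta) attention weights are available, the prediction logit decomposes exactly into per-visit, per-code contributions. The shapes, random weights, and attention values are made up for illustration; this is not the published RETAIN implementation.

    # retain_contrib_sketch.py: attention weights turned into contribution scores.
    import numpy as np

    rng = np.random.default_rng(0)
    n_visits, n_codes, emb_dim = 3, 5, 8

    X = (rng.random((n_visits, n_codes)) < 0.4).astype(float)  # multi-hot visit records
    W_emb = rng.normal(size=(n_codes, emb_dim))                 # code embeddings
    w_out = rng.normal(size=emb_dim)                            # output-layer weights
    alpha = np.array([0.2, 0.3, 0.5])                           # visit-level attention (sums to 1)
    beta = rng.uniform(0.5, 1.0, size=(n_visits, emb_dim))      # coordinate-level attention

    # Prediction logit: sum over visits of alpha_t * w_out . (beta_t * (x_t @ W_emb)).
    logit = sum(alpha[t] * w_out @ (beta[t] * (X[t] @ W_emb)) for t in range(n_visits))

    # Contribution of code j at visit t: alpha_t * x_{t,j} * w_out . (beta_t * W_emb[j]).
    contrib = np.array([[alpha[t] * X[t, j] * (w_out @ (beta[t] * W_emb[j]))
                         for j in range(n_codes)] for t in range(n_visits)])

    # The per-code contributions add back up exactly to the prediction logit,
    # which is what makes the model interpretable.
    assert np.isclose(contrib.sum(), logit)
    print(np.round(contrib, 3))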

Research Internship at DeepMind, London, U.K., Feb 2017 - May 2017

My first project was to train an embodied agent to identify the heaviest object in a virtual environment. This extended the "Which is heavier?" experiment from Learning to Perform Physics Experiments via Deep Reinforcement Learning (Denil et al., ICLR 2017). The agent was equipped with a hammer to probe the objects, and a positive reward was given when the hammer was in contact with the heaviest object (hence the project name Pinata). The agent successfully learned to interact with the objects and stick to the heaviest one (example video 1, example video 2).
My second project was training neural agents to develop a compositional language purely from raw pixels by playing an image description game. By employing a communication strategy named Obverter, which is motivated by the theory of mind, we confirmed that the two agents could develop a highly structured, patterned communication protocol. This work will be presented at ICLR 2018.

Research Internship at Google Research, Mountain View, California, May 2017 - Aug 2017

I was a member of the project team named MorphNet. The objective was to automatically learn the structure of neural networks under a resource constraint (e.g. the number of parameters or the number of FLOPs), using various regularization methods. My specific task was related to NLP applications: I modified several variable-selection algorithms, such as Smoothly Clipped Absolute Deviation (SCAD), to activate or deactivate groups of parameters.
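
As a rough illustration, the sketch below evaluates the SCAD penalty on the L2 norm of each parameter group, which is one way to push whole groups of weights toward zero. The group sizes, lambda and a values, and pruning threshold are illustrative assumptions, not the MorphNet implementation.

    # scad_sketch.py: SCAD penalty applied to group norms for structure selection.
    import numpy as np

    def scad_penalty(theta, lam=0.5, a=3.7):
        # Smoothly Clipped Absolute Deviation penalty (Fan & Li, 2001).
        theta = np.abs(theta)
        small = theta <= lam
        mid = (theta > lam) & (theta <= a * lam)
        out = np.empty_like(theta)
        out[small] = lam * theta[small]
        out[mid] = (2 * a * lam * theta[mid] - theta[mid] ** 2 - lam ** 2) / (2 * (a - 1))
        out[~small & ~mid] = lam ** 2 * (a + 1) / 2
        return out

    rng = np.random.default_rng(0)
    groups = [rng.normal(0, s, size=16) for s in (0.01, 0.2, 1.0)]  # three weight groups
    norms = np.array([np.linalg.norm(g) for g in groups])

    penalty = scad_penalty(norms).sum()   # added to the task loss during training
    active = norms > 0.1                  # groups kept after thresholding
    print(penalty, active)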


Publication

Conference Proceedings & Journal Articles

Workshops & Preprints


Curriculum Vitae

Here

Experiments

DeepMind internship project: Pinata - agent gets a reward if it touches the heaviest object

This extends the "Which is heavier?" experiment from Learning to Perform Physics Experiments via Deep Reinforcement Learning (Denil et al., ICLR 2017).
Object densities are shown at the bottom right.

Agent trained with 15-second episodes.
Agent trained with 50-second episodes (time ticks 3 times faster, hence the 16-second video length).

GRAM Visualization

2D plot of disease representations learned from domain knowledge, initialized with GloVe
2D plot of disease representations learned from domain knowledge
2D plot of disease representations learned from fake domain knowledge
2D plot of disease representations learned by GRU, initialized with GloVe vectors
2D plot of disease representations learned by GRU, randomly initialized
2D plot of disease representations learned by GloVe
2D plot of disease representations learned by Skip-gram

Healthcare Concept Representation Learned by Med2Vec

codeEmb.npy: Embedding matrix of medical concepts (Python NumPy array).
int2str.p: Mapping from integer codes to string codes (Python dictionary).
str2desc.p: Mapping from string codes to descriptions (Python dictionary).

The embedding matrix codeEmb.npy has the shape 27523 by 200. Each row is a specific medical concept (a diagnosis, medication, or procedure code) represented by a 200-dimensional vector. int2str.p is a Python dictionary that maps a row index of the embedding matrix to the string code of the medical concept. For example, the first row of codeEmb.npy maps to the string code "D_401.9". The first letter of the string code can be D, R, or P, which stand for diagnosis, medication, and procedure, respectively. str2desc.p is a Python dictionary that maps the string code to the actual description of the medical concept. For example, the string code "D_401.9" is mapped to the description "Unspecified essential hypertension".
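
The snippet below sketches how these three files could be loaded and queried. It assumes the files sit in the working directory; the nearest-neighbor lookup at the end is only one illustrative use of the embedding matrix.

    # load_med2vec.py: load the Med2Vec artifacts and inspect one concept.
    import pickle
    import numpy as np

    code_emb = np.load("codeEmb.npy")            # shape (27523, 200)
    with open("int2str.p", "rb") as f:
        int2str = pickle.load(f)                 # row index -> string code
    with open("str2desc.p", "rb") as f:
        str2desc = pickle.load(f)                # string code -> description
    # If the pickles were written with Python 2, pickle.load(f, encoding="latin1") may be needed.

    row = 0
    code = int2str[row]                          # e.g. "D_401.9"
    print(code, str2desc[code], code_emb[row].shape)

    # Cosine similarity against all concepts to find the nearest neighbors of this code.
    vec = code_emb[row]
    sims = code_emb @ vec / (np.linalg.norm(code_emb, axis=1) * np.linalg.norm(vec) + 1e-8)
    for idx in np.argsort(-sims)[1:6]:
        neighbor = int2str[int(idx)]
        print(neighbor, str2desc[neighbor], round(float(sims[idx]), 3))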

Healthcare Concept Representation Learning Visualization

2D plot of disease representations learned from non-negative Skip-gram

Disease network analysis from ICDM 2015

Disease network constructed from MIMIC II dataset

Last Modified: Dec. 12, 2017