Let's reflect on the latest scientific research, confront the issues facing
global communities, and use technology to empower meaningful reform.
I'm an undergrad at the University of
studying Computer Science, Statistics, and Math.
Currently, I research neural SDEs, Bayesian neural networks, and model-based
reinforcement learning with David
at the Vector Institute.
I'm also building bigger and more data-efficient neural networks with
brilliant researchers at FOR.ai.
For the next academic year, I will be on
first building sparse neural language models with Lukasz Kaiser at
, then tackling sim2real transfer in RL and robotics at NVIDIA AI
, and finally bringing large-scale models of human-generated content
to reality at Secant.ai.
In the past, I integrated the best of cloud technology into remote
build toolchains at Google Cloud and
built the next generation of carpooling platforms at
My research interests lie in reinforcement learning, latent
models, and (Bayesian) neural networks.
Earlier in my life, I was a biology student engaged in research
ventures, mostly in
biotechnology and tissue engineering, like
From exploring genetic editing and basic protein biology to now
developing more robust deep learning systems and network representations,
much of my research is about inferring the physical world, exploring the
anomalies of dynamical systems, and generating insights underpinning
We validated the utility of two
next-generation sequencing assays (ChIP-exo and ChIP-nexus) for the
exploration of cancer transcriptomes. I implemented
custom bioinformatics pipelines, unsupervised ML algorithms (Segway),
As part of the 2019 ICLR Reproducibility
Challenge, we implemented this
(later accepted) paper, which
investigated a novel improvement to Equilibrium
Propagation, a training method for energy-based models (EBMs).
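Equilibrium Propagation's core mechanic can be shown with a toy scalar example (entirely illustrative, not the paper's model): relax the state to an energy minimum in a free phase, weakly nudge the output toward the target in a second phase, and update the weight from the difference between the two fixed points.

```python
def settle(w, x, y=None, beta=0.0, steps=50, lr=0.2):
    """Gradient-descend the state s to a minimum of the total energy
    F(s) = 0.5*s**2 - w*x*s + beta*0.5*(s - y)**2 (cost term only when nudged)."""
    s = 0.0
    for _ in range(steps):
        grad = s - w * x + (beta * (s - y) if beta else 0.0)
        s -= lr * grad
    return s

# Train w so that the free fixed point s* = w*x matches the target y.
w, x, y = 0.0, 1.0, 2.0
eta, beta = 0.5, 0.1
for _ in range(100):
    s_free = settle(w, x)               # free phase: minimize energy alone
    s_nudged = settle(w, x, y, beta)    # nudged phase: output weakly pulled toward y
    w += (eta / beta) * (s_nudged - s_free) * x   # contrastive Hebbian-style update
```

With small beta, the contrastive update approximates gradient descent on the prediction error at the fixed point, here driving w toward y/x = 2.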
We elucidated the structural role of Podocalyxin,
inspired by the biological significance of podocytes in the kidney,
in maintaining blood-brain barrier
integrity under septic shock. Methodology included live animal models,
in vitro cell culture, and electron microscopy.
Research I completed back in high school.
This is how I've learned. I'm an inquisitive individual who always loves
to learn. From data mining to open-source web development, entrepreneurship to
basic science, I am grateful to have
spent my time gaining and practising new skills to make a positive
impact on myself and others.
With recent headlines reporting multiple fatalities from vaping, we sought
a solution that could mitigate nicotine addiction in Juul users. We
re-engineered the Juul to work with our Gaussian Process prediction model, which learns
the usage patterns and frequency of each user, then dynamically adjusts the
nicotine delivery accordingly. What differentiates our product is the implementation of
gradual reduction, where the percentage decrease is customized from user feedback.
We are currently in conversation with the Centre for Addiction and Mental
Health about using this device.
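A minimal sketch of the modeling idea, using scikit-learn's GaussianProcessRegressor; the features and values here are invented for illustration and are not the actual device data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative data: hour of day vs. observed puff count (hypothetical values).
hours = np.array([[8.0], [10.0], [12.0], [15.0], [18.0], [21.0]])
puffs = np.array([4.0, 6.0, 9.0, 7.0, 11.0, 14.0])

# An RBF kernel captures smooth daily usage trends; WhiteKernel absorbs sensor noise.
kernel = RBF(length_scale=3.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(hours, puffs)

# Predict expected usage (with uncertainty) for upcoming hours, which a device
# could use to pre-emptively scale nicotine delivery down.
mean, std = gp.predict(np.array([[13.0], [20.0]]), return_std=True)
```

The predictive standard deviation is what makes a GP attractive here: adjustments can be made conservative when the model is uncertain about a user's pattern.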
DOC is a tool for doctors to receive a second opinion on the diagnosis
of medical conditions directly through patient
interaction, in real time, by analyzing the symptoms and
conversations they have with their patients.
DOC then streamlines the recommendation of potential ailments
by suggesting further questions to ask in order
to generate an objective, accurate, and holistic diagnosis.
1st Place, BCG x Google Global Engineering Week Hackathon 2019
The SocialBit measures an individual's social interactions throughout
the day. A glasses-mounted camera detects faces
and cross-references them against an existing database of the user's friends on
social media platforms. The data is then
visualized in a network plot and chord diagram that showcase the length
and location of one's interactions, defined as
the points in time in which a friend's face was within view. The result
is a beautiful and insightful display of our
daily social interactions, at the microscale.
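The "interaction length" definition above (spans of time in which a friend's face is in view) can be sketched as follows, assuming per-frame detections arrive as `(friend_id, timestamp)` pairs; the names and gap threshold are illustrative, not from the actual SocialBit code:

```python
from collections import defaultdict

def interaction_lengths(detections, gap=5.0):
    """Group per-frame face detections (friend_id, timestamp in seconds) into
    contiguous interactions and total their durations per friend.

    A new interaction starts when a friend's face has been out of view for
    more than `gap` seconds; a single isolated detection contributes 0s.
    """
    by_friend = defaultdict(list)
    for friend, t in sorted(detections, key=lambda d: d[1]):
        by_friend[friend].append(t)

    totals = {}
    for friend, times in by_friend.items():
        total, start, prev = 0.0, times[0], times[0]
        for t in times[1:]:
            if t - prev > gap:          # face out of view too long: close interaction
                total += prev - start
                start = t
            prev = t
        totals[friend] = total + (prev - start)
    return totals
```

These per-friend totals are exactly the edge weights a network plot or chord diagram would visualize.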
Decuen (pronounced 'DQN') is a deep Q-learning Python library that
implements generic reinforcement learning algorithms for OpenAI Gym environments.
It currently features DQN
and DDQN agents,
as well as an implementation of prioritized experience replay for the
CartPole environment, and includes TensorBoard integration for live
training and validation visualization.
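Prioritized experience replay, the component named above, can be sketched as a buffer that samples transitions in proportion to their TD error; this is an illustrative NumPy version, and Decuen's actual implementation may differ:

```python
import numpy as np

class PrioritizedReplayBuffer:
    """Minimal proportional prioritized experience replay (illustrative sketch)."""

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha              # how strongly to favor high-TD-error samples
        self.buffer, self.priorities = [], []

    def add(self, transition, td_error=1.0):
        # Priority comes from the transition's TD error; evict oldest when full.
        if len(self.buffer) >= self.capacity:
            self.buffer.pop(0)
            self.priorities.pop(0)
        self.buffer.append(transition)
        self.priorities.append((abs(td_error) + 1e-6) ** self.alpha)

    def sample(self, batch_size):
        # Sample indices with probability proportional to priority.
        probs = np.array(self.priorities)
        probs = probs / probs.sum()
        idx = np.random.choice(len(self.buffer), size=batch_size, p=probs)
        return idx, [self.buffer[i] for i in idx]

    def update_priorities(self, idx, td_errors):
        # After a learning step, refresh priorities with the new TD errors.
        for i, err in zip(idx, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

A full implementation would also apply importance-sampling weights to correct for the non-uniform sampling; that correction is omitted here for brevity.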
A long-term project for me to learn, grow, and explore new research in
the field of deep learning.