## Min Lin

### National University of Singapore

##### News: Sea Industrial PhD Program

Sea AI Lab sponsors an Industrial PhD scholarship. Apply if you're interested.

I'm a Principal Research Scientist at Sea AI Lab and an Adjunct Assistant Professor at the National University of Singapore.

I started exploring deep learning around 2013. During my PhD I worked on neural network architecture design, distributed deep learning systems, and neural ODEs. Later, when I joined Qihoo 360, I applied deep learning to several product lines, including the search engine and recommendation systems. During my postdoc at Mila, I spent several years working on continual learning (the setting I'm after is actually closer to online learning), which led me to think that we need to go beyond deep learning: it is not straightforward to achieve online learning with the "data in, gradient out" pattern of deep learning.

There are two main themes in my current research at Sea AI Lab:

• Exploitation of deep learning to solve problems.

  • Systems for ML and ML for Systems.
  • Ab Initio Quantum Chemistry methods, including Quantum Monte Carlo and Density Functional Theory.

• Exploration of paradigms beyond deep learning.

  • I'm optimistic about combining non-parametric methods and generative models to solve online learning problems.

# Interests

• Ab Initio Quantum Chemistry
• ML Systems
• Online Learning
• Non-parametric Methods
• Generative Models
• Reverse Engineering

# Education

• Ph.D. in Electrical and Computer Engineering, National University of Singapore, 2016
• B.Sc. in Life Sciences, Peking University, 2010

# Experience

• Adjunct Assistant Professor, National University of Singapore (Jun 2022 – Present, Singapore)
• Principal Research Scientist, Sea AI Lab (Feb 2021 – Present, Singapore)
• Postdoc, Mila (Dec 2017 – Jan 2021, Montreal)
• Engineer, Qihoo 360 (Sep 2015 – Nov 2017, Beijing)

# Publications

(2022). $O(N^2)$ Universal Antisymmetry in Fermionic Neural Networks. arXiv preprint arXiv:2205.13205.

(2022). Causal Representation Learning for Out-of-Distribution Recommendation. Proceedings of the ACM Web Conference 2022.

(2022). EnvPool: A Highly Parallel Reinforcement Learning Environment Execution Engine. arXiv preprint arXiv:2206.10558.

(2022). Robustness and Accuracy Could Be Reconcilable by (Proper) Definition. arXiv preprint arXiv:2202.10103.

(2021). How Should Pre-Trained Language Models Be Fine-Tuned Towards Adversarial Robustness? Advances in Neural Information Processing Systems.

(2020). Continual Learning from the Perspective of Compression. arXiv preprint arXiv:2006.15078.

(2020). Online fast adaptation and knowledge accumulation: a new approach to continual learning. arXiv preprint arXiv:2003.05856.

(2019). Conditional computation for continual learning. arXiv preprint arXiv:1906.06635.

(2019). Gradient Based Sample Selection for Online Continual Learning. Advances in Neural Information Processing Systems.

(2019). On the spectral bias of neural networks. International Conference on Machine Learning.

(2019). Online Continual Learning with Maximal Interfered Retrieval. Advances in Neural Information Processing Systems.

(2018). On the Spectral Bias of Deep Neural Networks.

(2017). Softmax GAN. arXiv preprint arXiv:1704.06191.

(2016). NUS-PRO: A New Visual Tracking Challenge. IEEE Transactions on Pattern Analysis and Machine Intelligence.

(2015). HCP: A Flexible CNN Framework for Multi-label Image Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.

(2015). MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems.

(2014). Programming a Pavlovian-like Conditioning Circuit in Escherichia coli. Nature Communications.

(2014). Purine: A bi-graph based deep learning framework. International Conference on Learning Representations Workshop.

(2013). Correntropy induced l2 graph for robust subspace clustering. Proceedings of the IEEE International Conference on Computer Vision.

(2013). Network In Network. International Conference on Learning Representations.