How to keep up-to-date in ML

There is a massive amount of material about machine learning available online, and it grows every day. Below are some personal recommendations of sources I use to stay current and keep learning. All of them are available online and free.

Websites and Blogs

aiindex.org: A yearly report on the current state of Artificial Intelligence, published as part of the Stanford One Hundred Year Study on AI. The initiative tracks, collates, distills, and visualizes data about the field, and each annual report is authored by leading AI researchers and industry representatives.

arXiv.org: An internet-based public library of scientific papers maintained by Cornell University. Current papers on topics such as physics, mathematics, and computer science are publicly available and can be downloaded as PDFs. Submissions are moderated but not peer-reviewed. If you are looking for a particular ML paper, the chances are high that you will find it here; the website also offers an advanced search capability.
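Besides the search form on the website, arXiv also exposes a public HTTP API that returns results as an Atom feed. As a small sketch (the helper function name is my own), building a query URL looks like this:

```python
from urllib.parse import urlencode

# Base endpoint of the public arXiv API (responses are Atom XML).
ARXIV_API = "http://export.arxiv.org/api/query"

def build_arxiv_query(terms: str, start: int = 0, max_results: int = 10) -> str:
    """Build a search URL for the arXiv API.

    `terms` uses arXiv's query syntax, e.g. 'all:transformer'
    or 'ti:"attention is all you need"'.
    """
    params = {
        "search_query": terms,
        "start": start,
        "max_results": max_results,
    }
    return f"{ARXIV_API}?{urlencode(params)}"

# Example: the ten most relevant papers mentioning "attention" anywhere.
url = build_arxiv_query("all:attention")
print(url)
```

Fetching that URL (for instance with `urllib.request.urlopen`) returns a feed you can parse with any Atom/XML parser.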

Berkeley AI Research (BAIR): A great ML and AI blog from the University of California, Berkeley.

DeepMind Research Blog: A blog from Google’s DeepMind about current research topics.

DeepMind Research: Page with all research papers published by DeepMind.

distill.pub: A journal that publishes machine learning research online, led by the editors Shan Carter and Chris Olah of OpenAI. The steering committee is first class, with members such as Ian Goodfellow, Yoshua Bengio, and Andrej Karpathy. The articles are thoroughly reviewed, and the layout is web-native with beautiful animated GIFs to illustrate the content.

Facebook's AI Blog: Announcements and high-level descriptions of Facebook's new AI research topics and publications. All published papers can be found on their publications page.

Google’s AI Blog: A blog where Google’s AI researchers regularly write about new research topics.

Google Research Publication Database: Online database of all research papers published by Googlers. All papers can be freely downloaded.

The Gradient: Digital magazine that aims to be a place for discussion about research and trends in artificial intelligence and machine learning.

Journal of Machine Learning Research (www.jmlr.org): The Journal of Machine Learning Research (JMLR) provides an international forum for the electronic and paper publication of high-quality scholarly articles in all areas of machine learning. All published papers are freely available online.

Microsoft: Microsoft offers a high-level, more product-oriented AI blog as well as an AI research blog with links to publications.

MIT offers two different blogs on AI: MIT News on AI and the MIT Technology Review on AI. Both cover new topics and research in AI at a fairly high level.

Stanford AI Lab (SAIL): Great blog from the Stanford AI Lab about new research and publications.

Podcasts

Lex Fridman is an ML and AI researcher at MIT. He interviews highly reputable figures in AI and related fields, such as Elon Musk, Noam Chomsky, Roger Penrose, Donald Knuth, and many more.

Machine Learning Street Talk is a YouTube Channel and a podcast managed by Yannic Kilcher, Tim Scarfe, and Keith Duggar. It is quite a technical podcast where they interview authors of recent ML research papers and discuss topics such as AGI, AI Ethics, and ML DevOps.

twimlai.com (This Week in Machine Learning and AI): Sam Charrington's podcast series focuses on business and consumer applications of machine learning and AI. His guests are leading experts from research and industry.

Newsletters

The Batch: Weekly newsletter from Andrew Ng and deeplearning.ai with the latest news, breakthroughs, and events in the area of Machine Learning and AI.

ML Conferences

The International Conference on Learning Representations (ICLR) is a well-known and respected international ML conference. ICLR publishes all accepted papers online, including the paper itself, code, presentation slides, and short videos of the talks.

NeurIPS is the leading ML conference and publishes all its papers online.

Videos

CS224N – Natural Language Processing with Deep Learning: 18 lectures by Professor Christopher Manning of Stanford University. Recordings are available on YouTube, and additional material can be found on the course homepage.

CS229 – Machine Learning, taught by Andrew Ng: 20 lectures from Stanford University. Recordings are available on YouTube.

CS231n – Convolutional Neural Networks for Visual Recognition: A 10-week course from Stanford University. Video recordings of the lectures from winter 2016 are available on YouTube. Some of the lectures were given by Andrej Karpathy (now working at Tesla).

Public Lecture: Deep Learning and the Future of Artificial Intelligence, held by Yann LeCun: A great introductory lecture about the history, the current state, and the possible future of neural networks, with a focus on Convolutional Neural Networks (CNNs).

Yannic Kilcher's YouTube Channel: Yannic is a Ph.D. student at ETH Zurich, Switzerland. On his YouTube channel, he explains important and recent ML research papers. Yannic can explain complex topics in an entertaining and easy-to-understand way without ever losing scientific rigor.

Books

Goodfellow, Ian, Bengio, Yoshua, and Courville, Aaron. Deep Learning, MIT Press, 2016. Website: www.deeplearningbook.org. An excellent book on the current state of deep learning, including a profound mathematical foundation.

Jurafsky, Dan and Martin, James H. Speech and Language Processing, 3rd Edition Draft. Dan is Chair and Professor at Stanford University; James is Professor at the University of Colorado Boulder. This is a leading book on speech and language processing. The 3rd edition is still in draft, so not all chapters are available yet. All available chapters can be downloaded as text (PDF) or slides (PDF or PPT) and may be used freely.
