Topics
PyTorch

Biography
Thomas Viehmann is a PyTorch and Machine Learning trainer and consultant. In 2018 he founded the boutique R&D consultancy MathInf, based in Munich, Germany. His work spans from low-level optimizations for efficient AI to the development of cutting-edge deep-learning models for clients ranging from startups to large multinational corporations. He is a PyTorch core developer with contributions across almost all parts of PyTorch and co-author of Deep Learning with PyTorch, to appear this summer with Manning Publications. Thomas’ education in computer science included a class in Neural Networks and Pattern Recognition at the turn of the millennium. He went on to do research in pen-and-paper Calculus of Variations and Partial Differential Equations, obtaining a Ph.D. from Bonn University.
Talk
Over the past two years PyTorch has become the dominant tool for machine learning research, with many of the groundbreaking advancements appearing alongside their PyTorch implementations. Having at least a basic understanding of the library is an asset: it allows one to collaborate easily with others, carry out one's own research faster, or at least gain a deeper understanding of the resources published online every day.
This course will be a gentle introduction to the PyTorch library, going over all of its fundamental abstractions. These include how model code is usually structured, how one can compute gradients of arbitrary Python functions automatically, how to make effective use of accelerators such as GPUs, and what the best practices for research implementations are. If time allows, we will also take a peek at some more advanced features such as the just-in-time compiler.
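To give a flavour of two of the abstractions mentioned above, here is a minimal sketch (not part of the official course material) of automatic differentiation and device placement in PyTorch; the function being differentiated is an arbitrary illustrative choice.

    import torch

    # Pick a GPU if one is available, otherwise fall back to the CPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Tensors created with requires_grad=True are tracked by autograd.
    x = torch.randn(3, device=device, requires_grad=True)

    # Any differentiable Python function built from tensor operations works.
    y = (2.0 * x.sin()).sum()

    # Backpropagation populates x.grad with dy/dx.
    y.backward()

    print(x.grad)         # gradient computed by autograd
    print(2.0 * x.cos())  # analytic gradient, for comparison

The two printed tensors agree, illustrating that gradients of ordinary Python code over tensors can be obtained without writing any derivative by hand.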