Deep Learning Week

Course/Event Essentials

Event/Course Start:
Event/Course End:
Event/Course Format: Online, Live (synchronous)

Venue Information

Country: Germany

Training Content and Scope

Scientific Domain:
Level of Instruction: Intermediate
Sector of the Target Audience: Research and Academia
Language of Instruction:

Other Information

Organiser
Event/Course Description

In this intensive course week, packed with lectures about Deep Learning and AI, you will learn:

  • to train and deploy deep neural networks to solve computer vision problems,
  • the fundamentals of machine learning for working with texts,
  • to use transformer-based natural language processing models for advanced tasks involving languages (e.g., categorising documents),
  • to effectively parallelize the training of deep neural networks on single and multiple GPUs, and
  • how to leverage the LRZ AI Systems to perform all the above tasks. 

This online workshop combines the lectures Fundamentals of Deep Learning (for single and for multiple GPUs), Building Transformer-Based Natural Language Processing Applications, and Deep Learning on LRZ Systems.

The lectures are interleaved with many demos and hands-on sessions using Jupyter Notebooks. For days 1 to 4, students will have access to a fully configured GPU-accelerated workstation in the AWS cloud. On day 5, the capabilities of the LRZ AI System will be showcased. 

The workshop is co-organised by LRZ and NVIDIA Deep Learning Institute (DLI). Material developed by NVIDIA is supplemented by vendor-neutral material developed by LRZ.

All instructors are NVIDIA certified University Ambassadors.

1st day: Fundamentals of Deep Learning (10:00-16:00 CEST) 

Explore the fundamentals of deep learning by training neural networks and using results to improve performance and capabilities.

During this day, you’ll learn the basics of deep learning by training and deploying neural networks. You’ll learn how to:

  • Implement common deep learning workflows, such as image classification and object detection
  • Experiment with data, training parameters, network structure, and other strategies to increase performance and capability
  • Deploy your neural networks to start solving real-world problems

Upon completion, you’ll be able to start solving problems on your own with deep learning.
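The hands-on sessions use TensorFlow; purely as an illustration of what a single training step involves, here is a minimal NumPy sketch of softmax image classification trained with gradient descent. The data, dimensions, and learning rate are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image" data: 100 flattened 8x8 images, 3 classes.
# Labels are generated from a hidden linear model, so a
# softmax classifier can learn them.
X = rng.normal(size=(100, 64))
true_W = rng.normal(size=(64, 3))
y = np.argmax(X @ true_W, axis=1)

W = np.zeros((64, 3))   # model weights we will learn
lr = 0.1

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for step in range(200):
    probs = softmax(X @ W)                  # forward pass
    onehot = np.eye(3)[y]
    grad = X.T @ (probs - onehot) / len(X)  # cross-entropy gradient
    W -= lr * grad                          # gradient-descent update

accuracy = (np.argmax(X @ W, axis=1) == y).mean()
```

A real course exercise would use `tf.keras` layers and a convolutional architecture instead of this single linear layer, but the train-evaluate-adjust loop is the same.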

2nd day: Fundamentals of Deep Learning for Multi-GPUs (10:00-16:00 CEST) 

The computational requirements of deep neural networks used to enable AI applications like self-driving cars are enormous. A single training cycle can take weeks on a single GPU, or even years for larger datasets like those used in self-driving car research. Using multiple GPUs for deep learning can significantly shorten the time required to train on large datasets, making it feasible to solve complex problems with deep learning.

On this day, we will teach you how to use multiple GPUs to train neural networks. You'll learn:

  • Approaches to multi-GPU training
  • Algorithmic and engineering challenges of large-scale training
  • Key techniques used to overcome these challenges

Upon completion, you'll be able to effectively parallelize training of deep neural networks using TensorFlow.
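In TensorFlow this is typically done with `tf.distribute.MirroredStrategy`; the core idea of data-parallel training (each GPU computes gradients on its own shard of the batch, and an all-reduce then averages them so all replicas stay in sync) can be sketched in plain NumPy. The data and model below are toy placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear regression: loss = mean((Xw - y)^2)
X = rng.normal(size=(64, 10))
y = X @ rng.normal(size=10)
w = np.zeros(10)

def grad(X_shard, y_shard, w):
    # Gradient of the MSE loss on one shard of the batch
    return 2 * X_shard.T @ (X_shard @ w - y_shard) / len(X_shard)

n_gpus = 4
shards_X = np.array_split(X, n_gpus)
shards_y = np.array_split(y, n_gpus)

# Each "GPU" computes a gradient on its shard of the batch...
local_grads = [grad(sx, sy, w) for sx, sy in zip(shards_X, shards_y)]

# ...then an all-reduce averages them, so every replica applies
# the same update and the weight copies remain identical.
avg_grad = np.mean(local_grads, axis=0)

# With equal-sized shards this equals the full-batch gradient.
full_grad = grad(X, y, w)
```

This equivalence is why data-parallel training reproduces single-GPU results while dividing the per-step compute across devices.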

3rd and 4th day: Fundamentals of machine learning for working with texts and Building Transformer-Based Natural Language Processing Applications (10:00-16:00 CEST)

Applications for natural language processing (NLP) have exploded in the past decade. With the proliferation of AI assistants and organizations infusing their businesses with more interactive human-machine experiences, understanding how NLP techniques can be used to manipulate, analyse, and generate text-based data is essential. Modern techniques can capture the nuance, context, and sophistication of language, just as humans do. And when designed correctly, developers can use these techniques to build powerful NLP applications that provide natural and seamless human-computer interactions within chatbots, AI voice agents, and more. Deep learning models have gained widespread popularity for NLP because of their ability to accurately generalize over a range of contexts and languages. Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), have revolutionized NLP by offering accuracy comparable to human baselines on benchmarks like SQuAD for question answering, entity recognition, intent recognition, sentiment analysis, and more.

In these 2 days, you’ll learn the fundamentals of machine learning for working with texts, as well as how Transformer-based natural language processing models work for text classification tasks such as categorizing documents. You’ll also learn how to leverage Transformer-based models for named-entity recognition (NER) tasks and how to analyse various model features, constraints, and characteristics to determine which model is best suited for a particular use case based on metrics, domain specificity, and available resources.

By participating in these lectures, you’ll be able to:

  • Understand word tokens and how TensorFlow supports them
  • Understand recurrent neural networks and LSTMs for language modelling
  • Understand how text embeddings have rapidly evolved in NLP tasks
  • See how Transformer architecture features, especially self-attention, are used to create language models without RNNs
  • Use self-supervision to improve the Transformer architecture in BERT, Megatron, and other variants for superior NLP results
  • Leverage pre-trained, modern NLP models to solve multiple tasks such as text classification, NER, and question answering
  • Manage inference challenges and deploy refined models for live applications
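As a rough illustration of the self-attention mechanism the Transformer lectures build on, here is a minimal NumPy sketch of single-head scaled dot-product attention. For clarity it omits the learned query/key/value projections that a real Transformer layer would apply:

```python
import numpy as np

rng = np.random.default_rng(2)

def self_attention(X):
    """Single-head scaled dot-product self-attention without
    learned projections: every token attends to every token."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)               # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ X, weights                 # weighted mix of tokens

# 5 "token" embeddings of dimension 8
X = rng.normal(size=(5, 8))
out, attn = self_attention(X)
```

Each output row is a context-dependent mixture of all input tokens, which is what lets Transformers model long-range dependencies without the sequential recurrence of RNNs.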

5th day: Deep Learning on LRZ Systems (10:00-16:00 CEST) 

Explore the capabilities of the LRZ AI Systems for performing any of the tasks learnt during the course.

By participating in this lecture, you will be able to:

  • Understand the resources that the LRZ AI Systems provide
  • Allocate resources on the LRZ AI Systems and provision them with the needed software stack
  • Work interactively with the LRZ AI Systems via the terminal and Jupyter Notebooks
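As a hedged sketch of what allocating resources on the LRZ AI Systems can look like, here is a hypothetical Slurm batch-script fragment. The partition name, container image, and script path are placeholders, not the actual values; consult the current LRZ AI Systems documentation for those:

```shell
#!/bin/bash
# Illustrative only: partition, image, and paths below are
# placeholders, to be replaced per the LRZ AI Systems docs.
#SBATCH --partition=lrz-gpu-partition   # hypothetical GPU partition
#SBATCH --gres=gpu:1                    # request one GPU
#SBATCH --time=01:00:00

# The LRZ AI Systems provision software stacks via containers;
# the container image reference here is a placeholder.
srun --container-image=nvcr.io#nvidia/tensorflow:24.01-tf2-py3 \
     python train.py
```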