
T81 558: Applications of Deep Neural Networks

Washington University in St. Louis

Instructor: Jeff Heaton

  • Section 1. Spring 2026, Tuesday, 2:30 PM
    Location: LOUDERMAN, Room 00461

Course Description

Deep learning is a group of exciting new technologies for neural networks. Through a combination of advanced training techniques and neural network architectural components, it is now possible to create neural networks that can handle tabular data, images, text, and audio as both input and output. Deep learning allows a neural network to learn hierarchies of information in a way that resembles the function of the human brain. This course will introduce the student to classic neural network structures, Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), Generative Adversarial Networks (GAN), and reinforcement learning. Application of these architectures to computer vision, time series, security, natural language processing (NLP), and data generation will be covered. High Performance Computing (HPC) aspects will demonstrate how deep learning can be leveraged both on graphics processing units (GPUs) and on grids. The focus is primarily on the application of deep learning to problems, with some introduction to mathematical foundations. Students will use the Python programming language to implement deep learning using PyTorch. It is not necessary to know Python prior to this course; however, familiarity with at least one programming language is assumed. This course will be delivered in a hybrid format that includes both classroom and online instruction.

Objectives

  1. Explain how neural networks (deep and otherwise) compare to other machine learning models.
  2. Determine when a deep neural network would be a good choice for a particular problem.
  3. Demonstrate understanding of the material through applied programming assignments and a Kaggle competition.

Syllabus

This syllabus presents the expected class schedule, due dates, and reading assignments.
Download current syllabus

Module Content
Module 1
Meet on 01/12/2026
Module 1: Python Preliminaries
  • 1.1 Course Overview
  • 1.2 Introduction to Python
  • 1.3 Python Lists, Dictionaries, Sets & JSON
  • 1.4 File Handling
  • 1.5 Functions, Lambdas, and Map/Reduce
  • We will meet on campus this week (in-class meeting #1)
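As a small taste of the Python constructs this module reviews (an illustrative snippet, not part of the course materials), lists, sets, dictionaries, JSON, and lambda/map look like this:

```python
import json

# Lists and sets (Module 1.3): a set de-duplicates a list.
languages = ["Python", "Java", "Python", "C++"]
unique_languages = set(languages)

# Dictionaries and JSON (Module 1.3): serialize a dict to a JSON string.
course = {"id": "T81-558", "topic": "Deep Learning"}
encoded = json.dumps(course, sort_keys=True)

# Lambdas and map (Module 1.5): square each element of a list.
squares = list(map(lambda x: x ** 2, [1, 2, 3, 4]))

print(unique_languages)  # three unique languages
print(encoded)
print(squares)           # [1, 4, 9, 16]
```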
Module 2
Week of 01/19/2026
Module 2: Python for Machine Learning
  • 2.1 Introduction to Pandas for Deep Learning
  • 2.2 Encoding Categorical Values
  • 2.3 Grouping, Sorting, and Shuffling
  • 2.4 Apply and Map
  • 2.5 Feature Engineering
  • Module 1 Program due: 01/21/2026
  • Icebreaker due: 01/21/2026
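One of the encoding strategies covered in Module 2.2 is one-hot encoding of categorical columns. A minimal sketch with a hypothetical toy DataFrame, using `pandas.get_dummies`:

```python
import pandas as pd

# Toy data: 'color' is categorical, 'price' is numeric.
df = pd.DataFrame({
    "color": ["red", "green", "blue", "green"],
    "price": [10, 12, 9, 11],
})

# One-hot encode only the categorical column; numeric columns pass through.
encoded = pd.get_dummies(df, columns=["color"], prefix="color")
print(encoded.columns.tolist())
# ['price', 'color_blue', 'color_green', 'color_red']
```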
Module 3
Meet on 01/26/2026
Module 3: PyTorch for Neural Networks
  • 3.1 Deep Learning Overview
  • 3.2 Introduction to PyTorch
  • 3.3 Feature Vector Encoding
  • 3.4 Early Stopping and Persistence
  • 3.5 Sequences vs Classes
  • Module 2 Program due: 01/27/2026
  • We will meet on campus this week (in-class meeting #2)
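The early-stopping idea from Module 3.4 can be sketched framework-agnostically: stop training once the validation loss has failed to improve for a set number of checks. This plain-Python sketch omits PyTorch specifics such as saving `state_dict` checkpoints:

```python
class EarlyStopping:
    """Stop when validation loss hasn't improved for `patience` checks."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record one validation check; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss  # improvement: remember it, reset the counter
            self.counter = 0
        else:
            self.counter += 1     # no improvement this check
        return self.counter >= self.patience


# Simulated validation losses: the model improves, then plateaus.
stopper = EarlyStopping(patience=2)
history = [1.0, 0.8, 0.79, 0.80, 0.81]
stopped_at = next(i for i, loss in enumerate(history) if stopper.step(loss))
# Training stops at epoch index 4, after two checks without improvement.
```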
Module 4
Week of 02/02/2026
Module 4: Training for Tabular Data
  • 4.1 K-Fold Cross-Validation
  • 4.2 Training Schedules
  • 4.3 Dropout
  • 4.4 Batch Normalization
  • 4.5 RAPIDS for Tabular Data
  • Module 3 Program due: 02/03/2026
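The k-fold idea from Module 4.1 can be sketched in plain Python: partition the data indices into k folds and hold out one fold for validation per split. In practice you would use a library helper such as scikit-learn's `KFold`; this is just the underlying index arithmetic:

```python
def kfold_indices(n, k):
    """Yield (train_idx, val_idx) pairs splitting range(n) into k folds.

    Earlier folds absorb the remainder when n is not divisible by k.
    """
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size


# 10 samples, 3 folds -> fold sizes 4, 3, 3; every sample is validated once.
folds = list(kfold_indices(10, 3))
```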
Module 5
Week of 02/09/2026
Module 5: CNN and Computer Vision
  • 5.1 Image Processing
  • 5.2 Convolutional Neural Networks
  • 5.3 Pretrained Networks
  • 5.4 Image Augmentation
  • 5.5 YOLO
  • Module 4 Program due: 02/10/2026
Module 6
Meet on 02/16/2026
Module 6: ChatGPT and Large Language Models
  • 6.1 Transformers
  • 6.2 ChatGPT API
  • 6.3 LLM Memory
  • 6.4 Embeddings
  • 6.5 Prompt Engineering
  • Module 5 Program due: 02/17/2026
  • We will meet on campus this week (in-class meeting #3)
Module 7
Week of 02/23/2026
Module 7: Image Generative Models
  • 7.1 Generative AI
  • 7.2 StyleGAN3
  • 7.3 DeOldify
  • 7.4 Stable Diffusion
  • 7.5 DreamBooth
  • Module 6 Program due: 02/24/2026
Module 8
Meet on 03/02/2026
Module 8: Kaggle
  • 8.1 Introduction to Kaggle
  • 8.2 Ensembles
  • 8.3 Hyperparameters
  • 8.4 Bayesian Optimization
  • 8.5 Semester Kaggle
  • Module 7 Program due: 03/03/2026
  • We will meet on campus this week (in-class meeting #4)
Module 9
Week of 03/16/2026
Module 9: Facial Recognition
  • 9.1 Face Detection
  • 9.2 Facial Features
  • 9.3 Image Augmentation
  • 9.4 Emotion Detection
  • 9.5 Blink Efficiency
  • Module 8 Program due: 03/17/2026
Module 10
Week of 03/23/2026
Module 10: Time Series in PyTorch
  • 10.1 Time Series Encoding
  • 10.2 Seasonality and Trend
  • 10.3 LSTM Time Series
  • 10.4 CNN Time Series
  • 10.5 Meta Prophet
  • Module 9 Program due: 03/24/2026
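The time-series encoding topic above usually means sliding-window encoding: turning a 1-D series into (input window, next value) pairs before feeding an LSTM or CNN. A plain-Python sketch of the idea:

```python
def to_sequences(series, window):
    """Encode a 1-D series into (window, next-value) training pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])  # the input window
        y.append(series[i + window])    # the value to predict
    return X, y


X, y = to_sequences([1, 2, 3, 4, 5, 6], window=3)
# X = [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
# y = [4, 5, 6]
```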
Module 11
Week of 03/30/2026
Module 11: Natural Language Processing
  • 11.1 NLP Overview
  • 11.2 Hugging Face
  • 11.3 Tokenizers
  • 11.4 Datasets
  • 11.5 Model Training
  • Module 10 Program due: 03/31/2026
Module 12
Week of 04/06/2026
Module 12: Reinforcement Learning
  • 12.1 Gymnasium
  • 12.2 Q-Learning
  • 12.3 Stable Baselines
  • 12.4 Atari Games
  • 12.5 Future of RL
Module 13
Week of 04/13/2026
Module 13: Deployment and Monitoring
  • 13.1 Denoising Autoencoders
  • 13.2 Anomaly Detection
  • 13.3 Model Drift
  • 13.4 TPUs
  • 13.5 Future Directions
  • Kaggle Competition Closes: 04/19/2026 (midnight)
  • Kaggle Assignment due in Canvas: 04/21/2026
Week 14
Week of 04/20/2026
Wrap-up: discuss final Kaggle results and future directions of this technology.

