CSCI 680 Spring 2025: Deep Transfer Learning

Overview

Deep Transfer Learning is an advanced graduate-level course at the intersection of deep learning and transfer learning. It focuses on two major recent advances in the field: unsupervised domain adaptation and domain generalization. The course equips students with the knowledge and practical skills needed to develop deep learning algorithms that generalize to new data, covering both practical methodologies and cutting-edge research developments in deep transfer learning.
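
As a concrete taste of the first of these topics, here is a minimal sketch of the unsupervised domain adaptation setting, assuming PyTorch and synthetic data (it is not part of the official course materials): a classifier is trained on labeled source-domain data while a simple mean-matching penalty, standing in for the more sophisticated alignment criteria covered in class, pulls unlabeled target-domain features toward the source features. All names, data, and hyperparameters below are invented for illustration.

    # Illustrative unsupervised domain adaptation sketch (assumptions above):
    # train a classifier on labeled source data while nudging unlabeled
    # target features toward the source features.
    import torch
    import torch.nn as nn

    x_src = torch.randn(128, 10)         # labeled source-domain inputs
    y_src = torch.randint(0, 2, (128,))  # source labels
    x_tgt = torch.randn(128, 10) + 1.0   # unlabeled target inputs (shifted domain)

    feature = nn.Sequential(nn.Linear(10, 32), nn.ReLU())  # shared feature extractor
    classifier = nn.Linear(32, 2)
    opt = torch.optim.SGD(
        list(feature.parameters()) + list(classifier.parameters()), lr=0.1
    )
    ce = nn.CrossEntropyLoss()

    for step in range(100):
        opt.zero_grad()
        # Supervised loss on the labeled source domain only.
        loss_cls = ce(classifier(feature(x_src)), y_src)
        # Mean-matching alignment between source and target features
        # (a toy stand-in for adversarial or discrepancy-based criteria).
        loss_align = (feature(x_src).mean(0) - feature(x_tgt).mean(0)).pow(2).sum()
        (loss_cls + 0.1 * loss_align).backward()
        opt.step()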

Important Dates

Agenda Item           Due Date  Time   Location
Final Project Report  05/13     23:59  Blackboard
Final Exam            TBA       TBA    Small Physics Lab 235 (tentative)

Syllabus

The syllabus is available.

Where and When

Class is held synchronously every week and consists of a combination of lectures and office hours. Students are encouraged to attend both the lectures and the office hours each week. Two mandatory tests will be held. Students are also encouraged to reach out to the instructor(s) with any questions or concerns.

Instructor  Lecture Time     Lecture Location       Office Hours     Office Location             Email
Ashley Gao  M/W 14:00-15:20  Small Physics Lab 235  T/R 11:00-12:30  McGlothlin-Street Hall 004  ygao18@wm.edu

Homework(s)

This class will have one homework. Please check back on this page once it is announced in class that the homework has been posted. Homework is submitted through Blackboard.

The homework is a programming assignment. It will test your knowledge of the basics of deep learning as well as of deep transfer learning (unsupervised domain adaptation and domain generalization). It will not ask for reviews of your paper readings.

#  Out    Due    Materials
1  01/21  05/13  [Homework]

Grading

Lecture Schedule

Note that this schedule is tentative and will be updated once a topic is covered in the lecture(s).

Suggested readings are optional; they are resources we recommend to help you understand the course material. All of the textbooks listed below are freely available online.

Bishop = Pattern Recognition and Machine Learning, by Chris Bishop.
ESL = The Elements of Statistical Learning, by Hastie, Tibshirani, and Friedman.
MacKay = Information Theory, Inference, and Learning Algorithms, by David MacKay.
Barber = Bayesian Reasoning and Machine Learning, by David Barber.
MC = Multivariate Calculus, on YouTube.

1. 01/27
   Lecture: Logistics & Introduction to Deep Transfer Learning [Lecture]
   Suggested reading: Zhu et al. (2023), Visual Domain Adaptation and Generalization

2. 01/29, 02/03, 02/05
   Lecture: Linear Regression, Optimization [Lecture]
   Tutorial: Linear Regressor with SGD [Tutorial]
   Suggested readings: Bishop 3.1; ESL 3.1-3.2

3. 02/10, 02/12, 02/17
   Lecture: Logistic Regression, Multiclass Classification [Lecture]
   Tutorial: Logistic Regression [Tutorial]
   Suggested readings: Bishop 4.1, 4.3; ESL 4.1-4.2, 4.4, 11; MC: Partial Derivatives - Multivariable Calculus

4. 02/24, 02/27
   Lecture: Multilayer Perceptrons [Lecture]
   Tutorial: MLP [Tutorial]
   Suggested readings: Bishop 5.1-5.3

5. 03/03
   Lecture: Convolutional Neural Networks [Lecture]
   Tutorial: CNN [Tutorial]
   Suggested reading: Convolutional Neural Networks

6. 03/03
   Lecture: Recurrent Neural Networks [Lecture]
   Tutorial: RNN
   Suggested reading: Recurrent Neural Networks

7. 03/03
   Lecture: Attention and Transformers [Lecture]
   Tutorial: Vision Transformers [Tutorial]
   Suggested reading: Transformers

8. 03/05
   Lecture: Philosophy of ML/AI [Lecture]
   Suggested reading: Gao (2019), Personal Statement

Final Project

The final project is worth 25% of your total mark. It requires you to apply several algorithms to a challenge problem and to write a short report analyzing the results. The final project is an individual project: you are not allowed to collaborate with others.