Schedule

Note: Welcome to EECS 442. Unfortunately, 2 of the 5 optional discussion sections that originally appeared in the course calendar were listed in error and will not be offered (namely, the Friday 10:30am and Thursday 3:30pm sections). Please see here for the sections that are still offered. You are welcome to attend any of the 3 remaining sections, regardless of whether you are officially enrolled in them. Discussion sections largely review material covered in lecture, so attendance is completely optional, and we will record them for those who are unable to attend. We apologize for any inconvenience this caused, and hope that those who wanted to attend can still do so. Please note that the lecture will still take place at the usual time.


Lecture Date Topic Materials Assignments
Lec. 1 Mon, Aug. 28 Introduction
About the course
Neighborhood filtering
Blurring
Gradient filters
ps1 out (filtering)
Lec. 2 Wed, Aug. 30 Filtering
Convolution and cross-correlation
Edge detection
Nonlinear filtering
Sec. 1 Fri, Sep. 1 Linear algebra and filtering
Mon, Sep. 4 No class - Labor Day
Lec. 3 Wed, Sep. 6 Image pyramids
Gaussian pyramid
Laplacian pyramid
Image statistics
Sec. 2 Fri, Sep. 8 Fourier Transform
Lec. 4 Mon, Sep. 11 Frequency
Image bases
Fourier transform
Lec. 5 Wed, Sep. 13 Machine learning
Nearest neighbor
Linear regression
Overfitting
ps1 due
ps2 out (frequency)
Sec. 3 Fri, Sep. 15 Pyramids and Fourier Transform
Lec. 6 Mon, Sep. 18 Linear classifiers
Logistic regression
Stochastic gradient descent
Lec. 7 Wed, Sep. 20 Neural networks (Guest lecture: Sarah Jabbour)
Nonlinearities
Network structure
Regularization
ps2 due
ps3 out (intro to ML)
Sec. 4 Fri, Sep. 22 Machine learning tutorial
Lec. 8 Mon, Sep. 25 Optimization (Zoom lecture)
Computation graphs
Backpropagation
Momentum
Lec. 9 Wed, Sep. 27 Convolutional networks (Zoom lecture)
Convolution layers
Pooling
Normalization
ps3 due
ps4 out (backprop)
Sec. 5 Fri, Sep. 29 Backpropagation
Lec. 10 Mon, Oct. 2 Image Synthesis with GANs (Guest lecture: Daniel Geng)
Texture synthesis
GANs
Conditional GANs
Lec. 11 Wed, Oct. 4 Image Synthesis with Diffusion (Guest lecture: Sarah Jabbour, Yiming Dou, and Daniel Geng)
VQ-VAEs
Diffusion
ps4 due
ps5 out (scene recognition)
Sec. 6 Fri, Oct. 6 PyTorch tutorial
Lec. 12 Mon, Oct. 9 Object detection
Sliding window
Region-based CNNs
Instance segmentation
Lec. 13 Wed, Oct. 11 Temporal models
3D convolutions
Recurrent networks
LSTMs
ps5 due
ps6 out (image synthesis)
proposal info out
Sec. 7 Fri, Oct. 13 Office hours + GANs
Mon, Oct. 16 No class - Fall Break
Lec. 14 Wed, Oct. 18 Representation learning and language
Autoencoders
Self-supervision
Contrastive learning
Vision and language
Sec. 8 Fri, Oct. 20 Project office hours
Lec. 15 Mon, Oct. 23 Sound and touch
Neural nets for other signals
Multimodal self-supervision
Lec. 16 Wed, Oct. 25 Image formation
Camera models
Projection
Plenoptic function
ps6 due
ps7 out (representation learning)
Sec. 9 Fri, Oct. 27 Representation learning + project office hours
Lec. 17 Mon, Oct. 30 Multi-view geometry
Epipolar geometry
Stereo vision
Homographies
Lec. 18 Wed, Nov. 1 Fitting geometric models
Finding correspondences
Fitting a homography
RANSAC
Triangulation
Sec. 10 Fri, Nov. 3 Project office hours
Lec. 19 Mon, Nov. 6 Structure from motion
Structure from motion
Multi-view stereo
Stereo algorithms
Lec. 20 Wed, Nov. 8 Motion estimation
Optical flow
Aperture problem
Keypoints
ps7 due
ps8 out (panorama stitching)
Sec. 11 Fri, Nov. 10 Geometry + Project office hours
Lec. 21 Mon, Nov. 13 Light, shading, and color
Shape from shading
Intrinsic images
Color perception
Lec. 22 Wed, Nov. 15 Embodied vision
Learning from demonstrations
Reinforcement learning
Sec. 12 Fri, Nov. 17 Project office hours
Lec. 23 Mon, Nov. 20 Recent advances in network architectures
Implicit representations
NeRF
Vision transformers
ps8 due
Wed, Nov. 22 No class - Thanksgiving
Fri, Nov. 24 No class - Thanksgiving
Mon, Nov. 27 Image forensics
Fake images
Anomaly detection
Supervised detection
project guidelines out
ps9 out (view synthesis)
Lec. 24 Wed, Nov. 29 Bias
Datasets
Algorithmic fairness
Sec. 13 Fri, Dec. 1 Project office hours
Lec. 25 Mon, Dec. 4 Recent advances in learning-based motion estimation
Lec. 26 Wed, Dec. 6 Recent advances in learning-based 3D vision
ps9 due


Staff & Office Hours



Office Hours

Day Time Name Location
Monday 11:00am - 12:00pm Vishal Chandra EECS 3312
Tuesday 2:00pm - 3:00pm Yiming Dou EECS 3312
Wednesday 11:00am - 12:00pm Sarah Jabbour BBB 3941
1:00pm - 2:00pm Yuhang Ning EECS 3312
Thursday 2:00pm - 3:00pm Yueqi Wu EECS 3312
Friday 1:30pm - 2:30pm Andrew Owens EECS 4231

Office hours will be offered in person. We may add Zoom office hours as the course progresses.


Course information

EECS 442 is an introductory computer vision class. Class topics include low-level vision, object recognition, motion, 3D reconstruction, basic signal processing, and deep learning. We'll also touch on very recent advances, including image synthesis, self-supervised learning, and embodied perception.

Lectures:
Lectures will take place Monday and Wednesday, 3:00 - 4:30pm. Attendance will not be required, but it is highly encouraged. There are multiple ways to participate:

• In person in Stamps Auditorium.
• Live-streamed on Zoom. Please see here for the link. Due to the challenges of holding a hybrid lecture, we will prioritize the in-person experience. We cannot guarantee that we will always answer questions from Zoom attendees (those messages are unfortunately easy to miss!). In the event of technical problems, we will close the Zoom session and direct attendees to the lecture recording.
• We'll post lecture recordings online here.

Discussion section:
This class has three discussion sections. Please note that two of the sections listed on the course schedule will not be offered (namely, the Friday 10:30am and Thursday 3:30pm sections). You may attend any section, and attendance is not required. We'll post video recordings of one section, for those who are unable to attend.

Time Place
Thu 4:30-5:30pm 1303 EECS
Fri 12:30-1:30pm 220 Chrysler (new location)
Fri 1:30-2:30pm 1500 EECS (new location)

Some weeks, we'll host tutorials during these sections, in which GSIs will cover a topic in depth. These tutorials appear in the schedule. Attendance at these tutorials is optional, and recordings will be posted online. In other weeks, the discussion section will function as additional office hours and project discussion.

Prerequisites:

• This course puts a strong emphasis on mathematical methods. We'll cover a wide range of techniques in a short amount of time. Background in linear algebra is required. For a refresher, please see here. This material should mostly look familiar to you.
• This class will require a significant amount of programming. All programming will be completed in Python, using numerical libraries such as numpy, scipy, and PyTorch. In some assignments, we'll give you starter code; in others, we'll ask you to write a large amount of code from scratch.

Google Colab: The problem sets will be completed using Jupyter notebooks, generally using Google Colab. While this service is free, it is important to note that it comes with GPU usage limits: you may only use the GPUs on a given Google account for a certain number of hours per day. These limits are due to the fact that GPUs are very expensive. Since none of the problem sets require training large models, you may never encounter these limits. However, we have provided a few suggestions for avoiding them:

1. Reduce your GPU usage by debugging your code on the CPU. For example, after confirming that you can successfully complete a single training iteration on the CPU without an error, switch to the GPU. You can switch back to the CPU if you need to debug further errors.
2. Many of the machines in the CAEN computer labs have NVIDIA GPUs with 4GB or more of RAM, so you can connect to them remotely and train deep learning models. We have provided instructions for using these GPUs here. In our experience, this approach is less reliable than Colab, so we recommend using Colab when possible. Also, since the problem sets are designed for Colab, running them on CAEN will require minor modifications, which we've described in the tutorial.
3. Note that the limit is per account (e.g., per UMich email or Gmail account).
4. Consider purchasing Google Colab Pro ($10/month) during the portion of the class where GPUs are required (PS5 and onward; approximately 2 months). For students who would like to use this (optional) service, but are unable to afford it, we have been provided with a small amount of funding from the CSE DEI office. Please send the course staff a private message over Piazza if you would like to learn more about this option.
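The CPU-first workflow in suggestion 1 might look like the following in PyTorch. This is only a minimal sketch: the model, batch, and the `DEBUG_ON_CPU` flag are hypothetical stand-ins, not part of any problem set.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for a problem-set model and data batch.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
x = torch.randn(4, 8)
y = torch.randint(0, 2, (4,))

# Debug on the CPU first; flip this to False only after one full
# training iteration runs without errors.
DEBUG_ON_CPU = True
use_gpu = (not DEBUG_ON_CPU) and torch.cuda.is_available()
device = torch.device("cuda" if use_gpu else "cpu")

model = model.to(device)
x, y = x.to(device), y.to(device)

# One training iteration -- the part worth verifying on the CPU
# before spending any GPU quota.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
optimizer.zero_grad()
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
print(f"one iteration ran on {device}, loss = {loss.item():.4f}")
```

Because the device is chosen in one place, switching between CPU debugging and GPU training is a one-line change rather than a rewrite.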

Q&A: This course has a Piazza forum, where you can ask public questions. If you cannot make your post public (e.g., because it would reveal problem set solutions), please mark your post private, or come to office hours. Please note, however, that the course staff cannot provide help debugging code, and there is no guarantee that they'll be able to answer all questions — especially last-minute questions about the homework. We also greatly appreciate it when you respond to questions from other students! If you have an important question that you would prefer to discuss over email, you may email the course staff (eecs442-fa23-staff@umich.edu), or you can contact the instructor by email directly.

Homework: There will be homework assignments approximately every week. All programming assignments are to be completed in Python, using the starter code that we provide. Assignments will always be due at midnight (11:59pm) on the due date. The assignments will all be worth approximately equal amounts. Written problems will usually be submitted to Gradescope. You may be asked to annotate your pdf (e.g., by selecting your solution to each problem).

Final project: In lieu of a final exam, we'll have a final project. This project will be completed in small groups during the last weeks of the class. The direction for this project is open-ended: you can either choose from a list of project ideas that we distribute, or you can propose a topic of your own. A short project proposal will be due approximately halfway through the course. During the final exam period, you'll turn in a final report and give a short presentation. You may use an ongoing research work for your final project, as long as it meets the requirements.

Textbook: There are no required textbooks to purchase. We'll be using draft versions of two books:

• Torralba, Isola, Freeman. Foundations of Computer Vision. (draft manuscript chapters provided in class)
• Szeliski. Computer Vision: Algorithms and Applications, 2nd edition draft. (available for free online)

The following textbooks may also be useful as references:

• Goodfellow, Bengio, Courville. Deep Learning. (available for free online)
• Hartley and Zisserman. Multiple View Geometry in Computer Vision.
• Forsyth and Ponce. Computer Vision: A Modern Approach.

Acknowledgements: This course uses material from MIT's 6.869: Advances in Computer Vision and its associated textbook manuscript, Foundations of Computer Vision, by Antonio Torralba, Phillip Isola, and William Freeman. It also includes lecture slides from other researchers, including Svetlana Lazebnik, Alexei Efros, David Fouhey, and Noah Snavely (please see acknowledgments in the lecture slides).

Late policy: You'll have 5 late days to use over the course of the semester. Each time you use one, you may submit a homework assignment one day late without penalty. You are allowed to use multiple late days on a single assignment. For example, you can use all of your days at once to turn in one assignment a week late. You do not need to notify us when you use a late day; we'll deduct it automatically. If you run out of late days and still submit late, your assignment will be penalized at a rate of 1% per hour. If you edit your assignment after the deadline, this will count as a late submission, and we'll use the revision time as the date of submission (after a short grace period of a few minutes). We will not provide additional late time, except under exceptional circumstances, and for these we'll require documentation (e.g., a doctor's note). Please note that the late days are provided to help you deal with minor setbacks, such as routine illness or injury, paper deadlines, interviews, and computer problems; these do not generally qualify for an additional extension.

Please note that, due to the number of late days available, there will be a long (2+ week) lag between the time of submission and the time that grades are released. We'll need to wait for the late submissions to arrive before we can complete the grading.

Regrade requests: If you think there was a grading error, you'll have 9 days to submit a regrade request, using Gradescope. This will be a strict deadline, even for significant mistakes such as missing grades, so please look over your graded assignments!

Support: The Counseling and Psychological Services center (CAPS) provides support for a variety of issues, including mental health and stress.

Grading:

• Grades will be computed as follows, with all homeworks equally weighted:
  Homework 70%
  Final project 30%
• We'll use these approximate grade thresholds:
  A+ Curved
  A 92%
  A- 90%
  B+ 88%
  B 82%
  B- 80%
  C+ 78%
  C 72%
  C- 70%
  These are lower bounds on letter grades. For example, if you get an 81%, you will get a B- or better. We may gently curve the class up, in a way that would only improve your letter grade: e.g., after the curve, an 81% might round up to a B, but it would not round down to a C+. To ensure consistency in grading, we will not round (e.g., 87.99% is a B), and we will not consider regrade requests outside of the usual time window.

Academic integrity: While you are encouraged to discuss homework assignments with other students, your programming work must be completed individually. You must also write up your solution on your own. You may not search for solutions online, or use existing implementations of the algorithms in the assignments. Please see the Michigan Engineering Honor Code for more information.