Exam Times:

  • MAT 141 Monday 12/8 from 2 to 4:30pm
  • MAT 304 Monday 12/8 from 5:30 to 8pm
  • MAT 531 Tuesday 12/9 from 6 to 8:30pm
  • MAT 133 Friday 12/12 from 2 to 4:30pm

Finals Week Office Hours:

  • Monday 12/8: 1-2pm
  • Tuesday 12/9: 3-6pm
  • Friday 12/12: 1-2pm

  • Fork or use the Template for the GitHub repository Mmmmm..Machine_Learning. Here are the slides from the first day discussing this process: Codespaces Slides. The advantage of forking is that you can synchronize your copy when I make changes; the advantage of using the template is that you get your own independent copy. The Class_Files folder in the Mmmmm..Machine_Learning repository is where I will post demonstrations done in class.

There is a proposed calendar on the syllabus, but here I will record what we actually get through in each class.

  • 11/18 – We started talking about calculus in multiple dimensions. We specifically covered how to think about lines, planes, and hyperplanes. We then briefly reviewed single-variable calculus, specifically derivatives as slopes and their definition as limits of difference quotients.
  • 11/11 – We finished the basic content for Unit 1.5. We will do some examples next week and then start Unit 2. The exam for Unit 1.5 will be on 11/25/2025. Look at the material in Lay’s text for sections 6.5 and 7.1-7.5. In particular, look at the basic skill-type questions (i.e., the first 15 to 20 problems in each section; focus on the odd problems, which have answers in the back of the text).
  • 11/4 – We started class by reviewing how we can find minimum and maximum values for a quadratic form subject to a constraint such as \(\left\Vert \vec{x}\right\Vert=1\). We spent time discussing column spaces, row spaces, and null spaces for matrices. We also discussed eigenvectors and eigenvalues again. We will be bringing some of this together in the next class.
  • 10/28 – We started looking at the slides on symmetric matrices and quadratic forms. We discussed how we can carry out a change of variables and the Cholesky Factorization. We will look at some additional examples of this at the start of class next week before moving on to optimization.
  • 10/21 – We went over the Unit 1 exam; please go back over this material since it is foundational to the material in Unit 1.5. We finished going over the Orthogonalization and Regression slides; we’ll start the slides on Symmetric Matrices and Optimization next. We will set a date for the Unit 1.5 exam when we have a solid idea of when we will have finished the content; it will likely be on November 17th.
  • 10/14/2025 – Unit 1 Exam
  • 10/7/2025 – Finished material from unit 1
  • 9/30/2025 – We worked through most of the fifth set of slides and will pick up there on the 7th of October.
  • 9/23/2025 – We finished the fourth set of slides, which was on Eigenvectors and Eigenvalues. I also said that, depending on how close we are to the end of the unit, we will either move the date of the exam or change its content, and that we would decide this on October 7th.
  • 9/16/2025 – We finished the first three sets of slides; these constitute the material you should have seen before coming into the class. We started the fourth slide deck and will continue with it on 9/23. There is a copy of Linear Algebra and Its Applications, 5th Edition, by Lay, Lay, and McDonald on reserve in the library, and below I have listed the sections of the text we are covering for each possible exam topic. Currently the exam is set for October 14th, but we will revisit what it will cover and when it will be given on 10/7/2025, when we have a better idea of how far we have gotten.
  • 9/9/2025 – We have finished the first two sets of slides listed below. Please be sure to review these before the next class.
  • 9/2/2025 – Tonight we started looking at the Linear Algebra review slides. Things moved a little too slowly; please look through the first couple of sets of slides before next class. You don’t have to learn the material; just try to skim through it so that we can move more quickly next class.

Complete at least 4 of the following projects/labs and submit them by the end of final exam week, Friday 12/12/2025. You may complete the others for up to +6% extra credit per additional assignment.

Most of the projects below are from the Projects for Linear Algebra with Applications by Lay.

  • Complete this project Power Method for Finding Eigenvalues using Python and NumPy. In addition to the usual commands already introduced for manipulating matrices, you will want to use the following (see the first sketch after this list):
    • \(numpy.linalg.matrix\_power(A, n)\) computes the \(n^{th}\) power of \(A\)
    • \(A[:,i]\) returns the \(i^{th}\) column of \(A\), note that you need to start counting at 0.
  • Complete this project Finding Roots of Polynomials with Eigenvalues using Python and NumPy. In addition to the usual commands already introduced for manipulating matrices, you will want to use the following (see the second sketch after this list):
    • \(numpy.poly(A)\) gives the coefficients of the Characteristic Polynomial of \(A\)
    • \(numpy.linalg.det(A)\) gives the determinant of a matrix \(A\)
    • \(numpy.linalg.qr(A)\) returns \(Q\) and \(R\) for the QR-decomposition, \(A=QR\)
    • \(numpy.triu(A)\) changes all the entries below the main diagonal of \(A\) to zero; you can use this to check whether a matrix is upper triangular
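
For the power method project, here is a minimal sketch in NumPy. The matrix, starting vector, and iteration count are illustrative stand-ins, not the project’s actual data:

```python
import numpy as np

# Illustrative symmetric matrix; substitute the matrix from the project.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

x = A[:, 0] / np.linalg.norm(A[:, 0])  # start from the first column of A
for _ in range(50):
    y = A @ x                          # one application of A per step; equivalently,
                                       # np.linalg.matrix_power(A, k) @ x does k steps at once
    x = y / np.linalg.norm(y)          # renormalize so the entries stay bounded

lam = x @ A @ x                          # Rayleigh quotient estimates the dominant eigenvalue
print(lam, np.linalg.eigvalsh(A).max())  # the two values should agree closely
```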
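
For the roots-of-polynomials project, here is a sketch of how the hinted functions can fit together: build the companion matrix of a polynomial and run a naive QR iteration. The polynomial here is illustrative, and plain unshifted QR iteration only behaves this nicely when the eigenvalues are real and well separated:

```python
import numpy as np

# Companion matrix of p(x) = x^3 - 6x^2 + 11x - 6 (roots 1, 2, 3);
# this polynomial is illustrative, not the one from the project.
C = np.array([[6.0, -11.0, 6.0],
              [1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0]])

print(np.poly(C))          # recovers the coefficients [1, -6, 11, -6]

# Naive QR iteration: set A_{k+1} = R_k Q_k; for nice matrices this converges
# to an upper triangular matrix whose diagonal holds the eigenvalues.
A = C.copy()
for _ in range(100):
    Q, R = np.linalg.qr(A)
    A = R @ Q

print(np.diag(A))                             # approximate roots of p
print(np.allclose(A, np.triu(A), atol=1e-8))  # check A is (nearly) upper triangular
```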

Here are notebooks in the class repository Mmmmm..Machine_Learning that have assignments for you to complete:

  • Linear Regression Practice: Read through the Jupyter Notebook Linear_Regression_Exercise.ipynb, linked here and available in the class repository. Make sure you understand the code and math being carried out, then complete the assignments listed at the top of the notebook.
  • Try-Except Lab: Read through the Jupyter Notebook Try_Except_Default_Exercise.ipynb, linked here and available in the class repository.
  • Gram-Schmidt Lab: Read through the Jupyter Notebook Gram-Schmidt_Programming_Exercise.ipynb, linked here and available in the class repository (a minimal Gram-Schmidt sketch follows this list).
  • Gradient Descent Lab: Read through the Jupyter Notebook Gradient Descent Lab.ipynb, which discusses gradient descent, and try to finish the exercises at the end (a gradient descent example also follows this list).
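
If you want a starting point for the Gram-Schmidt lab, here is a minimal classical Gram-Schmidt sketch; it is not the notebook’s solution, and the test matrix is illustrative:

```python
import numpy as np

def gram_schmidt(A):
    """Return a matrix whose columns are an orthonormal basis for the
    column space of A (columns assumed linearly independent)."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):                 # subtract projections onto earlier q's
            v = v - (Q[:, i] @ A[:, j]) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)    # normalize
    return Q

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q = gram_schmidt(A)
print(np.round(Q.T @ Q, 10))               # identity => columns are orthonormal
```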
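
Similarly, for the gradient descent lab, here is a small gradient descent example for least-squares regression; the data, step size, and iteration count are illustrative choices, not the lab’s:

```python
import numpy as np

# Synthetic data: y = 2 + 0.5x plus noise (illustrative, not the lab's data).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])  # design matrix [1, x]
beta_true = np.array([2.0, 0.5])
y = X @ beta_true + rng.normal(0, 0.1, 50)

beta = np.zeros(2)
eta = 0.01                                    # step size (learning rate)
for _ in range(5000):
    grad = 2 * X.T @ (X @ beta - y) / len(y)  # gradient of the mean squared error
    beta = beta - eta * grad                  # step downhill

print(beta)                                   # should be close to beta_true
print(np.linalg.lstsq(X, y, rcond=None)[0])   # closed-form least-squares check
```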

For the exam you should be able to demonstrate facility with or knowledge of the concepts listed here. For each, go to the exercises in the corresponding text section and try a sample of the odd exercises (since the answers are in the back).

  • Diagonalization of Matrices and Eigenvalues/Eigenvectors (Sections 5.1-5.4 in Lay’s Text)
  • Inner Products (Section 6.1 in Lay’s Text)
  • Orthogonal Projections (Sections 6.2-6.3 in Lay’s Text)
  • Gram-Schmidt Orthogonalization Process (Section 6.4 in Lay’s Text)
  • Least-Squares Regression with Matrices (Section 6.5 in Lay’s Text)
  • Matrix Decompositions (a short numerical check follows this list):
    • LU – Decomposition (Section 2.5 in Lay’s Text)
    • Diagonalization (Sections 5.3 & 7.1 in Lay’s Text)
    • QR – Decomposition (Section 6.4 in Lay’s Text)
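
As a quick numerical companion to the decompositions above, this sketch checks each factorization; the matrix is illustrative, and note that NumPy itself has no LU routine, so the SciPy import is an addition beyond the NumPy functions mentioned elsewhere on this page:

```python
import numpy as np
from scipy.linalg import lu   # NumPy has no LU routine; SciPy provides one

A = np.array([[4.0, 3.0], [6.0, 3.0]])   # illustrative matrix

# LU decomposition (with row permutations): A = P L U
P, L, U = lu(A)
print(np.allclose(A, P @ L @ U))

# Diagonalization: A = V D V^{-1}, where the columns of V are eigenvectors
w, V = np.linalg.eig(A)
print(np.allclose(A, V @ np.diag(w) @ np.linalg.inv(V)))

# QR decomposition: A = Q R with Q orthogonal and R upper triangular
Q, R = np.linalg.qr(A)
print(np.allclose(A, Q @ R))
```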

For the exam you should be able to demonstrate facility with or knowledge of the concepts listed here. For each, go to the exercises in the corresponding text section and try a sample of the odd exercises (since the answers are in the back).

  • Least-Squares Regression with Matrices (Section 6.5 in Lay’s Text)
  • Symmetric Matrices (Section 7.1 in Lay’s Text)
  • Quadratic Forms and Constrained Optimization (Sections 7.2-7.3 in Lay’s Text)
  • Principal Component Analysis (Section 7.5 in Lay’s Text)
  • Matrix Decompositions (a short numerical check follows this list):
    • Cholesky Decomposition (Section 7.3 in Lay’s Text)
    • Singular Value Decomposition (Section 7.4 in Lay’s Text)
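
Likewise, here is a short NumPy check of the Cholesky and singular value decompositions; both matrices are illustrative:

```python
import numpy as np

# Symmetric positive definite matrix (illustrative)
A = np.array([[4.0, 2.0], [2.0, 3.0]])

# Cholesky: A = L L^T with L lower triangular
L = np.linalg.cholesky(A)
print(np.allclose(A, L @ L.T))

# SVD: A = U Sigma V^T; works for any matrix, even non-square ones
B = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
U, s, Vt = np.linalg.svd(B)
Sigma = np.zeros_like(B)            # Sigma must match B's shape
np.fill_diagonal(Sigma, s)          # singular values go on the diagonal
print(np.allclose(B, U @ Sigma @ Vt))
```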

Roughly speaking, we are going to try to cover Chapter 4 of Calculus Volume 3 from OpenStax, by Strang and Herman (an open text). For the exam you should be able to demonstrate facility with or knowledge of the following:

  • 5 Questions from Unit 1 Exam
  • 5 Questions from Unit 1.5 Exam
  • Lines, Planes, and Hyperplanes in Higher Dimensions
  • Limits in Higher Dimensions (Section 4.2)
  • Partial Differentiation and the Chain Rule (Sections 4.3 & 4.5)
  • Linear and Quadratic Approximations (Section 4.4)
  • Gradients, Directional Derivatives, and Gradient Descent (Section 4.6; see the numerical sketch after this list)
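
For the last topic, a central-difference approximation is a handy way to sanity-check hand-computed partial derivatives and directional derivatives; the function and point below are illustrative:

```python
import numpy as np

def f(x):
    return x[0]**2 + 3.0 * x[0] * x[1]    # f(x, y) = x^2 + 3xy (illustrative)

def num_grad(f, x, h=1e-6):
    """Approximate the gradient of f at x by central differences."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)   # central difference in coordinate i
    return g

p = np.array([1.0, 2.0])
g = num_grad(f, p)
print(g)                        # analytic gradient at (1, 2): (2x + 3y, 3x) = (8, 3)

u = np.array([3.0, 4.0]) / 5.0  # a unit direction vector
print(g @ u)                    # directional derivative of f at p in direction u
```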