Mathematics for Data Science II

An advanced course in linear algebra, multivariable calculus, and optimization, tailored for applications in machine learning and data science.

Building on the foundations of the first course, Mathematics for Data Science II delved into the core mathematical machinery that powers machine learning. The first half of the course was a deep exploration of linear algebra, starting with vectors and matrices and progressing through vector spaces, bases, and linear transformations; along the way I gained a strong understanding of concepts like orthogonality and the Gram-Schmidt process. The second half transitioned to multivariable calculus, which is crucial for understanding optimization. We covered partial derivatives, gradients, and the Hessian matrix, learning how to find critical points and local extrema of multivariable functions, skills that apply directly to training machine learning models.
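As a small illustration of the orthogonalization material, here is a sketch of the classical Gram-Schmidt process in NumPy (NumPy itself is my choice of tool, not something prescribed by the course): it takes a set of linearly independent vectors and produces an orthonormal basis for their span.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize the rows of `vectors` via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(w, q) * q  # subtract projection onto each basis vector
        norm = np.linalg.norm(w)
        if norm > 1e-12:              # skip vectors that are (numerically) dependent
            basis.append(w / norm)
    return np.array(basis)

# Two independent vectors in R^3 -> two orthonormal vectors spanning the same plane
Q = gram_schmidt(np.array([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]))
```

Because the rows of `Q` are orthonormal, `Q @ Q.T` is the identity matrix, which makes the result easy to check.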


Instructor

Prof. Sarang S Sane, Department of Mathematics, IIT Madras


Course Schedule & Topics

The course was structured over 12 weeks, with the final week reserved for revision, focusing on linear algebra and multivariable calculus.

| Week | Primary Focus | Key Topics Covered |
| --- | --- | --- |
| 1 | Fundamentals of Linear Algebra | Vectors, matrices, systems of linear equations, and determinants. |
| 2 | Solving Linear Systems | Cramer's Rule, echelon form, row reduction, and the Gaussian elimination method. |
| 3 | Vector Spaces | Introduction to abstract vector spaces, their properties, linear dependence, and linear independence. |
| 4 | Basis & Dimension | Defining a basis for a vector space, finding bases, understanding rank and dimension, and using Gaussian elimination to compute them. |
| 5 | Rank, Nullity & Linear Maps | The null space of a matrix and its dimension (nullity); introduction to linear mappings and transformations. |
| 6 | Properties of Linear Transforms | Matrix representations of linear transformations; finding the kernel and image of linear transformations and their bases. |
| 7 | Inner Product Spaces | Equivalent and similar matrices, affine subspaces, and defining inner products, norms, lengths, and angles in a vector space. |
| 8 | Orthogonality | Orthogonal and orthonormal bases, projections using inner products, the Gram-Schmidt process, and orthogonal transformations. |
| 9 | Multivariable Calculus I | Visualizing multivariable functions, partial derivatives, directional derivatives, gradients, limits, and continuity. |
| 10 | Optimization Concepts | Direction of steepest ascent/descent, tangent (hyper)planes, and finding critical points of multivariable functions. |
| 11 | Multivariable Calculus II | Higher-order partial derivatives, the Hessian matrix, using the Hessian to identify local extrema, and differentiability. |
| 12 | Revision Week | Comprehensive review of all linear algebra and multivariable calculus topics covered in the course. |
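To make the Week 2 material concrete, here is a sketch of Gaussian elimination with partial pivoting for solving a square system Ax = b. This is my own illustrative implementation, not course-provided code, and it assumes A is invertible.

```python
import numpy as np

def solve_gaussian(A, b):
    """Solve A x = b for invertible A via Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: reduce A to upper-triangular (echelon) form.
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))      # pick the largest pivot in column k
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```

Partial pivoting (swapping in the row with the largest entry in the current column) keeps the multipliers small, which is what makes the method numerically stable in practice.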
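The Week 11 second-derivative test can likewise be sketched in a few lines: at a critical point, the signs of the Hessian's eigenvalues classify the point. The example function f(x, y) = x^3 - 3x + y^2 is my own (its gradient (3x^2 - 3, 2y) vanishes at (1, 0) and (-1, 0)); NumPy is again an assumption beyond the syllabus.

```python
import numpy as np

def classify_critical_point(hessian):
    """Classify a critical point from the eigenvalues of the Hessian there."""
    eigs = np.linalg.eigvalsh(hessian)
    if np.all(eigs > 0):
        return "local minimum"          # Hessian positive definite
    if np.all(eigs < 0):
        return "local maximum"          # Hessian negative definite
    if np.any(eigs > 0) and np.any(eigs < 0):
        return "saddle point"           # Hessian indefinite
    return "inconclusive"               # a zero eigenvalue: test fails

# f(x, y) = x**3 - 3*x + y**2 has Hessian [[6x, 0], [0, 2]]
def hessian_f(x, y):
    return np.array([[6.0 * x, 0.0], [0.0, 2.0]])

print(classify_critical_point(hessian_f(1.0, 0.0)))    # local minimum
print(classify_critical_point(hessian_f(-1.0, 0.0)))   # saddle point
```

At (1, 0) the eigenvalues are 6 and 2 (both positive), giving a local minimum; at (-1, 0) they are -6 and 2 (mixed signs), giving a saddle point.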

Material used