# Linear Algebra – The Gram–Schmidt process

Mathematics for Machine Learning: Linear Algebra, Module 4: Matrices make linear mappings

To get a certificate, subscribe at: https://www.coursera.org/learn/linear-algebra-machine-learning/home/welcome

============================
Mathematics for Machine Learning: Linear Algebra:

About this course: In this course on linear algebra we look at what linear algebra is and how it relates to vectors and matrices. Then we look at what vectors and matrices are and how to work with them, including the knotty problem of eigenvalues and eigenvectors, and how to use these to solve problems. Finally, we look at how to use these to do fun things with datasets – like how to rotate images of faces and how to extract eigenvectors to see how the PageRank algorithm works. Since we’re aiming at data-driven applications, we’ll be implementing some of these ideas in code, not just with pencil and paper. Towards the end of the course, you’ll write code blocks and encounter Jupyter notebooks in Python, but don’t worry: these will be quite short, focussed on the concepts, and will guide you through even if you’ve not coded before. At the end of this course you will have an intuitive understanding of vectors and matrices that will help you bridge the gap into linear algebra problems, and you’ll know how to apply these concepts to machine learning.

Who is this class for: This course is for people who want to refresh their maths skills in linear algebra, particularly for the purposes of doing, or learning about, data science and machine learning. We look at vectors and matrices, how to use them to solve linear systems of equations, and how to apply them to computational problems.

Created by: Imperial College London

Module 4: Matrices make linear mappings

In Module 4, we continue our discussion of matrices. First we think about how to code up matrix multiplication and matrix operations using the Einstein Summation Convention, a widely used notation in more advanced linear algebra courses. Then, we look at how matrices can transform a description of a vector from one basis (set of axes) to another. This will allow us to, for example, figure out how to apply a reflection to an image and manipulate images. We’ll also look at how to construct a convenient basis vector set in order to do such transformations. Finally, we’ll write code to perform these transformations computationally.
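As a sketch of these two ideas (not the course's own notebooks), the snippet below uses NumPy: `np.einsum` expresses the summation-convention rule $(AB)_{ik} = A_{ij}B_{jk}$ directly, and a change of basis turns a simple reflection (flip the second coordinate) into a reflection in the line $y = x$ by wrapping it in the basis-change matrix $P$ whose columns are the new basis vectors. The specific matrices here are illustrative choices, not taken from the course.

```python
import numpy as np

# Einstein summation: the repeated index j is summed over,
# so 'ij,jk->ik' is exactly matrix multiplication.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
C = np.einsum('ij,jk->ik', A, B)   # same result as A @ B

# Change of basis: columns of P are an orthonormal basis aligned
# with the line y = x (first vector along it, second perpendicular).
P = np.array([[1.0, -1.0],
              [1.0,  1.0]]) / np.sqrt(2)

# In the new basis the reflection is trivial: keep the component
# along the line, flip the perpendicular component.
D = np.diag([1.0, -1.0])

# Transform back to the standard basis: T = P D P^{-1}
# (P is orthogonal, so its inverse is its transpose).
T = P @ D @ P.T
# T is [[0, 1], [1, 0]], the reflection in y = x: it swaps x and y.
```

The same pattern, assemble the easy transformation in a convenient basis and conjugate by the basis-change matrix, is what makes choosing a good basis so useful.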

Learning Objectives
• Identify matrices as operators
• Relate the transformation matrix to a set of new basis vectors
• Formulate code for mappings based on these transformation matrices
• Write code to find an orthonormal basis set computationally
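The last objective is exactly the Gram–Schmidt process of this module's title: take a list of vectors, subtract from each one its components along the already-accepted basis vectors, and normalise what remains. A minimal NumPy sketch (the function name and tolerance are my own choices, not the course's notebook code):

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Return an orthonormal basis for the span of `vectors` (as rows)."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of w onto each basis vector found so far.
        for u in basis:
            w = w - np.dot(w, u) * u
        norm = np.linalg.norm(w)
        # If nothing is left, v was linearly dependent on earlier vectors.
        if norm > tol:
            basis.append(w / norm)
    return np.array(basis)

# Three linearly independent vectors in R^3:
vectors = [[1.0, 1.0, 0.0],
           [1.0, 0.0, 1.0],
           [0.0, 1.0, 1.0]]
E = gram_schmidt(vectors)
# The rows of E are orthonormal, so E @ E.T is the identity.
```

This is the classical form of the algorithm; for large or nearly-dependent vector sets, the modified Gram–Schmidt variant (or a QR factorisation such as `np.linalg.qr`) is numerically more stable.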