Hi everyone. This is Professor Yun from KAIST. We are studying Mathematics for the AI Beginner, Part 1: Linear Algebra. This is Session 2 of Week 6. In the last session, we studied what diagonalization means for a matrix A and what the condition for diagonalization is, and we finished with one example. Let's study diagonalization more deeply with a different example. Here I bring one matrix A, which is a 3-by-3 matrix: 1, 0, 0; 2, 1, 0; 3, 4, 2. To diagonalize matrix A, we first need to check whether it can be diagonalized or not. This matrix A also has only two distinct eigenvalues, 1 and 2, the same situation as in Session 1. Let's look at the eigenvectors. Here is the question we need to check: "can you diagonalize the matrix A" means "can you find three linearly independent eigenvectors?" So we need to check whether we can get three linearly independent eigenvectors here. For the eigenvalue lambda = 1, we find eigenvectors (x, y, z) = s(0, 1, -4). For the eigenvalue lambda = 2, we find (x, y, z) = t(0, 0, 1). For the first eigenvalue, lambda = 1, we can take s = 1 to get one eigenvector, (0, 1, -4). If you use a number other than 1, for example s = 2 or s = 3, the result cannot be a new linearly independent eigenvector, because any such choice gives an eigenvector that is linearly dependent on (0, 1, -4). So we cannot take another linearly independent eigenvector from lambda = 1. For lambda = 2, let's take t = 1 to obtain (0, 0, 1). Here again, we cannot take another eigenvector, because any other choice gives an eigenvector that is linearly dependent on (0, 0, 1). So we can only get two linearly independent eigenvectors, which means we cannot formulate the diagonalization for matrix A. In this way, we can easily check whether a matrix A can be diagonalized or not.
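The check above can be sketched numerically with NumPy (this code is not part of the lecture, just a quick sanity check): for each distinct eigenvalue, the number of linearly independent eigenvectors is n minus the rank of A minus lambda times the identity, and the matrix is diagonalizable only if these counts add up to n.

```python
import numpy as np

# Matrix from the example (lower triangular, so eigenvalues sit on the diagonal)
A = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [3.0, 4.0, 2.0]])

n = A.shape[0]
total_independent = 0
for lam in [1.0, 2.0]:
    # Geometric multiplicity = dimension of the eigenspace for lam
    geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
    print(f"lambda = {lam}: {geo_mult} independent eigenvector(s)")
    total_independent += geo_mult

# Diagonalizable only if we can collect n linearly independent eigenvectors
print("diagonalizable:", total_independent == n)
```

As in the lecture, each eigenvalue contributes only one independent eigenvector, giving two in total, so the final line prints `diagonalizable: False`.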
The good news is that if an n-by-n matrix has n distinct eigenvalues — for example, a 3-by-3 matrix with three different distinct eigenvalues — then the eigenvectors taken from those eigenvalues are automatically linearly independent, so we don't need to check. If an n-by-n matrix A has n distinct eigenvalues, you can select one eigenvector from each eigenvalue to form n linearly independent eigenvectors, and in this case matrix A is diagonalizable. However, as shown in the last two examples — both were 3-by-3 matrices with only two distinct eigenvalues — if an n-by-n matrix A has fewer than n distinct eigenvalues, then it may or may not be diagonalizable, depending on whether we can find n linearly independent eigenvectors. That is a different case: when you encounter such a problem, for the diagonalization you need to check whether you can get the linearly independent eigenvectors. Another piece of good news is that if A is a symmetric matrix, then we can construct the matrix P in such a way that P inverse can be easily found. Before we get into this theorem, I want to introduce the transpose of a matrix A, denoted A^T. If matrix A is 1, 2, 3; 4, 5, 6; 7, 8, 9, then what is A^T? The first row of A, (1, 2, 3), becomes the first column of A^T; the second row of A, (4, 5, 6), becomes the second column of A^T; and the third row of A, (7, 8, 9), becomes the third column of A^T. The definition of a symmetric matrix is: if A = A^T, then we say A is a symmetric matrix. For example, here A is 1, 3, 5; 3, 0, 6; 5, 6, 9, and what is A^T? The first row of A, (1, 3, 5), becomes the first column of A^T; the second row, (3, 0, 6), becomes the second column of A^T; and the third row, (5, 6, 9), becomes the third column of A^T — so A^T equals A.
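The transpose and the symmetry test can be illustrated in a few lines of NumPy (an illustration I am adding, not part of the lecture):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Transpose: row i of A becomes column i of A^T
print(A.T)
# First column of A.T is (1, 2, 3), the first row of A

# Symmetric matrix from the lecture: S equals its own transpose
S = np.array([[1, 3, 5],
              [3, 0, 6],
              [5, 6, 9]])
print("A symmetric:", np.array_equal(A, A.T))  # False
print("S symmetric:", np.array_equal(S, S.T))  # True
```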
Then you can see that the 3, the 6, and the 5 are symmetric across the diagonal. That is the definition of a symmetric matrix. Here is a theorem relating symmetric matrices to diagonalization. If A is an n-by-n symmetric matrix whose elements are real numbers, then A has only real eigenvalues. Furthermore, if the symmetric matrix A has n different distinct eigenvalues lambda_1, lambda_2, ..., lambda_n with unit norm eigenvectors x_1, x_2, ..., x_n respectively, then we can form an orthogonal matrix P such that P inverse equals P^T. So we can write A = PDP^{-1} = PDP^T. Let's study this theorem with an example. To do that, we need to understand what the norm of a vector x is. The norm of a vector x is denoted ||x||, with double bars on the left-hand side and the right-hand side. The norm of a vector is similar to the absolute value of a scalar number. For example, if the vector x = (x_1, x_2, ..., x_n), then the norm of x is the square root of x_1 squared plus x_2 squared plus ... plus x_n squared. The norm of x is also related to the length of the vector x. Let's diagonalize A = 1, 0, 1; 0, 1, 1; 1, 1, 0. As you see, matrix A is a symmetric matrix: its elements are symmetric across the diagonal. So we can use the theorem. Let's find the eigenvalues first. We have three different eigenvalues here. For lambda = 1, the corresponding eigenvector is (-t, t, 0). Then we should find a unit norm eigenvector, which means its norm is 1. What is the norm of this eigenvector? The square root of (-t) squared plus t squared plus 0 squared equals 1, so t = 1 over the square root of 2. So the unit norm eigenvector for lambda = 1 is (1/sqrt(2)) times (-1, 1, 0). Unit norm means that the value of the norm equals 1; that is why we call it a unit norm eigenvector. So you can find the unit norm eigenvector for lambda = 1.
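The normalization step can be sketched with NumPy (my addition for illustration): dividing the eigenvector direction by its norm is exactly the same as solving for t = 1/sqrt(2) by hand.

```python
import numpy as np

# Eigenvector direction for lambda = 1 (any t gives the same direction)
v = np.array([-1.0, 1.0, 0.0])

# ||v|| = sqrt((-1)^2 + 1^2 + 0^2) = sqrt(2)
norm_v = np.linalg.norm(v)

# Dividing by the norm picks t = 1/sqrt(2), giving a unit norm eigenvector
u = v / norm_v
print(u, "norm =", np.linalg.norm(u))

# Check it really is an eigenvector of A with eigenvalue 1
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
print(np.allclose(A @ u, 1.0 * u))  # True
```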
In the same way, for lambda = -1 the eigenvector is (-p/2, -p/2, p). For unit norm, the square root of (-p/2) squared plus (-p/2) squared plus p squared equals 1, which gives p = 2 over the square root of 6. We can take p = 2/sqrt(6), and then the unit norm eigenvector is (1/sqrt(6)) times (-1, -1, 2). For the third eigenvalue, lambda = 2, the corresponding eigenvector is (s, s, s), and we can find the unit norm eigenvector as (1/sqrt(3)) times (1, 1, 1). Now we have unit norm eigenvectors for the corresponding eigenvalues, and we can finally do the diagonalization, because A is symmetric and has three distinct eigenvalues: lambda = 1, lambda = -1, and lambda = 2. Using these unit norm eigenvectors, we can form the orthogonal matrix P like this: place the first unit norm eigenvector in the first column of P, the second eigenvector in the second column, and the third eigenvector in the third column. Then for D, we just place the eigenvalues in the right positions. The first eigenvalue, lambda = 1, becomes the first diagonal element of D, so D is the diagonal matrix with entries 1, -1, 2. Please remember that you must place the corresponding eigenvector and eigenvalue in matching positions. In many cases, students place them randomly: they put the first eigenvector in the first column of P but a different eigenvalue in the first position, and so they build a wrong diagonal matrix D. By the theorem, what is P inverse? It is just P^T — we don't need to calculate P inverse. Actually, finding P inverse from P by row operations is very complicated; by using this theorem, we can skip that complicated process and obtain P inverse directly.
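The whole diagonalization above can be verified numerically (a sketch I am adding with NumPy): build P column by column from the unit norm eigenvectors, build D with the eigenvalues in matching order, and confirm both claims of the theorem.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# Unit norm eigenvectors, placed column by column in matching order
u1 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)   # lambda = 1
u2 = np.array([-1.0, -1.0, 2.0]) / np.sqrt(6)  # lambda = -1
u3 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)    # lambda = 2

P = np.column_stack([u1, u2, u3])
D = np.diag([1.0, -1.0, 2.0])   # eigenvalues in the same order as the columns of P

# Theorem: for this orthogonal P, the inverse is just the transpose
print("P^-1 == P^T :", np.allclose(np.linalg.inv(P), P.T))  # True
print("A == P D P^T:", np.allclose(A, P @ D @ P.T))         # True
```

If you shuffle the eigenvalues in D without shuffling the columns of P the same way — the mistake mentioned in the lecture — the second check fails.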
Up to here, we have studied some special cases for diagonalization. This is the end of the session. We are going to study diagonalization at a further level in Session 3. Thank you.