0 votes
by (120 points)
What are the sub-topics under the broad category of Linear Algebra that are relevant to learning AI and ML?

1 Answer

0 votes
by (180 points)
  • Scalars, Vectors, Matrices, and Tensors

  • Vector Norms 

  • Matrix Norms (Frobenius Norm)

  • Dot Product and Inner Products

  • Hadamard Product (Element-wise multiplication)

  • Matrix Multiplication

  • Matrix Transpose and Trace

  • Identity and Diagonal Matrices

  • Symmetric and Orthogonal Matrices

  • Matrix Inversion and Moore-Penrose Pseudoinverse

  • Determinants

  • Linear Independence and Dependence

  • Span and Basis

  • Vector Subspaces (Null Space, Column Space, Row Space)

  • Linear Transformations and Mappings

  • Affine Transformations

  • Matrix Rank

  • Orthogonal Projections

  • Gram-Schmidt Process

  • Systems of Linear Equations (Gaussian Elimination)

  • Eigenvalues and Eigenvectors

  • Eigendecomposition

  • Singular Value Decomposition (SVD)

  • LU Decomposition

  • QR Factorization

  • Cholesky Decomposition

  • Matrix Approximation and Low-Rank Approximation

  • Positive Definiteness and Semi-definiteness

  • Quadratic Forms

  • The Jacobian Matrix

  • The Hessian Matrix

  • Cosine Similarity

  • Change of Basis
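
Many of the topics above map directly onto standard NumPy routines. The following is a minimal sketch (assuming NumPy is installed; the matrix `A` and vectors `v`, `w` are just illustrative values) showing how a few of them look in practice:

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])  # symmetric positive definite matrix
v = np.array([1.0, 2.0])
w = np.array([2.0, 1.0])

# Vector norm and dot (inner) product
l2 = np.linalg.norm(v)          # Euclidean (L2) norm
dp = v @ w                      # dot product

# Cosine similarity between v and w
cos_sim = dp / (np.linalg.norm(v) * np.linalg.norm(w))

# Matrix rank, determinant, and trace
rank = np.linalg.matrix_rank(A)
det = np.linalg.det(A)
tr = np.trace(A)

# Eigendecomposition (eigh is appropriate since A is symmetric)
eigvals, eigvecs = np.linalg.eigh(A)

# Singular Value Decomposition: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A)

# Moore-Penrose pseudoinverse and Cholesky factor (A = L @ L.T)
A_pinv = np.linalg.pinv(A)
L = np.linalg.cholesky(A)

# Rank-1 low-rank approximation built from the largest singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
```

Working through small examples like this alongside the theory (e.g., checking by hand that `L @ L.T` reproduces `A`) is a good way to make the abstract definitions concrete.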

Welcome to GateXAIML - GATE DA | GATE CSE | Doubts and Discussions, where you can ask questions and receive answers from other members of the community.