Introduction
In this section, we explore the concept of a linear transformation: its definition and its fundamental properties. Linear transformations are crucial in fields such as computer graphics, physics, and engineering because they provide a mathematical framework for manipulating vectors and points in space.
What is a Linear Transformation?
A linear transformation is a function \( T: V \rightarrow W \) between two vector spaces \( V \) and \( W \) that preserves the operations of vector addition and scalar multiplication. Formally, a function \( T \) is a linear transformation if for all vectors \( \mathbf{u}, \mathbf{v} \in V \) and all scalars \( c \in \mathbb{R} \):
- Additivity (or Superposition): \[ T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v}) \]
- Homogeneity (or Scalar Multiplication): \[ T(c\mathbf{u}) = cT(\mathbf{u}) \]
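As a quick numerical sanity check, the sketch below (a minimal NumPy example; the matrix \( A \) and the vectors are arbitrary choices) verifies both conditions for the map \( T(\mathbf{x}) = A\mathbf{x} \):

```python
import numpy as np

# An arbitrary 2x2 matrix; T(x) = A @ x is a linear transformation
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

def T(x):
    return A @ x

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 4.0

# Additivity: T(u + v) == T(u) + T(v)
print(np.allclose(T(u + v), T(u) + T(v)))   # True

# Homogeneity: T(c * u) == c * T(u)
print(np.allclose(T(c * u), c * T(u)))      # True
```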
Examples of Linear Transformations
Example 1: Scaling
Scaling is a linear transformation that enlarges or shrinks vectors by a scalar factor. For a vector \( \mathbf{v} = \begin{bmatrix} x \\ y \end{bmatrix} \) and a scalar \( k \), the scaling transformation \( T \) is defined as:
\[ T(\mathbf{v}) = k \mathbf{v} = \begin{bmatrix} kx \\ ky \end{bmatrix} \]
Example 2: Rotation
Rotation is a linear transformation that rotates vectors around the origin by a certain angle \( \theta \). For a vector \( \mathbf{v} = \begin{bmatrix} x \\ y \end{bmatrix} \), the rotation transformation \( T \) is defined as:
\[ T(\mathbf{v}) = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} \]
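For instance, a rotation by \( 90^\circ \) sends \( (1, 0) \) to \( (0, 1) \). The sketch below (a minimal NumPy example; the helper `rotation_matrix` is introduced here only for illustration) applies the rotation matrix directly:

```python
import numpy as np

def rotation_matrix(theta):
    """2x2 matrix that rotates vectors counterclockwise by theta radians."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
R = rotation_matrix(np.pi / 2)   # 90-degree rotation

print(R @ v)   # approximately [0, 1]
```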
Example 3: Reflection
Reflection is a linear transformation that flips vectors over a specified axis. For instance, reflecting over the x-axis for a vector \( \mathbf{v} = \begin{bmatrix} x \\ y \end{bmatrix} \) is defined as:
\[ T(\mathbf{v}) = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ -y \end{bmatrix} \]
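A minimal NumPy sketch of this reflection (the vector is an arbitrary example):

```python
import numpy as np

# Reflection over the x-axis: negates the y-component
reflect_x = np.array([[1,  0],
                      [0, -1]])

v = np.array([2, 3])
print(reflect_x @ v)   # [ 2 -3]
```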
Properties of Linear Transformations
Property 1: Linearity
As defined, a linear transformation must satisfy both additivity and homogeneity. This ensures that the transformation respects the structure of the vector space.
Property 2: Matrix Representation
Every linear transformation can be represented by a matrix. If \( T: \mathbb{R}^n \rightarrow \mathbb{R}^m \) is a linear transformation, there exists an \( m \times n \) matrix \( A \) such that for any vector \( \mathbf{x} \in \mathbb{R}^n \):
\[ T(\mathbf{x}) = A\mathbf{x} \]
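Concretely, the columns of \( A \) are the images of the standard basis vectors: \( A = \begin{bmatrix} T(\mathbf{e}_1) & \cdots & T(\mathbf{e}_n) \end{bmatrix} \). The sketch below (a minimal NumPy example; the map `T` is an arbitrary linear map chosen for illustration) builds \( A \) this way and checks that \( A\mathbf{x} = T(\mathbf{x}) \):

```python
import numpy as np

# An arbitrary linear map from R^2 to R^2, written without a matrix
def T(x):
    return np.array([3 * x[0] + x[1], 2 * x[1]])

# Build its matrix column by column: column i is T(e_i)
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])

x = np.array([4.0, -1.0])
print(np.allclose(A @ x, T(x)))   # True: A x reproduces T(x)
```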
Property 3: Composition
The composition of two linear transformations is also a linear transformation. If \( T_1: V \rightarrow W \) and \( T_2: W \rightarrow U \) are linear transformations, then the composition \( T_2 \circ T_1: V \rightarrow U \) is a linear transformation.
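In matrix form, composition corresponds to matrix multiplication: if \( T_1(\mathbf{x}) = A\mathbf{x} \) and \( T_2(\mathbf{y}) = B\mathbf{y} \), then \( (T_2 \circ T_1)(\mathbf{x}) = BA\,\mathbf{x} \). A short NumPy sketch (the matrices are arbitrary examples, here a scaling followed by a rotation):

```python
import numpy as np

A = np.array([[2, 0],
              [0, 2]])            # T1: scale by 2
B = np.array([[0, -1],
              [1,  0]])           # T2: rotate by 90 degrees

x = np.array([1, 2])

# Applying T1 then T2 is the same as applying the single matrix B @ A
print(B @ (A @ x))                # step by step
print((B @ A) @ x)                # composed matrix, same result
```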
Property 4: Invertibility
A linear transformation \( T: V \rightarrow W \) is invertible if there exists a linear transformation \( T^{-1}: W \rightarrow V \) such that:
\[ T^{-1}(T(\mathbf{v})) = \mathbf{v} \quad \text{for all} \quad \mathbf{v} \in V \]
and
\[ T(T^{-1}(\mathbf{w})) = \mathbf{w} \quad \text{for all} \quad \mathbf{w} \in W \]
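In matrix terms, \( T \) is invertible exactly when its matrix \( A \) is invertible, and \( T^{-1} \) is represented by \( A^{-1} \). A minimal NumPy check (the scaling matrix is just an example):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 2.0]])        # scale by 2
A_inv = np.linalg.inv(A)          # its inverse: scale by 1/2

v = np.array([2.0, 3.0])

# Applying A and then A_inv recovers the original vector
print(np.allclose(A_inv @ (A @ v), v))   # True
```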
Practical Example
Let's consider a practical example in Python to illustrate a linear transformation using matrix representation.
```python
import numpy as np

# Define a vector
vector = np.array([2, 3])

# Define a transformation matrix (e.g., a scaling matrix)
transformation_matrix = np.array([
    [2, 0],
    [0, 2]
])

# Apply the linear transformation
transformed_vector = np.dot(transformation_matrix, vector)

print("Original Vector:", vector)
print("Transformed Vector:", transformed_vector)
```
Explanation
- We define a vector `vector` as \([2, 3]\).
- We define a transformation matrix `transformation_matrix` as a scaling matrix that scales by a factor of 2.
- We apply the transformation using the dot product (`np.dot`) to get the transformed vector.
Output
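Running the script prints:

```
Original Vector: [2 3]
Transformed Vector: [4 6]
```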
Exercises
Exercise 1
Given the vector \( \mathbf{v} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} \) and the transformation matrix \( A = \begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix} \), apply the linear transformation and find the transformed vector.
Solution
```python
import numpy as np

vector = np.array([1, 2])
transformation_matrix = np.array([
    [3, 0],
    [0, 3]
])

transformed_vector = np.dot(transformation_matrix, vector)
print("Transformed Vector:", transformed_vector)
```
Output
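Running the solution prints:

```
Transformed Vector: [3 6]
```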
Exercise 2
Prove that the rotation matrix \( R(\theta) = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \) is a linear transformation.
Solution
To prove that \( R(\theta) \) is a linear transformation, we need to show that it satisfies additivity and homogeneity.
- Additivity: Let \( \mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \) and \( \mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} \).
\[ R(\theta)(\mathbf{u} + \mathbf{v}) = R(\theta) \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \end{bmatrix} = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \begin{bmatrix} u_1 + v_1 \\ u_2 + v_2 \end{bmatrix} \]
\[ = \begin{bmatrix} \cos \theta (u_1 + v_1) - \sin \theta (u_2 + v_2) \\ \sin \theta (u_1 + v_1) + \cos \theta (u_2 + v_2) \end{bmatrix} \]
\[ = \begin{bmatrix} \cos \theta u_1 - \sin \theta u_2 \\ \sin \theta u_1 + \cos \theta u_2 \end{bmatrix} + \begin{bmatrix} \cos \theta v_1 - \sin \theta v_2 \\ \sin \theta v_1 + \cos \theta v_2 \end{bmatrix} \]
\[ = R(\theta)(\mathbf{u}) + R(\theta)(\mathbf{v}) \]
- Homogeneity: Let \( c \) be a scalar and \( \mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \end{bmatrix} \).
\[ R(\theta)(c\mathbf{u}) = R(\theta) \begin{bmatrix} cu_1 \\ cu_2 \end{bmatrix} = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \begin{bmatrix} cu_1 \\ cu_2 \end{bmatrix} \]
\[ = \begin{bmatrix} c(\cos \theta u_1 - \sin \theta u_2) \\ c(\sin \theta u_1 + \cos \theta u_2) \end{bmatrix} \]
\[ = c \begin{bmatrix} \cos \theta u_1 - \sin \theta u_2 \\ \sin \theta u_1 + \cos \theta u_2 \end{bmatrix} \]
\[ = c R(\theta)(\mathbf{u}) \]
Thus, \( R(\theta) \) satisfies both additivity and homogeneity, proving it is a linear transformation.
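As a numerical sanity check of the same identities (a small NumPy sketch; the angle and vectors are arbitrary choices), both properties hold up to floating-point error:

```python
import numpy as np

theta = 0.7                       # arbitrary angle in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = np.array([1.0, 2.0])
v = np.array([-0.5, 3.0])
c = 2.5

print(np.allclose(R @ (u + v), R @ u + R @ v))   # additivity
print(np.allclose(R @ (c * u), c * (R @ u)))     # homogeneity
```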
Conclusion
In this section, we have defined linear transformations and explored their fundamental properties. We have seen how linear transformations can be represented using matrices and how they preserve vector space operations. Understanding these concepts is crucial for manipulating vectors and points in 3D space, which we will delve into further in the upcoming sections.