
Maths 2 Week 3

Understanding Vector Spaces in Linear Algebra


1. Basics: Set Theory and Boolean Operations

A set is a collection of distinct objects, represented by a symbol such as S. Common set operations include:

  • Union (A ∪ B)
  • Intersection (A ∩ B)
  • Complement (Aᶜ)

2. Groups

A group (G, ∗) is a set G with an operation ∗ that satisfies:

  • Closure: For all a, b ∈ G, a ∗ b ∈ G
  • Associativity: For all a, b, c ∈ G, (a ∗ b) ∗ c = a ∗ (b ∗ c)
  • Identity: There exists an element e ∈ G such that e ∗ a = a ∗ e = a for all a ∈ G
  • Inverse: For each a ∈ G, there exists b ∈ G such that a ∗ b = e
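The four group axioms can be checked by brute force on a small finite example. The sketch below (an illustration, not part of the syllabus) verifies them for the integers {0, …, 4} under addition modulo 5:

```python
from itertools import product

# Check the group axioms for (Z_5, +): integers {0,...,4} under addition mod 5.
G = range(5)
op = lambda a, b: (a + b) % 5

closure = all(op(a, b) in G for a, b in product(G, G))
assoc = all(op(op(a, b), c) == op(a, op(b, c)) for a, b, c in product(G, G, G))
e = 0  # candidate identity element
identity = all(op(e, a) == a and op(a, e) == a for a in G)
inverse = all(any(op(a, b) == e for b in G) for a in G)

print(closure, assoc, identity, inverse)  # True True True True
```

Exhaustive checking like this only works for finite groups; for infinite groups such as (ℤ, +), the axioms must be proved algebraically.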

3. Rings

A ring (R, +, ·) is a set equipped with two operations (addition and multiplication) satisfying:

  • Closure under addition and multiplication
  • Additive associativity
  • Multiplicative associativity
  • Additive identity (0)
  • Distributive properties of multiplication over addition

4. Fields and Their Properties

A field F is a set with two operations (addition and multiplication) that satisfies all ring properties and:

  • Multiplicative identity (1)
  • Multiplicative inverses: For each a ≠ 0 in F, there exists a⁻¹ ∈ F such that a · a⁻¹ = 1
  • Example fields: ℝ, ℚ, ℂ

Our syllabus starts here: with the concept of a *vector space*. While Groups, Rings, and Fields belong to abstract algebra, understanding Fields is essential for defining vector spaces. A vector space uses elements from a Field, allowing us to perform addition, subtraction, multiplication, and division (except by zero) with predictable behavior. By combining elements from a Field with operations like vector addition and scalar multiplication, we create the structure known as a *vector space*, which forms the foundation of linear algebra. This is where our exploration of linear transformations, solutions to linear equations, and higher-dimensional spaces begins.

👉5. Vector Spaces and Field Interaction

A vector space V over a field F consists of elements called vectors that can be added together and scaled by elements from F. The axioms for a vector space are listed below.

Vector Space Axioms

A vector space V over a field F must satisfy the following ten axioms:

  1. Closure under Addition: For all u, v ∈ V, the sum u + v is also in V.
  2. Closure under Scalar Multiplication: For all c ∈ F and v ∈ V, the product c·v is also in V.
  3. Associativity of Vector Addition: For all u, v, w ∈ V, (u + v) + w = u + (v + w).
  4. Commutativity of Vector Addition: For all u, v ∈ V, u + v = v + u.
  5. Existence of Additive Identity: There exists an element 0 ∈ V such that v + 0 = v for all v ∈ V.
  6. Existence of Additive Inverses: For each v ∈ V, there exists an element −v ∈ V such that v + (−v) = 0.
  7. Distributive Property of Scalar Multiplication with Respect to Vector Addition: For all c ∈ F and u, v ∈ V, c(u + v) = cu + cv.
  8. Distributive Property of Scalar Multiplication with Respect to Field Addition: For all c, d ∈ F and v ∈ V, (c + d)v = cv + dv.
  9. Associativity of Scalar Multiplication: For all a, b ∈ F and v ∈ V, (ab)v = a(bv).
  10. Multiplicative Identity: For all v ∈ V, 1·v = v, where 1 is the multiplicative identity in F.
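To make the axioms concrete, here is a minimal sketch that spot-checks several of them in ℝ² on a few sample vectors. This is an illustration only, not a proof: the axioms quantify over all vectors and scalars.

```python
# Spot-check vector space axioms in R^2 with componentwise operations.
def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(c, v):
    return (c * v[0], c * v[1])

u, v, w = (1, 2), (3, -1), (0, 5)
c, d = 2, -3

assert add(u, v) == add(v, u)                                # commutativity (4)
assert add(add(u, v), w) == add(u, add(v, w))                # associativity (3)
assert add(v, (0, 0)) == v                                   # additive identity (5)
assert add(v, scale(-1, v)) == (0, 0)                        # additive inverse (6)
assert scale(c, add(u, v)) == add(scale(c, u), scale(c, v))  # distributivity (7)
assert scale(c + d, v) == add(scale(c, v), scale(d, v))      # distributivity (8)
assert scale(c * d, v) == scale(c, scale(d, v))              # scalar associativity (9)
assert scale(1, v) == v                                      # multiplicative identity (10)
print("all sampled axioms hold")
```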

6. Examples of Vector Spaces: ℝ² and ℝ³

In ℝ² and ℝ³, vectors are ordered pairs or triples, respectively:

  • In ℝ²: v = (x, y)
  • In ℝ³: v = (x, y, z)

👉7. Linear Dependence and Independence

Vectors are linearly dependent if one can be written as a combination of others; they are independent otherwise.

Linearly Independent vs Dependent Vectors

1. All Scalars Zero (Trivial Solution)

Independent Vectors: If the only solution to the linear combination of vectors being equal to the zero vector is when all the scalars are zero, then the vectors are linearly independent.

Dependent Vectors: If there exists a non-trivial solution where some scalars (other than zero) result in the linear combination being the zero vector, the vectors are linearly dependent.

Example: Vectors v1 = (1, 2) and v2 = (2, 4) are dependent because v2 is a scalar multiple of v1 (i.e., v2 = 2 × v1).

2. At Least One Scalar is Non-Zero (Dependent)

Dependent Vectors: If there is a way to express one vector as a linear combination of others (where at least one scalar is non-zero), the vectors are linearly dependent.

Independent Vectors: If no vector in the set can be written as a linear combination of the others, all the scalars in the linear combination must be zero for the result to be the zero vector.

Example: Vectors v1 = (1, 2) and v2 = (2, 4) are dependent because v2 = 2 × v1.
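For two vectors in ℝ², "one is a scalar multiple of the other" can be tested without division: v₂ is a multiple of v₁ exactly when x₁y₂ − y₁x₂ = 0. A minimal sketch:

```python
# Two vectors in R^2 are dependent exactly when one is a scalar
# multiple of the other, i.e. when x1*y2 - y1*x2 == 0.
def dependent_2d(v1, v2):
    return v1[0] * v2[1] - v1[1] * v2[0] == 0

print(dependent_2d((1, 2), (2, 4)))   # True: v2 = 2 * v1
print(dependent_2d((1, 2), (2, 5)))   # False: not scalar multiples
```

The cross-term form also handles vectors with zero components, where naive ratio comparison would divide by zero.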

3. Matrix Rank is Less than the Number of Vectors (Rank Method)

Dependent Vectors: If you create a matrix using the given vectors as columns (or rows) and the rank of this matrix is less than the number of vectors, the vectors are linearly dependent.

Independent Vectors: If the rank of the matrix is equal to the number of vectors, the vectors are linearly independent.

Example: For vectors v1 = (1, 2) and v2 = (2, 4), the matrix formed by placing these vectors as columns is:

A = 
[1  2] 
[2  4]

The rank of the matrix is 1 (less than the number of vectors, which is 2), indicating the vectors are dependent.
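The rank method can be carried out mechanically with Gaussian elimination. The sketch below uses exact `Fraction` arithmetic to avoid floating-point pitfalls:

```python
from fractions import Fraction

# Rank of a matrix via Gaussian elimination (exact arithmetic).
def rank(matrix):
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    r = 0  # next pivot row
    for c in range(cols):
        # find a nonzero pivot in column c, at or below row r
        pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # eliminate entries below the pivot
        for i in range(r + 1, rows):
            factor = m[i][c] / m[r][c]
            m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 2],
     [2, 4]]    # columns are v1 = (1, 2) and v2 = (2, 4)
print(rank(A))  # 1, less than the 2 vectors -> dependent
```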

4. Determinant of the Matrix (For Square Matrices)

Dependent Vectors: If you form a square matrix from the vectors and the determinant of that matrix is zero, the vectors are linearly dependent.

Independent Vectors: If the determinant of the matrix is non-zero, the vectors are linearly independent.

Example: For vectors v1 = (1, 2) and v2 = (2, 4), the determinant of the matrix:

A = 
[1  2] 
[2  4]

is det(A) = (1 × 4) - (2 × 2) = 0, indicating the vectors are dependent.
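The 2×2 determinant computation is short enough to code directly:

```python
# Determinant test for two vectors in R^2 placed as matrix columns:
# | a  b |
# | c  d |  has determinant a*d - b*c.
def det2(a, b, c, d):
    return a * d - b * c

v1, v2 = (1, 2), (2, 4)
d = det2(v1[0], v2[0], v1[1], v2[1])
print(d)  # 0 -> the vectors are dependent
```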

5. Geometrical Interpretation

Independent Vectors: If the vectors span a space that has the same dimension as the number of vectors, they are linearly independent. In a 2D plane, two independent vectors will form a non-zero area of the parallelogram they define.

Dependent Vectors: If the vectors lie along the same line or in the same plane (when in 3D), they are dependent. In this case, no new dimensions are added by the vectors.

Example: Two vectors in 2D that are not parallel (not scalar multiples) are independent. Two vectors in 3D that lie on the same plane or line are dependent.
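The 3D geometric test can also be computed: three vectors u, v, w are coplanar (dependent) exactly when the scalar triple product u · (v × w), which equals the volume of the parallelepiped they span, is zero. A minimal sketch:

```python
# Scalar triple product u . (v x w): zero volume <=> coplanar <=> dependent.
def cross(v, w):
    return (v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0])

def triple(u, v, w):
    c = cross(v, w)
    return u[0] * c[0] + u[1] * c[1] + u[2] * c[2]

print(triple((1, 0, 0), (0, 1, 0), (0, 0, 1)))  # 1 -> independent
print(triple((1, 2, 3), (2, 4, 6), (0, 0, 1)))  # 0 -> coplanar, dependent
```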

👉Subspaces and Their Axioms

A subspace is a subset of a vector space that is itself a vector space, with the same addition and scalar multiplication operations. For a subset W ⊆ V to be a subspace of the vector space V, it must satisfy the following conditions:

Subspace Axioms

  1. Non-emptiness: W must contain the zero vector 0 of V. This ensures that W is non-empty and includes an additive identity.
  2. Closure under Addition: For any vectors u, v ∈ W, their sum u + v must also be in W. This ensures that vector addition remains within the subset.
  3. Closure under Scalar Multiplication: For any vector v ∈ W and any scalar c from the field F (associated with V), the product c·v must also be in W. This ensures that scalar multiplication keeps the vector in the subset.

Why These Axioms?

If a subset W meets these three conditions, it inherits all the other properties of a vector space automatically. These three conditions make sure that W is closed under the vector space operations, which allows it to be a valid subspace.

Example of a Subspace

In ℝ³, the space of all 3-dimensional vectors, the set of all vectors lying on a plane through the origin is a subspace of ℝ³. This set meets all three subspace conditions: it includes the zero vector, is closed under addition, and is closed under scalar multiplication.
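As a concrete instance, take the plane x + y + z = 0 through the origin in ℝ³ (a specific plane chosen here for illustration). The sketch below spot-checks the three subspace conditions on sample vectors:

```python
# The plane x + y + z = 0 through the origin in R^3.
def on_plane(v):
    return v[0] + v[1] + v[2] == 0

u, v = (1, -1, 0), (2, 0, -2)            # both lie on the plane
zero = (0, 0, 0)
s = tuple(a + b for a, b in zip(u, v))   # u + v
t = tuple(5 * a for a in u)              # scalar multiple 5*u

print(on_plane(zero), on_plane(s), on_plane(t))  # True True True
```

A plane that does not pass through the origin fails condition 1 (it misses the zero vector) and is therefore not a subspace.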

2D Vector Space with a Subspace

In this 2D vector space, the red line through the origin represents a 1D subspace.

