Determining linear independence of vectors in Rn

Let v1, v2, ..., vk be k vectors in Rn

Case I: k > n (number of vectors > n)

To determine linear independence we need to solve the homogeneous system

c1 v1 + c2 v2 + .... + ck vk = 0

or Ac = 0

where A = [ v1 v2 .... vk ] and c = (c1, c2, ..., ck)^T

Remark: this is equivalent to solving a system with more unknowns (k) than equations (n)

So we need to compute the rref of the matrix A (or the rref of the augmented matrix [A | 0])

In doing so, the number of non-zero rows of the reduced matrix (the rank of A) will be at most n, which is less than k. Thus we will have at least k - n free variables ==> infinitely many solutions ==> the system does not have only the trivial solution ==> the vectors are linearly dependent.
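
As a quick illustration of this procedure (the four vectors below are an arbitrary choice of my own, not taken from these notes), SymPy can compute the rref and the rank for k = 4 vectors in R3:

from sympy import Matrix

# Four assumed vectors in R3, so k = 4 > n = 3.
v1, v2, v3, v4 = Matrix([1, 0, 2]), Matrix([0, 1, 1]), Matrix([1, 1, 3]), Matrix([2, 3, 1])
A = Matrix.hstack(v1, v2, v3, v4)                # A is 3 x 4: more unknowns (k) than equations (n)

R, pivots = A.rref()                             # reduced row echelon form and its pivot columns
print(R)
print("rank =", len(pivots))                     # at most 3, hence less than k = 4
print("free variables =", A.cols - len(pivots))  # at least k - n = 1 ==> nontrivial solutions exist
# Conclusion: Ac = 0 has nontrivial solutions, so v1, v2, v3, v4 are linearly dependent.

Because the rank can never exceed n = 3, at least one free variable is guaranteed, which is exactly the argument above.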

Theorem
Any set of more than n vectors in Rn is linearly dependent.


Theorem
Let v1, v2, ..., vk be k vectors in Rn. If the rank of the matrix A = [v1 v2 .... vk] is less than k, then the vectors are linearly dependent.
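
The rank test also covers the case k <= n. A minimal sketch, again with vectors of my own choosing, in which the third vector is deliberately built as a combination of the first two:

from sympy import Matrix

v1, v2 = Matrix([1, 0, 1, 2]), Matrix([0, 1, 1, 0])
v3 = 2*v1 - v2                     # deliberately a linear combination of v1 and v2
A = Matrix.hstack(v1, v2, v3)      # A is 4 x 3, so k = 3 <= n = 4

print("rank =", A.rank())          # prints 2, which is less than k = 3 ==> linearly dependent
print(A.nullspace())               # basis of the solutions of Ac = 0: a multiple of (-2, 1, 1)^T, i.e. -2*v1 + v2 + v3 = 0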

Case II: k = n (number of vectors = n)

Thus we need to solve the system Ac = 0, where A is now a square n x n matrix.

Such a system has only the trivial solution if and only if A is invertible, that is, |A| ≠ 0. If A is not invertible (|A| = 0), then Ac = 0 has nontrivial solutions ==> the vectors are linearly dependent.


Theorem
n vectors in Rn are linearly independent if and only if the matrix A = [v1 v2 .... vn] is invertible, that is, |A| ≠ 0.


Example

Determine if the vectors ... are linearly dependent or independent.
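
The vectors of this Example are not reproduced here, so the sketch below uses three assumed vectors in R3 to show how the determinant test of Case II is applied:

from sympy import Matrix

v1, v2, v3 = Matrix([1, 2, 3]), Matrix([0, 1, 4]), Matrix([5, 6, 0])
A = Matrix.hstack(v1, v2, v3)      # square 3 x 3 matrix since k = n = 3

d = A.det()
print("|A| =", d)                  # prints 1
print("linearly independent" if d != 0 else "linearly dependent")

A nonzero determinant means A is invertible, so Ac = 0 has only the trivial solution and the three vectors are linearly independent.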

4.3 Unique representation of vectors by a spanning set.

Recall the definition of linear independence:

The vectors v1, v2, ..., vk are linearly independent if and only if the equation

c1 v1 + c2 v2 + .... + ck vk = 0 has only the trivial solution,

that is, c1 = c2 = ..... = ck = 0.

Recall also the definition of the span:

Span{v1, v2, ..., vk } = set of all linear combinations of the vectors v1, v2 , ..., vk

If w is an element of Span{v1, v2, ..., vk}, then

w = a1 v1 + a2 v2 + .... + ak vk for some scalars a1, ..., ak    (1)


Question: Is this linear combination unique?

The answer depends on whether the vectors v1, v2 , ..., vk are linearly independent or linearly dependent.

a) Assume that the vectors v1 , v2, ..., vk are linearly independent

Suppose that w can be written as a different linear combination of the vectors v1, v2, ..., vk,

that is, w = b1 v1 + b2 v2 + .... + bk vk    (2)

(1) and (2) ==> a1 v1 + a2 v2 + .... + ak vk = b1 v1 + b2 v2 + .........+ bk vk

or (a1 - b1) v1 + (a2 - b2) v2 + ......+ (ak - bk) vk = 0 (3)

But the assumption that the vectors v1, v2 , ..., vk are linearly independent ==> equation (3) has only the trivial solution. That is,

a1 - b1 = 0, a2 - b2 = 0, ..., ak - bk = 0

==> a1 = b1, a2 = b2, ..., ak = bk

Conclusion: If the vectors v1 , v2, ..., vk are linearly independent then there is one and only one way to write a vector w as a linear combination of these vectors.

In other words, every vector in Span{v1, v2, ..., vk} can be written in exactly one way as a linear combination of the vectors v1, v2, ..., vk.
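
As a small numerical check of this conclusion (the vectors and the choice of w below are my own, not from the notes), solving a1 v1 + a2 v2 = w with SymPy returns exactly one pair of coefficients when v1 and v2 are linearly independent:

from sympy import Matrix, symbols, linsolve

v1, v2 = Matrix([1, 0, 1]), Matrix([1, 1, 0])    # linearly independent vectors in R3
w = 2*v1 + 3*v2                                  # w is in Span{v1, v2} by construction

a1, a2 = symbols("a1 a2")
A = Matrix.hstack(v1, v2)
print(linsolve((A, w), [a1, a2]))                # {(2, 3)} -- the only representation of w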

b) Assume that the vectors v1, v2, ..., vk are linearly dependent

==> equation (3) has nontrivial solutions, that is, it can hold with not all of the coefficients ai - bi equal to zero. Say ai - bi ≠ 0 for some i ==> ai ≠ bi ==> (2) gives a different linear combination that still equals w ==> the representation of w is not unique.
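
A matching sketch for the dependent case, again with vectors of my own choosing: because v3 = v1 + v2, the same vector w admits more than one set of coefficients.

from sympy import Matrix

v1, v2 = Matrix([1, 0]), Matrix([0, 1])
v3 = v1 + v2                       # v1, v2, v3 are linearly dependent in R2

w     = 2*v1 + 3*v2 + 0*v3         # one representation of w
w_alt = 1*v1 + 2*v2 + 1*v3         # different coefficients ...
print(w == w_alt)                  # ... but the same vector: prints True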