Spanning Sets and Linear Independence
Consider a general vector space V over a field 𝔽. Since we have two operations at our disposal, vector addition and scalar multiplication, the most general algebraic expression we can form is of the form

Ξ±1v1 + Ξ±2v2 + ... + Ξ±kvk,    (1.3.1)

where v1, ..., vk ∈ V and Ξ±1, ..., Ξ±k ∈ 𝔽 are k vectors and scalars, respectively. An expression of the form (1.3.1) is called a linear combination of the vectors v1, ..., vk. The set of all linear combinations of given vectors v1, ..., vk is called the span of these vectors, and it is written as

Span(v1, ..., vk) = {Ξ±1v1 + ... + Ξ±kvk | Ξ±1, ..., Ξ±k ∈ 𝔽}.
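For instance, taking k = 2 with v1 = (1, 0) and v2 = (0, 1) in ℝ2, one linear combination of the form (1.3.1) is

2v1 + 3v2 = 2(1, 0) + 3(0, 1) = (2, 3),

so (2, 3) ∈ Span(v1, v2). In fact every (x, y) ∈ ℝ2 equals x v1 + y v2 and hence lies in this span.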
Proposition 1.3.1 For any vectors v1, ..., vk ∈ V, the span Span(v1, ..., vk) is a vector subspace of V.

Proof All we need to do is verify that the span satisfies the two defining conditions of a vector subspace: closure under addition and closure under scalar multiplication. Consider two vectors u, w ∈ Span(v1, ..., vk). By definition of the span, they can be written as linear combinations

u = Ξ±1v1 + ... + Ξ±kvk,    w = Ξ²1v1 + ... + Ξ²kvk,

for suitable scalars Ξ±i, Ξ²i ∈ 𝔽. Their sum and the scalar multiple of u by Ξ± ∈ 𝔽 are then given by

u + w = (Ξ±1 + Ξ²1)v1 + ... + (Ξ±k + Ξ²k)vk,    Ξ±u = (Ξ±Ξ±1)v1 + ... + (Ξ±Ξ±k)vk,

and are, hence, both contained in the span. This shows that the two subspace conditions are indeed satisfied. β–‘
This result means that the span provides us with a way of generating vector subspaces. The span has a straightforward geometric interpretation, at least for coordinate vectors with real entries. The span of a single vector v ∈ ℝn consists of all scalar multiples of this vector and, hence, can be thought of as the line through 0 which contains v. The span of two vectors u, v ∈ ℝn (which are not multiples of each other) represents the plane through 0 which contains both vectors. More generally, spans of column vectors are lines, planes and their higher-dimensional analogues through the "origin" 0. We will be more precise about this later but for now just present an example.
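Concretely, for a single nonzero vector v ∈ ℝ3 we have Span(v) = {tv | t ∈ ℝ}, the line through 0 in the direction of v, and for two vectors u, v ∈ ℝ3 that are not multiples of each other, Span(u, v) = {su + tv | s, t ∈ ℝ} is the plane through 0 containing both.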
Definition 1.3.3 If every element of a vector space V over a field 𝔽 is a linear combination of v1, ..., vn, we say that the set {v1, ..., vn} spans V over 𝔽 (or generates V). β–‘
Note that a spanning set S may be finite or infinite, but any given linear combination of vectors from S involves only finitely many vectors.
Example 1.3.4 Let A be the subset of ℝ3 given by A = {(1,0,0), (0,1,0)}. The span of A is the xy-plane: every vector (x, y, 0) can be written as x(1,0,0) + y(0,1,0), while every linear combination Ξ±(1,0,0) + Ξ²(0,1,0) = (Ξ±, Ξ², 0) has third entry zero. β–‘
Example 1.3.5 Let P be the vector space of polynomials of arbitrary degree. Then no finite set of polynomials spans P. Indeed, suppose p1, p2, ..., pn are polynomials in P. Let pk be a polynomial of largest degree in this set and let m = deg pk. Then the polynomial p(x) = x^(m+1) cannot be written as a linear combination of p1, p2, ..., pn, since any such combination has degree at most m. β–‘
Linear Independence
Definition 1.3.4 Let V be a vector space. A nonempty set S of vectors in V is linearly independent if, for any distinct vectors v1, v2, ..., vn in S and scalars Ξ»i,

Ξ»1v1 + Ξ»2v2 + ... + Ξ»nvn = 0 β‡’ Ξ»i = 0 βˆ€i.
That is, S is linearly independent if the only linear combination of vectors from S that is equal to 0 is the trivial linear combination, all of whose coefficients are 0. If S is not linearly independent, it is said to be linearly dependent.
It is clear that a linearly independent set of vectors cannot contain the zero vector, since then 1 Β· 0 = 0 is a nontrivial linear combination equal to 0, violating the condition of linear independence.
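As a simple worked check of Definition 1.3.4, consider v1 = (1, 0) and v2 = (0, 1) in ℝ2. If Ξ»1v1 + Ξ»2v2 = (Ξ»1, Ξ»2) = (0, 0), then necessarily Ξ»1 = Ξ»2 = 0, so {v1, v2} is linearly independent. By contrast, the set {(1, 1), (2, 2)} is linearly dependent, since 2(1, 1) - (2, 2) = (0, 0) is a nontrivial linear combination equal to 0.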
Let V be a vector space over a field 𝔽. In general, linear independence and spanning do not imply each other; a subset of V may have one, both, or neither of these properties. A subset that has both properties is given a special name.
Definition 1.3.6 A subset {v1, ..., vn} of a vector space V over a field 𝔽 is said to be a basis of V if it spans V and is linearly independent over 𝔽. β–‘
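For example, the standard coordinate vectors e1 = (1, 0, ..., 0), ..., en = (0, ..., 0, 1) form a basis of ℝn. For coordinate vectors, both conditions of Definition 1.3.6 can also be checked numerically: n vectors form a basis of ℝn exactly when the n Γ— n matrix having them as columns has rank n. The following Python sketch (using NumPy; the function name is ours and not part of the text) illustrates this check.

import numpy as np

def is_basis_of_Rn(vectors):
    # Stack the given vectors as the columns of a matrix A.
    A = np.column_stack(vectors)
    n, k = A.shape
    # A basis of R^n must contain exactly n vectors, and they must be
    # linearly independent, i.e. the matrix must have full rank n.
    return k == n and np.linalg.matrix_rank(A) == n

# The standard basis of R^3 passes the test:
print(is_basis_of_Rn([np.array([1., 0., 0.]),
                      np.array([0., 1., 0.]),
                      np.array([0., 0., 1.])]))   # True

# A linearly dependent set fails, since its matrix has rank 2 < 3:
print(is_basis_of_Rn([np.array([1., 2., 3.]),
                      np.array([2., 4., 6.]),
                      np.array([0., 0., 1.])]))   # False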
When two vectors u and v are linearly independent, they are not scalar multiples of each other; that is, they have different directions. Arbitrary linearly independent vectors u and v in ℝ2 can be illustrated as shown in Fig. 1.

Fig. 1: Linearly independent vectors u and v have different directions.