39 Cards in this Set
vector space |
a nonempty set V of objects, called vectors, on which are defined two operations, called addition and multiplication by scalars (real numbers), subject to the ten axioms |
|
ten axioms of a vector space (must hold for all vectors u, v, and w in V for all scalars c and d) |
(1) the sum of u and v, denoted by u+v, is in V (2) u+v=v+u (3) (u+v)+w=u+(v+w) (4) There is a zero vector 0 in V such that u+0=u (5) For each u in V, there is a vector -u in V such that u+(-u)=0 (6) The scalar multiple of u by c, denoted by cu, is in V (7) c(u+v)=cu+cv (8) (c+d)u=cu+du (9) c(du)=(cd)u (10) 1u=u |
|
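The axioms above can be spot-checked numerically. The following is a minimal sketch, assuming Python with NumPy; it samples a few vectors in R3 and verifies several axioms for them, which illustrates the statements but is not a proof (a proof must hold for all vectors and scalars):

```python
import numpy as np

# Sample vectors and scalars in R^3; an illustration only, not a proof.
u = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 4.0, -1.0])
w = np.array([2.0, 2.0, 2.0])
c, d = 3.0, -2.0

assert np.allclose(u + v, v + u)                # axiom 2: commutativity
assert np.allclose((u + v) + w, u + (v + w))    # axiom 3: associativity
assert np.allclose(u + np.zeros(3), u)          # axiom 4: zero vector
assert np.allclose(u + (-u), np.zeros(3))       # axiom 5: additive inverse
assert np.allclose(c * (u + v), c * u + c * v)  # axiom 7
assert np.allclose((c + d) * u, c * u + d * u)  # axiom 8
assert np.allclose(c * (d * u), (c * d) * u)    # axiom 9
assert np.allclose(1 * u, u)                    # axiom 10
```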
What is the vector -u? |
the negative of the vector u that is unique for each u in V |
|
For each vector u in V and scalar c... |
(1) 0u=0 (2) c0=0 (3) -u=(-1)u |
|
subspace of a vector space V (and its three properties) |
a subset H of V that has three properties: (1) the zero vector of V is in H (2) H is closed under vector addition (for each u and v in H, the sum u+v is in H) (3) H is closed under multiplication by scalars (for each u in H and each scalar c, the vector cu is in H) |
|
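As a sketch of the three subspace properties, consider H = {x in R3 : x1 + x2 + x3 = 0}, a plane through the origin (the set H and the helper `in_H` are illustrative choices, not from the cards). Sampled vectors confirm all three properties:

```python
import numpy as np

# H = plane x1 + x2 + x3 = 0 through the origin in R^3 (illustrative example).
def in_H(x):
    return np.isclose(x.sum(), 0.0)

u = np.array([1.0, -1.0, 0.0])
v = np.array([2.0, 3.0, -5.0])

assert in_H(np.zeros(3))    # (1) the zero vector is in H
assert in_H(u) and in_H(v)
assert in_H(u + v)          # (2) closed under addition (sampled)
assert in_H(7.5 * u)        # (3) closed under scalar multiples (sampled)
```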
True or False: Every subspace is a vector space? |
True. Every subspace is a vector space |
|
zero subspace |
set consisting of only the zero vector in a vector space V; is a subspace; written as {0} |
|
True or False: A plane in R3 not through the origin is not a subspace of R3 |
True, because it does not contain the zero vector of R3 |
|
linear combination |
refers to any sum of scalar multiples of vectors, and Span{v1,..,vp} denotes the set of all vectors that can be written as linear combinations of v1,..,vp |
|
True or False: If v1,...,vp are in a vector space V, then Span{v1,...vp} is not a subspace of V |
False. It is a subspace. |
|
What is Span{v1,...vp}? |
the subspace spanned (or generated) by {v1,...,vp}
|
|
spanning (or generating) set for H |
a set {v1,...,vp} in H such that H=Span{v1,...,vp} |
|
solution set |
the set of all x that satisfy the system of equations |
|
null space (of an mxn matrix A) |
written as Nul A, it is the set of all solutions of the homogeneous equation Ax=0
Nul A={x: x is in Rn and Ax=0} |
|
Is the null space of a matrix a vector space? |
Yes. The null space of an mxn matrix A is a subspace of Rn. Equivalently, the set of all solutions to a system Ax=0 of m homogeneous linear equations in n unknowns is a subspace of Rn |
|
What is the first step of finding a spanning set for the null space of matrix? |
The first step is to find the general solution of Ax=0 in terms of free variables |
|
What are two things that apply to all problems where Nul A contains nonzero vectors? |
(1) the spanning set produced is automatically linearly independent (because the free variables are the weights on the spanning vectors) (2) when Nul A contains nonzero vectors, the number of vectors in the spanning set for Nul A equals the number of free variables in the equations Ax=0 |
|
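The row-reduction procedure described above can be sketched with SymPy, whose `Matrix.nullspace()` solves Ax=0 in terms of the free variables and returns one spanning vector per free variable (the matrix A below is an arbitrary example):

```python
from sympy import Matrix

# Example matrix with pivots in columns 1 and 3, free variables x2 and x4.
A = Matrix([[1, 2, 0, -1],
            [2, 4, 1,  1]])

basis = A.nullspace()   # one spanning vector per free variable
assert len(basis) == 2  # two free variables, so two spanning vectors

# Each spanning vector v actually lies in Nul A, i.e. A v = 0:
for v in basis:
    assert A * v == Matrix([0, 0])
```

Note that, as the cards state, this spanning set is automatically linearly independent because each free variable is the weight on exactly one of the spanning vectors.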
column space (of an mxn matrix A) |
written as Col A, it is the set of all linear combinations of the columns of A
If A=[a1 ... an], then Col A=Span{a1,...,an} and Col A={b: b=Ax for some x in Rn} |
|
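Since Col A is exactly the set of b for which Ax=b is consistent, membership can be tested by comparing rank(A) with rank([A b]). A minimal sketch, assuming NumPy (the helper `in_col_space` and the example matrix are illustrative, not from the cards):

```python
import numpy as np

# b is in Col A exactly when Ax = b is consistent,
# i.e. when rank([A | b]) == rank(A).
def in_col_space(A, b):
    augmented = np.column_stack([A, b])
    return np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(A)

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1: the columns are multiples

assert in_col_space(A, np.array([2.0, 4.0, 6.0]))      # 2 * first column
assert not in_col_space(A, np.array([1.0, 0.0, 0.0]))  # Ax = b inconsistent
```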
True or False: column space of an mxn matrix A is a subspace of Rm |
True. The column space is a subspace |
|
True or False: Col A is the co-domain of the linear transformation x-->Ax |
False. Col A is the range of the linear transformation
The column space of an mxn matrix A is all of Rm if and only if the equation Ax=b has a solution for each b in Rm |
|
True or False: When a matrix is not square, the vectors in Nul A and Col A live in the same "universe" |
False. When a matrix is not square, the vectors in Nul A and Col A live in entirely different "universes" (for example, if A is an mxn matrix, then Col A is a subspace of Rm and Nul A is a subspace of Rn) |
|
General Information about Nul A |
(1) Nul A is a subspace of Rn (2) Nul A is implicitly defined (you are given only the condition Ax=0 that vectors in Nul A must satisfy) (3) It takes time to find vectors in Nul A. Row operations on [A 0] are required. (4) There is no obvious relation between Nul A and the entries of A (5) A typical vector v in Nul A has the property that Av=0 (6) Given a specific vector v, it is easy to tell if v is in Nul A. Just compute Av (7) Nul A={0} if and only if the equation Ax=0 has only the trivial solution (8) Nul A={0} if and only if the linear transformation x-->Ax is one-to-one |
|
General Information about Col A |
(1) Col A is a subspace of Rm (2) Col A is explicitly defined (you are told how to build vectors in Col A) (3) It is easy to find vectors in Col A; the columns of A are displayed, others are formed from them (4) There is an obvious relation between Col A and the entries in A, since each column of A is in Col A (5) A typical vector v in Col A has the property that the equation Ax=v is consistent (6) Given a specific vector v, it may take time to tell if v is in Col A; row operations on [A v] are required (7) Col A=Rm if and only if the equation Ax=b has a solution for every b in Rm (8) Col A=Rm if and only if the linear transformation x-->Ax maps Rn onto Rm |
|
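The "different universes" point from the earlier card can be made concrete: for a 2x3 matrix, vectors in Nul A have 3 entries while vectors in Col A have 2. A short sketch, assuming SymPy (the example matrix is arbitrary):

```python
from sympy import Matrix

# A is 2x3, so m = 2 and n = 3.
A = Matrix([[1, 0, 2],
            [0, 1, 3]])

null_basis = A.nullspace()     # vectors in Nul A live in R^3
col_basis = A.columnspace()    # vectors in Col A live in R^2

assert all(v.shape == (3, 1) for v in null_basis)  # Nul A is a subspace of R^n
assert all(v.shape == (2, 1) for v in col_basis)   # Col A is a subspace of R^m
```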
What is a linear transformation T (from a vector space V into a vector space W)? |
it is a rule that assigns to each vector x in V a unique vector T(x) in W, such that: (1) T(u+v)=T(u)+T(v) for all u,v in V and (2) T(cu)=cT(u) for all u in V and all scalars c |
|
kernel of linear transformation T |
the null space of T; the set of all u in V such that T(u)=0 (the zero vector in W); it is a subspace of V |
|
range of linear transformation T |
the set of all vectors in W of the form T(x) for some x in V; it is a subspace of W
if T happens to arise as a matrix transformation--say, T(x)=Ax for some matrix A--then the range of T is just the column space of A |
|
linear independence |
an indexed set of vectors {v1,..,vp} in V such that the vector equation c1v1+c2v2+...+cpvp=0 has only the trivial solution, c1=0,...,cp=0 |
|
linear dependence |
an indexed set of vectors {v1,..,vp} in V such that the vector equation c1v1+c2v2+...+cpvp=0 has a nontrivial solution, i.e. weights c1,...,cp that are not all 0 |
|
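Independence can be tested computationally: the columns v1,...,vp are linearly independent exactly when the matrix with those columns has rank p, since Ax=0 then has only the trivial solution. A sketch assuming NumPy (the helper `independent` is an illustrative name):

```python
import numpy as np

# {v1,...,vp} is linearly independent iff the matrix with those columns
# has rank p (so c1 v1 + ... + cp vp = 0 forces all ci = 0).
def independent(*vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

assert independent(v1, v2)
assert not independent(v1, v2, v1 + 2 * v2)  # third vector is a combination
```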
True or False: a set containing a single vector v is linearly dependent if and only if v does not equal zero |
False. A set containing a single vector v is linearly independent if and only if v is not the zero vector |
|
True or False: any set containing the zero vector is linearly dependent |
True (taking weight 1 on the zero vector and weight 0 on every other vector gives a nontrivial solution of the dependence equation) |
|
When is an indexed set {v1,..,vp} of two or more vectors, with v1 not equaling zero, linearly dependent? |
it's linearly dependent if and only if some vj (with j>1) is a linear combination of the preceding vectors, v1,..,vj-1 |
|
basis (for subspace H of vector space V) |
Let H be a subspace of a vector space V. An indexed set of vectors B={b1,..,bp} in V is a basis for H if: (1) B is a linearly independent set, and (2) the subspace spanned by B coincides with H, so that H=Span{b1,..,bp}
|
|
Does our definition of a basis apply to the case H=V? |
Yes, because any vector space is a subspace of itself. |
|
Suppose A is an invertible nxn matrix. Do the columns of A form a basis for Rn? |
Yes, because they are linearly independent and they span Rn, by the Invertible Matrix Theorem |
|
standard basis for Rn |
the set {e1,..,en} where e1,..,en are the columns of the nxn identity matrix, In
|
|
What is the Spanning Set Theorem? |
Let S={v1,..,vp} be a set in V, and let H=Span{v1,..,vp}. Then: (1) If one of the vectors in S (say, vk) is a linear combination of the remaining vectors in S, then the set formed from S by removing vk still spans H. (2) If H doesn't equal {0}, some subset of S is a basis of H |
|
When matrix A is row reduced to matrix B, the columns of B are often totally different from the columns of A. Do they have the same linear dependence relationships? |
Yes. Despite having different columns, the equations Ax=0 and Bx=0 have exactly the same solution set, so the columns of A and the columns of B have exactly the same linear dependence relationships. |
|
True or False: The pivot columns of matrix A form a basis for Col A. |
True. Every nonpivot column of A is a linear combination of the pivot columns of A. Thus the nonpivot columns of A may be discarded from the spanning set for Col A (by the Spanning Set Theorem). This leaves the pivot columns of A as a basis for Col A.
WARNING: The pivot columns of a matrix A are evident when A has been reduced only to echelon form. Be careful to use the pivot columns of A itself for the basis of Col A. |
|
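The warning above can be sketched in code: row reduce only to learn which columns are pivot columns, then take those columns from A itself, not from the echelon form. Assuming SymPy, whose `Matrix.rref()` returns the reduced form together with the pivot column indices (the example matrix is arbitrary):

```python
from sympy import Matrix

# Column 2 of A is 2 * column 1, so it is not a pivot column.
A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 2]])

B, pivot_cols = A.rref()                 # B is rref(A); pivot_cols are indices
basis = [A.col(j) for j in pivot_cols]   # take columns of A itself, NOT of B

assert list(pivot_cols) == [0, 2]
assert len(basis) == A.rank()            # the pivot columns form a basis of Col A
```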
True or False: A basis is a linearly independent set that is as large as possible. |
True. The deletion of vectors from a spanning set must stop when the set becomes linearly independent. If an additional vector is deleted, it will not be a linear combination of the remaining vectors, and the smaller set will no longer span V |