53 Cards in this Set

  • Front
  • Back

basis

a set of vectors S = {v1, v2, ..., vn} in a vector space V is a BASIS for V when:


1. S spans V


2. S is linearly independent




(has enough vectors to span V, but not so many that it becomes linearly dependent, i.e. none of the vectors can be written as a combination of the others)

Does every vector space have a basis consisting of a finite number of vectors?

no, but our book only discusses vector spaces that do (the finite-dimensional ones)

finite dimensional

vector space V has a basis with a finite number of vectors

example of infinite dimensional vector space V

vector space P of all polynomials

How would you show that {(1,0,0),(0,1,0),(0,0,1)} is a basis for R3?

show that it is a spanning set for R3


show that it is linearly independent

standard basis for R3

{(1,0,0),(0,1,0),(0,0,1)}

Is a basis a special subset of spanning sets?

yes. All bases are spanning sets, but not all spanning sets are bases

if a coefficient matrix has a nonzero determinant, what does this mean about the system's solution?

it has a unique solution




you can use this to prove a set is a spanning set

How do you show that a set is a spanning set?

let x be a general vector that can represent any vector in the space, and write the equation:



1. c1v1 + c2v2 + ... + cnvn = x


2. write as a system of linear equations


3. show that the coefficient matrix has a nonzero determinant, which means the system has a unique solution for every x (so any vector can be built from these vectors). Note: the determinant test requires a square coefficient matrix, i.e. n vectors in R^n
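
A minimal numerical sketch of this determinant test (the vectors and the use of numpy are my own illustration, not from the cards):

    # Check whether three vectors span R^3: the matrix with the vectors as
    # columns must have a nonzero determinant (example vectors are illustrative).
    import numpy as np

    v1, v2, v3 = (1, 2, 3), (0, 1, 2), (-2, 0, 1)
    A = np.column_stack([v1, v2, v3])   # coefficient matrix of c1*v1 + c2*v2 + c3*v3 = x

    if not np.isclose(np.linalg.det(A), 0):
        print("nonzero determinant -> unique solution for every x -> spanning set")
    else:
        print("zero determinant -> the vectors do not span R^3")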

a few different ways to show that something is linearly independent

take the general system of equations with the general vector, set it equal to zero so the system is homogeneous, then solve and show that only the trivial solution exists




OR, for a set of two vectors, show that the vectors are NOT scalar multiples of each other
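
A minimal sketch of the homogeneous-system test with sympy (the example vectors are my own):

    # The vectors are linearly independent exactly when c1*v1 + c2*v2 + c3*v3 = 0
    # has only the trivial solution, i.e. the matrix of the vectors has an empty nullspace.
    from sympy import Matrix

    A = Matrix([[1, 0, -2],
                [2, 1,  0],
                [3, 2,  1]])      # columns are v1, v2, v3

    print(A.nullspace())          # [] -> only the trivial solution -> linearly independent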

uniqueness of basis representation

if S = {v1, v2, ..., vn} is a basis for a vector space V, then every vector in V can be written in one and only one way as a linear combination of vectors in S

if the determinant is nonzero, the system is ______ and _______ it has _________

consistent, the coefficient matrix is invertible, and it has a unique solution

How do you prove uniqueness of basis representation?

show that c1v1 + c2v2 + c3v3 = (u1, u2, u3), where the vi are the basis vectors and (u1, u2, u3) is an arbitrary vector




From the system of linear equations, show that the determinant is nonzero in order to prove it is invertible/has a unique solution

bases and linear dependence

if S = {v1, v2, v3, ..., vn} is a basis for vector space V, then every set containing more than n vectors in V is linearly dependent

number of vectors in a basis

if a vector space V has one basis with n vectors, then every basis for V has n vectors

definition of the dimension of a vector space

if a vector space V has a basis consisting of n vectors, then the number n is the dimension of V, denoted by dim(V) = n

When V consists of only the zero vector, how do you define the dimension of V?

zero

Dimension of R^n with standard operations

n

Dimension of Pn with standard operations

n+1

Dimension of Mm,n with the standard operations

m * n

technique for finding the dimension of a subspace

find a set of linearly independent vectors that spans the subspace. The number of vectors in this basis gives you the dimension

every 2x2 symmetric matrix has the form....

a[1 0; 0 0] + b[0 1; 1 0] + c[0 0; 0 1]  (each 2x2 matrix written row by row, with semicolons separating the rows)
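
The same decomposition written out as matrices (a LaTeX rendering of the card's claim; it also shows why the subspace of 2x2 symmetric matrices has dimension 3):

    \begin{bmatrix} a & b \\ b & c \end{bmatrix}
    = a\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}
    + b\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}
    + c\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}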

2 different ways to test basis in n-Dimensional space

Let V be a vector space of dimension n


1. If S = {v1, v2, ..., vn} is a linearly independent set of vectors in V, then S is a basis for V


2. If S = {v1, v2, ..., vn} spans V, then S is a basis for V




Of note: the set must contain exactly n = dim(V) vectors for either shortcut to apply; otherwise you must show both that it spans V and that it is linearly independent

row space of a matrix

row space of A is the subspace of R^n spanned by the row vectors of A

column space of a matrix

column space of A is the subspace of R^m spanned by the column vectors of A

If a matrix is row-equivalent to another matrix with the same dimensions, are their row spaces the same?

yes. they have the same row space

Do elementary row operations change the row space of a matrix? Do elementary row operations change the column space of a matrix?

the row space does not change; the column space can change

Basis for the row space of a matrix:




If a matrix A is row-equivalent to a matrix B in row-echelon form, what does this imply about a basis for the row space of A?

the nonzero row vectors of B form a basis for the row space of A

How do you find the basis for a row space?

1. Write each vector as a row


2. Put in row-echelon form (doesn't have to be Gauss-Jordan, just Gauss)


3. The nonzero row vectors of the final matrix form a basis for the row space of the original matrix
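
A minimal sketch of this procedure with sympy (the example matrix is my own):

    # Nonzero rows of a row-echelon form of A give a basis for the row space of A.
    from sympy import Matrix

    A = Matrix([[ 1, 3, 1,  3],
                [ 0, 1, 1,  0],
                [-3, 0, 6, -1]])

    E = A.echelon_form()                                        # Gaussian elimination only
    basis = [E.row(i) for i in range(E.rows) if any(E.row(i))]  # keep the nonzero rows
    print(basis)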

How do you find the basis for a subspace spanned by S = {v1, v2, v3, ...} in R^n?

1. Write each vector as a row


2. Put in row-echelon form


3. Non-zero row vectors are the basis

What are the two ways to find the basis for the column space of a matrix?

Option 1: Use the fact that the column space of A equals the row space of A^T, and apply the row-space technique to A^T




Option 2: Recognize that although elementary row operations can change the column space, they preserve the dependency relationships among the columns

How do you find the basis for the column space using dependency relationships between columns?

1. Reduce to row-echelon form


2. Notice the columns with leading 1's (the pivot columns). These columns are linearly independent.


3. The corresponding columns of the original matrix form a basis for the column space
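
A minimal sketch of the pivot-column method with sympy (the example matrix is my own):

    # rref() returns the reduced row-echelon form plus the indices of the pivot columns;
    # the corresponding columns of the ORIGINAL matrix form a basis for the column space.
    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 4],
                [3, 6, 1, 5]])

    R, pivot_cols = A.rref()
    basis = [A.col(j) for j in pivot_cols]
    print(pivot_cols, basis)                 # pivots (0, 2) -> columns 1 and 3 of A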

Do the row space and column space of a matrix have the same dimension?

yes

rank of a matrix

dimension of the row (or column) space of a matrix A.




Denoted by rank(A)

How do you find the rank of a matrix?

find a basis for row space or column space. Number of vectors in the basis is the rank




(convert to row-echelon form and count the nonzero rows)
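
A quick sketch of computing the rank (same illustrative matrix as above; numpy and sympy are my own choice of tools, not from the cards):

    import numpy as np
    from sympy import Matrix

    A = [[1, 2, 0, 1],
         [2, 4, 1, 4],
         [3, 6, 1, 5]]

    print(np.linalg.matrix_rank(np.array(A)))   # 2
    print(Matrix(A).rank())                     # 2 (number of nonzero rows in echelon form)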

nullspace of A

If A is an m x n matrix, then the set of all solutions of the homogeneous system of linear equations Ax = 0 is a subspace of R^n called the nullspace of A




Also called the solution space of Ax = 0

nullity of A

dimension of the nullspace of A

How do you find the nullspace of a matrix?

1. Put the coefficient matrix in reduced row echelon form


2. Write the parametric equations


3. Write as vectors


4. Factor out the parameters (free variables). The remaining vectors form a basis for the nullspace
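
A minimal sketch with sympy (the example matrix is my own); nullspace() carries out steps 1-4 in one call:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 4],
                [3, 6, 1, 5]])

    for v in A.nullspace():       # basis vectors for the solution space of Ax = 0
        print(v.T)                # (-2, 1, 0, 0) and (-1, 0, -2, 1) for this example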

Once a matrix is in reduced row echelon form, which columns determine the rank of a matrix? Which ones determine its nullity?

rank: the number of columns with leading 1's


nullity: the number of columns that correspond to free variables

dimension of the solution space

If A is an m x n matrix of rank r, then the dimension of the solution space of Ax = 0 is n - r. That is, n = rank(A) + nullity(A)
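
A quick check of rank(A) + nullity(A) = n with the same illustrative matrix as above:

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 4],
                [3, 6, 1, 5]])

    rank, nullity, n = A.rank(), len(A.nullspace()), A.cols
    assert rank + nullity == n    # 2 + 2 == 4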

Why isn't the set of all solution vectors of the nonhomogeneous system Ax = b (with b ≠ 0) ever a subspace?

it does not contain the zero vector, so it can't be a subspace

solutions of a nonhomogeneous linear system

if xp is a particular solution of the nonhomogeneous system Ax = b, then every solution of this system can be written in the form x = xp + xh, where xh is a solution of the corresponding homogeneous system Ax = 0

How do you find the solution set of a nonhomogeneous system?

1. Put matrix in reduced row echelon form


2. Write parametric representation


3. Write as vectors and factor out coefficients


4. Set the result equal to x = xp + s*u1 + t*u2 + ..., where xh = s*u1 + t*u2 + ... represents an arbitrary vector in the solution space of Ax = 0
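
A minimal sketch of the xp + xh decomposition with sympy (the example system is my own):

    from sympy import Matrix, symbols, linsolve

    x1, x2, x3, x4 = symbols('x1 x2 x3 x4')
    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 4],
                [3, 6, 1, 5]])
    b = Matrix([1, 4, 5])

    print(linsolve((A, b), x1, x2, x3, x4))  # general solution: particular solution xp plus...
    print(A.nullspace())                     # ...any combination of these vectors (the xh part)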

How can the column space of a matrix be used to determine whether a system of linear equations is consistent?

The system Ax = b is consistent if and only if b is in the column space of A. That is, the system is consistent if and only if b is in the subspace of R^m spanned by the columns of A
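
A minimal sketch of this consistency test with sympy (same illustrative A and b as above): appending b to A must not increase the rank, which is the same as saying b lies in the column space of A.

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 4],
                [3, 6, 1, 5]])
    b = Matrix([1, 4, 5])

    print(A.rank() == Matrix.hstack(A, b).rank())   # True -> Ax = b is consistent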

If you're finding the basis for the nullspace of a matrix and only the zero vector remains, what's the answer?

the nullspace is the zero subspace {0}; it has dimension 0 and no basis vectors

Coordinate representation relative to a basis / coordinate matrix

Let B = {v1, v2, ..., vn} be an ordered basis for vector space V and let x be a vector in V such that:


x = c1v1 + c2v2 + ... + cnvn




The scalars are the coordinates of x relative to B. The coordinate matrix/coordinate vector of x relative to B is the matrix in R^n whose components are those coordinates.

How do you write the answer for finding the coordinate matrix in R^3 relative to the standard basis? What about relative to a nonstandard basis?

[x]s =[enter coordinates as a column]


[x]B =[enter coordinates as a column]


(the coordinates are the scalars in the linear combination of the basis vectors)
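
A minimal sketch of finding [x]B for a nonstandard basis with sympy (the basis and the vector x are my own example):

    # The coordinates are the scalars c solving c1*v1 + c2*v2 + c3*v3 = x,
    # i.e. the solution of B*c = x where the columns of B are the basis vectors.
    from sympy import Matrix

    B = Matrix([[1, 0, 1],
                [0, 1, 1],
                [0, 0, 1]])       # columns are v1, v2, v3
    x = Matrix([2, 3, 4])

    print(B.solve(x))             # the coordinate matrix [x]_B, written as a column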

change of basis

you're given the coordinates of a vector relative to one basis and asked to find them relative to another. Basically, multiply out the linear combination to get the actual vector, then solve for its representation as a linear combination of the vectors in the new basis

inverse of a transition matrix

If P is the transition matrix from basis B' to basis B in R^n, then P is invertible and the transition matrix from B to B' is P^-1

simpler way of explaining what the transition matrix is

Once you've multiplied out the scalars to form a linear combination and written down the coefficient matrix used to solve for the scalars in that system of equations, that matrix is the transition matrix

How do you find a transition matrix from B to B'?

1. Write 2 matrices from vector sets, where each individual vector is written as a COLUMN


2. Form the matrix [ B' B]


3. Use Gauss-Jordan elimination to rewrite as [I P^-1]


4. P^-1 is the transition matrix from B to B'
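
A minimal sketch of the [B' B] -> [I P^-1] procedure with sympy (the two bases are my own example):

    from sympy import Matrix

    B  = Matrix([[1, 0], [0, 1]])        # columns are the vectors of basis B (standard here)
    Bp = Matrix([[1, 1], [0, 1]])        # columns are the vectors of basis B'

    M, _ = Matrix.hstack(Bp, B).rref()   # Gauss-Jordan elimination on [B' B]
    P_inv = M[:, Bp.cols:]               # right-hand block = transition matrix from B to B'
    print(P_inv)                         # equals Bp**-1 here, matching the next card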

Simple way to find transition matrix from standard basis to nonstandard basis

Transition matrix from B to B' where B is a standard basis and B' is a nonstandard basis:




P^-1 = (B')^-1 (the inverse of the matrix whose columns are the nonstandard basis vectors)

Simple way to find transition matrix from nonstandard basis to standard basis

Transition matrix from B to B' where B is a nonstandard basis and B' is a standard basis:




P^-1 = B (the matrix whose columns are the nonstandard basis vectors)