
81 Cards in this Set

  • Front
  • Back

Gaussian Elimination

Solving a system like a normal person: row reduce with elementary row operations, then back-substitute. The resulting matrix is in echelon form.

echelon form

A matrix has this form if...


1. Every leading term is in a column to the left of the leading term of the row below it.


2. Any zero rows are at the bottom of the matrix.
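
For instance (my own example, not from the notes), this matrix is in echelon form: each leading term sits in a column left of the leading term below it, and the zero row is last.


[ 1  2  3 ]
[ 0  2  4 ]
[ 0  0  0 ]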

reduced row echelon form

A matrix has this form if...


1. It is in echelon form


2. All pivot positions contain a 1


3. The only nonzero term in a pivot column is in the pivot position.
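
Continuing the made-up example from the previous card: that echelon-form matrix reduces to


[ 1  0  -1 ]
[ 0  1   2 ]
[ 0  0   0 ]


which is in reduced row echelon form: every pivot is a 1, and each pivot is the only nonzero entry in its column.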

Gauss-Jordan Elimination

For when you want to do extra work for no reason: keep row reducing past echelon form until every pivot is a 1 and is the only nonzero entry in its column. The resulting matrix is in reduced row echelon form.

pivot elements

The leading (first nonzero) entries of each row once the matrix is in echelon form; these are the entries that get circled in the notes, and their positions are the pivot positions.

trivial solution

The solution where every variable equals 0 and then everyone parties

consistent system

If the system has exactly one solution or infinitely many solutions

inconsistent system

If the system has no solutions

system of linear equations

if you don't know what this is, you're doomed

Elementary operations (row operations)

You are allowed to...


1. Interchange 2 equations


2. Multiply an equation by a nonzero constant


3. Add a multiple of one equation to another




I mean, duh...

linearly independent

If the only solution to the system is the trivial solution

linearly dependent

If there is a nontrivial solution

homogeneous equation

If the equation is set equal to 0 (there is no constant term)

homogeneous system

If all the equations in the system equal 0

row vector

A vector written as a horizontal row (a 1xn matrix), e.g. [1  2  3]

column vector

A vector written as a vertical column (an nx1 matrix); same entries, stacked top to bottom

algebraic properties of vectors

u + v = v + u, (u + v) + w = u + (v + w), r(u + v) = ru + rv, (r + s)u = ru + su, and so on. Literally learned these in every math/physics class since high school.

linear combination

Any vector of the form x1u1 + x2u2 + ... + xmum, where the xi are scalars (the weights). yeah man.

span

The set of all possible linear combinations x1u1 + x2u2 + . . . + xmum of vectors u1, u2, ..., um in R^n




written as: span{u1, u2, ... , um}

How do you tell if a given vector is in the span of vectors?

Row reduce the augmented matrix [u1 u2 ... um | b]. The system has a solution iff the given vector b is in the span.
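
A quick made-up example: is b = [3, 1] in span{u1, u2} for u1 = [1, 1] and u2 = [1, -1]? Row reduce the augmented matrix:


[ 1   1 | 3 ]      [ 1   1 |  3 ]
[ 1  -1 | 1 ]  ->  [ 0  -2 | -2 ]


Consistent: x2 = 1, x1 = 2, so b = 2u1 + u2 is in the span.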

Theorems about span

1. If u1, u2, ..., um are vectors in R^n and b is in span{u1, ..., um}, then span{u1, ..., um} = span{b, u1, ..., um}




2. If u1, ... um are vectors in R^n and m < n, then u1,...um do not span R^n

nxm matrix

Has n rows and m columns

Does the matrix span R^3?

No. Since the echelon form has a row of 0's, there exists a vector not in the span of the column vectors, so the columns of the matrix do not span R^3. (The matrix itself was an image that didn't survive.)
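
A stand-in example (mine, since the original matrix was an image):


[ 1  0  2 ]
[ 0  1  1 ]
[ 0  0  0 ]


Every combination of these columns has third entry 0, so a vector like [0, 0, 1] is not in their span; the columns do not span R^3.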

Theorems about linear independence/dependence

1. Suppose {0, u1, ... um} is a set of vectors in R^n. Then this set is linearly dependent.




2. Suppose {u1, u2, ... um} is a set of vectors in R^n. If n < m, then the set of vectors is linearly dependent.




3. Let {u1, ... um} be a set of vectors in R^n. The set is linearly dependent iff one of the vectors is in the span of the other vectors

particular solution

a fixed solution to the system

homogeneous solution

the solution to the homogeneous system

Theorem about vectors...

Let a1, a2, ... am and b be vectors in R^n. Then the following statements are equivalent.


1. the set {a1, ... am} is linearly independent


2. the vector equation x1a1 + ... + xmam = b has at most 1 solution.


3. The linear system corresponding to [a1, a2, ... am, b] has at most 1 solution


4. The equation Ax = b with A = [a1, a2, ... am] has at most 1 solution

Big Theorem

Let a = {a1, ..., an} be a set of n vectors in R^n, let A = [a1, ... an], and let T: R^n -> R^n be given by T(x) = Ax. Then the following are equivalent....


a.) a spans R^n


b.) a is linearly independent


c.) Ax = b has a unique solution for all b in R^n


d.) T is one-to-one


e.) T is onto


f.) matrix A is invertible


g.) ker(T) = {0}


h.) a is a basis for R^n



linear transformation

A function T: R^m -> R^n is a linear transformation if for all vectors u and v in R^m and all scalars r,


(a) T(u + v) = T(u) + T(v)


(b) T(ru) = rT(u)




or combine (a) and (b) to get:


T(ru + sv) = rT(u) + sT(v)
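
A quick check on a made-up map: T: R^2 -> R^2 with T(x1, x2) = (2x1, x1 + x2) satisfies both conditions, since T(u + v) = (2(u1 + v1), (u1 + v1) + (u2 + v2)) = T(u) + T(v) and T(ru) = (2ru1, ru1 + ru2) = rT(u). By contrast, T(x1, x2) = (x1 + 1, x2) is not linear: it sends 0 to (1, 0), and a linear transformation must send 0 to 0.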

domain

is R^m

codomain

is R^n

image of u under T

is T(u)

range of T

The set of all image vectors. Denoted range(T)




All y in R^n such that T(x) = y for some x in R^m, where T: R^m -> R^n

linear transformation theorems

1. Let A be an nxm matrix and define T(x) = Ax. Then T: R^m -> R^n is a linear transformation.




2. Let A = [a1 ... am] be an nxm matrix and let T: R^m -> R^n with T(x) = Ax be a linear transformation.


(a.) The vector w is in range(T) iff Ax = w is a consistent linear system


(b.) range(T) = span{a1, ..., am}

one-to-one

If T: R^m -> R^n is a linear transformation, T is one-to-one if for every vector w in R^n, there exists at most one vector u in R^m such that


T(u) = w.




Equivalently: if T(u) = T(v), then u = v

onto

If T: R^m -> R^n is a linear transformation, T is onto if for every vector w in R^n there exists at least one vector u in R^m such that T(u) = w.

Theorem about one-to-one

Let A be an nxm matrix and define T: R^m -> R^n by T(x) = Ax. Then...




(a.) T is 1-1 iff the columns of A are linearly independent




(b.) If n < m then T is not one-to-one

Theorem about onto

Let A be an nxm matrix and define T: R^m -> R^n by T(x) = Ax. Then...




(a.) T is onto iff the columns of A span the codomain R^n




(b.) If n > m, then T is not onto

How could a transformation be both one-to-one and onto?

The number of columns and the number of rows in the matrix must equal each other. Square matrices for life.

projection transformation

A transformation that drops components, e.g. T(x1, x2, x3) = (x1, x2, 0), which projects vectors in R^3 onto the x1x2-plane. (The picture that showed one didn't survive.)

matrix multiplication

(mxn)(nxp) = mxp: the inner dimensions must match, and the product takes the outer dimensions.
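
A small worked product (my example) showing the dimension rule, a (2x3)(3x1) = 2x1:


[ 1  2  3 ] [ 1 ]   [ 1*1 + 2*0 + 3*2 ]   [  7 ]
[ 4  5  6 ] [ 0 ] = [ 4*1 + 5*0 + 6*2 ] = [ 16 ]
            [ 2 ]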

Samantha

a genius

Identity matrix

The square matrix with 1's down the main diagonal and 0's everywhere else; multiplying by it does nothing (AI = IA = A). durdurdur.

Matrix properties

A(BC) = (AB)C


A(B + C) = AB + AC


(A + B)C = AC + BC


s(AB) = (sA)B = A(sB)


AI = A


IA = A

Theorem with non-zero matrices

Let A, B, and C be non-zero matrices.




(a.) It is possible that AB doesn't equal BA




(b.) AB = 0 doesn't imply A = 0 or B = 0




(c.) AC = BC doesn't imply that A = B or C = 0
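
Concrete 2x2 witnesses (my own; plenty of other pairs work):


A = [ 0  1 ]    B = [ 0  0 ]
    [ 0  0 ]        [ 0  1 ]

AB = [ 0  1 ]   but   BA = [ 0  0 ]
     [ 0  0 ]              [ 0  0 ]


So AB ≠ BA, and BA = 0 even though neither A nor B is 0.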

Theorem with matrix transposes

A and B are both nxm matrices and C is an mxk matrix and s is a scalar




(a.) (A+B)^T = A^T + B^T




(b.) (sA)^T = s(A^T)




(c.) (AC)^T = (C^T)(A^T)

transpose of a matrix

make the rows into the columns; denoted A^T
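
For example (mine), a 2x3 matrix transposes to a 3x2 matrix:


A = [ 1  2  3 ]        A^T = [ 1  4 ]
    [ 4  5  6 ]              [ 2  5 ]
                             [ 3  6 ]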

diagonal matrix

A square matrix that is zero everywhere except (possibly) the main diagonal

upper triangular matrix

All entries below the main diagonal are 0, so the numbers in the upper right make a triangle

lower triangular matrix

All entries above the main diagonal are 0, so the numbers in the lower left make a triangle

invertible transformation

If T: R^m -> R^n is one-to-one and onto

inverse function for transformation

The function T^-1: R^n -> R^m that undoes T: T^-1(T(x)) = x and T(T^-1(y)) = y. yep.

inverse/linear transformation theorem

Let T: R^m -> R^n be a linear transformation. Then...




(a.) T has an inverse only if m = n




(b.) If T is invertible, then T^-1 is also a linear transformation

invertible matrix

An nxn matrix A is invertible if there exists an nxn matrix B such that AB = In

invertible matrix (obvious) theorem

Suppose A is invertible with AB = In. Then BA = In and the matrix B is such that AB = BA = In and B is unique.

inverse of matrix

If the nxn matrix A is invertible, then A^-1 is called the inverse of A and is the unique matrix such that A(A^-1) = In and (A^-1)A = In

nonsingular

a matrix that is invertible

singular

a matrix that is not invertible

Theorem about lots of invertible matrix stuff

Let A and B be invertible nxn matrices and C and D be nxm matrices. Then...




(a.) AB is also invertible, (AB)^-1 = (B^-1)(A^-1)


(b.) A^-1 is invertible with (A^-1)^-1 = A


(c.) If AC = AD, then C = D


(d.) If AC = 0nxm then C = 0nxm

How to find the inverse of a matrix

Row reduce the augmented matrix [A | In]; if A is invertible, you end up with [In | A^-1]. There's also some "shortcut" for a 2x2 matrix, but that's just another poop formula to memorize and no one wants that.
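
Fine, for the record, the standard 2x2 shortcut (not from these notes): for A = [a b; c d] with ad - bc ≠ 0,


A^-1 = 1/(ad - bc) [  d  -b ]
                   [ -c   a ]


e.g. A = [1 2; 3 4] has ad - bc = -2, so A^-1 = [-2  1; 3/2  -1/2].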

Elementary Matrices

E3 * E2 * E1 * A = U, where U is A in REF. Each E is the identity matrix with one of the row operations used on A applied to it, so multiplying by E carries out that operation. (Example in notes from 2/12/16)
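
A small illustration (my own): to perform R2 -> R2 - 3R1 on a 2x2 matrix, apply that operation to I2 and multiply:


E = [  1  0 ]    E [ 1  2 ]  =  [ 1   2 ]
    [ -3  1 ]      [ 3  4 ]     [ 0  -2 ]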

subspace

A subset S of R^n is a subspace if...




(a.) S contains 0 (the 0 vector)




(b.) If u and v are in S, then u + v is also in S




(c.) For r, a real number and u in S, ru is also in S

subspace theorems

1. Let S = span{u1, u2, ... um} be a subset of R^n. Then S is a subspace of R^n.




2. If A is an nxm matrix, then the set of solutions to the homogeneous linear system Ax = 0 forms a subspace of R^m

trivial subspace

S = {0} is a subspace of R^n.


S = R^n is a subspace of R^n.




Those 2 are the trivial subspaces of R^n

null space

If A is an nxm matrix, then the set of solutions to Ax = 0 is called the null space of A and is denoted by null(A)




still don't really understand how you write this out... she's so dang confusing (and doesn't know what she's doing,) but whatever
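
Here's one way to actually write it out (my own example): for


A = [ 1  2 ]
    [ 2  4 ]


row reducing Ax = 0 gives x1 + 2x2 = 0, so x1 = -2x2 with x2 free. Every solution has the form x = x2 * [-2, 1] (as a column vector), so null(A) = span{[-2, 1]}.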

kernel of T

The set of vectors x such that T(x) = 0.




denoted ker(T)

kernel/range theorem

Let T: R^m -> R^n be a linear transformation. Then... the kernel of T is a subspace of the domain (R^m) and the range of T is a subspace of the codomain (R^n).

general theorem... (I don't know... It actually has a special name unlike all the other theorems in my notes...)

Let T:R^m -> R^n be a linear transformation.




Then... T is 1-1 iff ker(T) = {0}

basis

A set B = {u1, ... um} is a basis for a subspace S if




(a.) B spans S



(b.) B is linearly independent

basis theorems

Let B = {u1, ... um} be a basis for a subspace S. Then every vector s in S can be written as a linear combination s = s1u1 + ... + smum in exactly one way.

subspace of rows theorem

Let A and B be equivalent matrices (meaning one is transformed to the other by row operations.) Then the subspace spanned by the rows of A is the same as the subspace spanned by the rows of B.

dimension theorem

If S is a subspace of R^n, then every basis of S has the same number of vectors.

dimension

the dimension of S is the number of vectors in any basis of S.




denoted dim(S)

equivalent matrices

2 matrices are equivalent if one is transformed to the other by row operations

theorem about equivalent matrices

Suppose U = [u1, ... um] and V = [v1, ... vm] are 2 equivalent matrices. Then any linear dependence that exists among vectors u1 ... um also exists among vectors v1, ..., vm.




yo, this is kinda common sense.

theorem about forming basis for a subspace

Let U = {u1,...,um} be a set of vectors in a subspace S, where S doesn't equal {0} or R^n.


(a.) If U is linearly independent, then either U is a basis for S or additional vectors can be added to U to form a basis


(b.) If U spans S, then either U is a basis for S or vectors can be removed from U to form a basis

theorem about vectors forming basis that's pretty much the same thing as the other theorem about it

Let U = {u1,...,um} be a set of m vectors in a subspace S with dim(S) = m. If U is either linearly independent or spans S, then U is a basis for S

subspaces/dimension theorem

Suppose S1 and S2 are both subspaces of R^n and S1 is a subset of S2. Then...




dim(S1) <= dim(S2)




and dim(S1) = dim(S2) only if S1 = S2

theorem about subspaces and dimension again

Let U = {u1, ... um} be a set of vectors in a subspace S of dimension k.



(a.) If m < k, then U doesn't span S




(b.) If m > k, then U is linearly dependent

standard basis

{e1, e2,...,en} form a basis for R^n and are called the standard basis; ei has a 1 in the i-th entry and 0's everywhere else (in R^3: e1 = [1, 0, 0], e2 = [0, 1, 0], e3 = [0, 0, 1]).

nullity

the nullity of a matrix A is the dimension of the null space of A




denoted by nullity(A)




special case: If you get the 0 vector for the null space, then the nullity for the matrix is 0.
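
Tying it to the null space example above: A = [1 2; 2 4] has null(A) = span{[-2, 1]}, a basis with one vector, so nullity(A) = 1.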