78 Cards in this Set
SYSTEM OF LINEAR EQUATIONS

A collection of linear equations with the same variables.



LINEAR EQUATION

An equation that can be written as c1x1 + c2x2 + ... + cnxn = b, where the coefficients c1, ..., cn and the constant b are real numbers.



SOLUTION TO A SYSTEM OF LINEAR EQUATIONS

A collection of values that make the system true.



INCONSISTENT SYSTEM

A system with no solution.



CONSISTENT SYSTEM

A system with at least one solution.



EQUIVALENT SYSTEMS

Two linear systems that have the same solution set.



How many solutions can a system have?

The only choices are:
1. Exactly one
2. Zero (none)
3. Infinitely many


ROW ECHELON FORM

The form a matrix is in if:
1. All nonzero rows are above any rows of all zeros.
2. Each leading entry is to the right of the leading entry of the row above.
3. All entries in the column below a leading entry are zeros.
NOTE: every matrix has an echelon form, but it is not unique.


REDUCED ECHELON FORM

The form a matrix is in if:
1. All nonzero rows are above any rows of all zeros.
2. Each leading entry is to the right of the leading entry of the row above.
3. All entries in the column below a leading entry are zeros.
4. The leading entry in each nonzero row is 1.
5. Each leading 1 is the only nonzero entry in its column.
NOTE: every matrix has a unique reduced echelon form.

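The two NOTEs above can be made concrete with a tiny row-reduction routine. A minimal Python sketch (the name `rref` and the routine itself are mine, not part of the deck), using exact fractions to avoid round-off:

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of row lists) to reduced echelon form, exactly."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        # find a row at or below pivot_row with a nonzero entry in this column
        pivot = next((r for r in range(pivot_row, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]               # interchange
        m[pivot_row] = [x / m[pivot_row][col] for x in m[pivot_row]]  # scale leading entry to 1
        for r in range(len(m)):                                       # clear the rest of the column
            if r != pivot_row and m[r][col] != 0:
                m[r] = [a - m[r][col] * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
        if pivot_row == len(m):
            break
    return m

# The augmented matrix of x + 2y = 5, 3x + 4y = 11 reduces so the last column
# reads off the unique solution (x, y) = (1, 2).
print(rref([[1, 2, 5], [3, 4, 11]]))
```

Whatever order of row operations you use, the reduced echelon form that comes out is the same, which is exactly the uniqueness NOTE on the card.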

MATRIX

A rectangular array where information about a linear system can be recorded.



COEFFICIENT MATRIX

A matrix with only the coefficients of the variables aligned in columns.



AUGMENTED MATRIX

A matrix that includes coefficients from the left side of the linear system and the constants from the right side of the linear system.



SIZE

Denotes how many rows and columns a matrix has. The size is read "m by n", where m represents the number of rows and n represents the number of columns a matrix has.



ELEMENTARY ROW OPERATIONS

1. REPLACEMENT: Replace one row by the sum of itself and a multiple of another row.
2. INTERCHANGE: Switch two rows.
3. SCALING: Multiply all entries in a row by a nonzero constant.
NOTE: Row operations are reversible.

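The three operations above, written out as one-line Python helpers (a sketch of my own; the function names are not from the deck). Applying a replacement with weight -3 and then with weight +3 shows the reversibility NOTE in action:

```python
# Each elementary row operation, acting on a matrix stored as a list of rows.
def replacement(m, i, j, c):   # R_i -> R_i + c * R_j
    m[i] = [a + c * b for a, b in zip(m[i], m[j])]

def interchange(m, i, j):      # swap R_i and R_j
    m[i], m[j] = m[j], m[i]

def scaling(m, i, c):          # R_i -> c * R_i, c must be nonzero
    assert c != 0
    m[i] = [c * a for a in m[i]]

A = [[1, 2], [3, 4]]
replacement(A, 1, 0, -3)       # zero out the entry below the first pivot
print(A)                       # replacement(A, 1, 0, 3) would undo this step
```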

ROW EQUIVALENT

Two matrices are row equivalent if there is a sequence of elementary row operations that transforms one matrix into the other.



EXISTENCE & UNIQUENESS

1. Is the system consistent; that is, does at least one solution exist?
2. If a solution exists, is it the only one; that is, is the solution unique?


LEADING ENTRY

The leftmost nonzero entry (in a nonzero row).



PIVOT POSITION

A pivot position in matrix A is a location in A that corresponds to a leading 1 in the reduced echelon form of A.



PIVOT COLUMN

A pivot column is a column of A that contains a pivot position.



BASIC/LEADING/PIVOT VARIABLES

Variables in A that correspond to pivot columns. All other variables are FREE VARIABLES.



GENERAL SOLUTION OF A LINEAR SYSTEM

A description of variables in terms of free variables and constants, so that the solution gives an explicit description of all solutions.



COLUMN VECTOR

A matrix with only one column.



EQUALITY OF VECTORS IN R2

Two vectors in R2 are equal iff their corresponding entries are equal.



SCALAR MULTIPLE OF A VECTOR

The scalar multiple of vector u by constant c is the vector cu obtained by multiplying each entry in u by c.



LINEAR COMBINATION

Given vectors v1, v2, ... , vp and weights c1, ... , cp, the linear combination is y = c1v1 + c2v2 + ... + cpvp.



SPAN

If v1, ... , vp are in Rn, then the set of all linear combinations of v1, ... , vp is denoted by Span {v1, ..., vp} and is called the subset of Rn spanned (or generated) by v1, ... , vp. That is, Span {v1, ... , vp} is the collection of all vectors that can be written in the form c1v1 + c2v2 + ... + cpvp with c1, ... , cp scalars.



Column Vector

A matrix with only one column, also called a vector.



Vector Equality

Two vectors in R2 are equal iff their corresponding entries are equal.



Sum of Vectors

Given two vectors u and v in R2, their sum is the vector u + v obtained by adding corresponding entries of u and v.



Scalar Multiple

Given a vector u and a real number c, the scalar multiple of u by c is the vector cu obtained by multiplying each entry in u by c. The number c in cu is called a scalar.



Parallelogram Rule for Addition

If u and v in R2 are represented as points in the plane, then u + v corresponds to the fourth vertex of the parallelogram whose other vertices are u, 0, and v.



Zero Vector

The vector whose entries are all zero, denoted by a boldface 0.



Linear Combination

Given vectors v1, v2, ... , vp in Rn and given scalars c1, c2, ..., cp, the vector y defined by y = c1v1 + ... + cpvp is called a linear combination of v1, ... , vp with weights c1, ..., cp.



Solutions to Vector Equations

A vector equation x1a1 + x2a2 + ... + xnan = b has the same solution set as the linear system whose augmented matrix is [a1 a2 ... an b]. b can be generated by a linear combination of a1, ..., an iff there exists a solution to the linear system corresponding to the augmented matrix.



Span

If v1, ..., vp are in Rn, then the set of all linear combinations of v1, ..., vp is denoted by Span{v1, ..., vp} and is called the subset of Rn spanned (or generated) by v1, ..., vp. That is, Span{v1, ..., vp} is the collection of all vectors that can be written in the form c1v1 + c2v2 + ... + cpvp with c1, ..., cp scalars.



BASIS VECTORS

(1,0,0), (0,1,0), (0,0,1). The span of these basis vectors generates R3



LINEAR COMBINATION

If A is an m x n matrix with columns a1, …, an and x ∈ Rn, then Ax is the linear combination of the columns of A using the entries of x as the coefficients.


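The definition above is directly computable: Ax is built column by column, never row by row. A small Python sketch (pure lists, no library; `matvec` is my own name for it):

```python
# Ax computed as the linear combination x1*a1 + ... + xn*an of the columns of A.
def matvec(A, x):
    m, n = len(A), len(A[0])
    assert n == len(x)
    result = [0] * m
    for j in range(n):              # add x[j] times column j of A
        for i in range(m):
            result[i] += x[j] * A[i][j]
    return result

A = [[1, 0], [0, 2], [3, 1]]        # 3 x 2 matrix
print(matvec(A, [2, 5]))            # 2 * (1,0,3) + 5 * (0,2,1)
```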

MATRIX MULTIPLICATION REQUIREMENTS

In order to multiply two matrices, the number of columns of the first matrix must equal the number of rows of the second matrix, e.g. multiplying a 3 x 2 matrix by a 2 x 1 matrix yields a 3 x 1 product.


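The dimension rule on this card can be enforced with a single assertion. A minimal Python sketch of matrix multiplication (my own illustration, not the deck's):

```python
# (m x n) times (n x p) gives (m x p); anything else is rejected.
def matmul(A, B):
    assert len(A[0]) == len(B), "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4], [5, 6]]        # 3 x 2
B = [[1], [1]]                      # 2 x 1
print(matmul(A, B))                 # a 3 x 1 product, as the card predicts
```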

EQUALITY OF SYSTEM NOTATION

Linear systems, vector equations, and matrix equations are all interchangeable



Ax=b has a solution iff…

b is in the span of the columns of A



Let A be an m x n matrix. The following statements are logically equivalent:

1. For every b ∈ Rm, the equation Ax = b has a solution.
2. Each b ∈ Rm is a linear combination of the columns of A.
3. The columns of A span Rm.
4. A has a pivot position in every row.
If A lacks a pivot in some row, all four statements fail, and Ax = b is inconsistent for some b.


If A is an m x n matrix, u, v ∈ Rn, and c is a scalar, then:

1. A(u + v) = Au + Av
2. A(cu) = c(Au)



The homogeneous equation Ax=0 has a nontrivial solution iff…

the equation has at least one free variable.



PARAMETRIC VECTOR EQUATION OF THE PLANE

x = su + tv with s, t ∈ R



PARAMETRIC VECTOR FORM

Whenever a solution set is described explicitly with vectors, x = p + tv describes the solution set of Ax = b in parametric vector form



x = p + tv describes...

the equation of the line through p parallel to v



Suppose the equation Ax = b is consistent for some given b, and let p be a solution. What can we say about the solution set in relation to the solution set of Ax = 0?

The solution set of Ax = b is the set of all vectors of the form w = p + vh, where vh is any solution of the homogeneous equation Ax = 0.



Steps for writing a solution set (of a consistent system) in parametric vector form:

1. Row reduce the augmented matrix to reduced echelon form.
2. Express each basic variable in terms of any free variables appearing in an equation.
3. Write a typical solution x as a vector whose entries depend on the free variables, if any.
4. Decompose x into a linear combination of vectors (with numeric entries) using the free variables as parameters.


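The four steps above, run on a toy system (a hypothetical one-equation example of my own): x1 + 2x2 - x3 = 4 is already in reduced echelon form, so x1 is basic and x2, x3 are free, giving x = p + x2·u + x3·v:

```python
# Step 2: basic variable x1 = 4 - 2*x2 + x3; x2 and x3 are free.
# Steps 3-4: decompose x into a constant vector plus free-variable directions.
def solution(x2, x3):
    p = [4, 0, 0]    # particular solution (both free variables set to 0)
    u = [-2, 1, 0]   # direction attached to free variable x2
    v = [1, 0, 1]    # direction attached to free variable x3
    return [p[i] + x2 * u[i] + x3 * v[i] for i in range(3)]

x = solution(3, 5)
assert x[0] + 2 * x[1] - x[2] == 4   # every choice of parameters solves the system
print(x)
```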

LINEAR INDEPENDENCE

An indexed set of vectors {v1, … , vp} in Rn is said to be linearly independent if the vector equation x1v1 + … + xpvp = 0 has only the trivial solution.



LINEAR DEPENDENCE

An indexed set of vectors {v1, … , vp} in Rn is said to be linearly dependent if there exist weights c1, … , cp, not all zero, such that c1v1 + … + cpvp = 0.



The columns of matrix A are linearly independent iff…

the equation Ax = 0 has ONLY the trivial solution.



A set containing only one vector v is linearly independent iff…

v is not the zero vector.



A set of two vectors {v1, v2} is linearly independent iff…

neither of the vectors is a multiple of the other.


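The two-vector test on this card is checkable without row reduction: one vector is a multiple of the other exactly when every 2 x 2 minor of the pair vanishes. A small Python sketch (my own helper, not the deck's):

```python
# {u, v} is linearly dependent iff one vector is a scalar multiple of the other,
# which holds iff every determinant u[i]*v[j] - u[j]*v[i] is zero.
def is_multiple(u, v):
    n = len(u)
    return all(u[i] * v[j] - u[j] * v[i] == 0
               for i in range(n) for j in range(i + 1, n))

print(is_multiple([1, 2, 3], [2, 4, 6]))   # dependent: v = 2u
print(is_multiple([1, 2, 3], [2, 4, 5]))   # independent: no scalar works
```

Note the test also correctly reports a pair containing the zero vector as dependent, matching the zero-vector card below.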

In geometric terms, two vectors are linearly dependent iff…

they lie on the same line through the origin.



An indexed set S = {v1, …, vp} of two or more vectors is linearly dependent iff…

AT LEAST one of the vectors in S is a linear combination of the others.



What is a warning regarding linearly dependent sets?

Not EVERY vector in a linearly dependent set is a linear combination of the preceding vectors. A vector in a linearly dependent set may fail to be a linear combination of the other vectors.



If a set contains more vectors than there are entries in each vector, then the set is…

linearly dependent. Another way to say this: any set {v1, …, vp} in Rn is linearly dependent if p > n.



If a set S= {v1, …, vp}εRn contains the zero vector, then the set is…

linearly dependent: giving the zero vector a nonzero weight and every other vector the weight zero produces a nontrivial dependence relation.



TRANSFORMATION/FUNCTION/MAPPING

Defined as T: Rn → Rm, assigning to each vector x ∈ Rn a vector T(x) ∈ Rm.



DOMAIN OF T

Rn



CODOMAIN OF T

Rm



IMAGE

For x ∈ Rn, the vector T(x) ∈ Rm is the IMAGE of x under the action of T.



RANGE

The RANGE of T is the set of all images T(x); for a matrix transformation T(x) = Ax, it is the set of all linear combinations of the columns of A.



A transformation T is linear iff:

1. T(u + v) = T(u) + T(v) for all u, v in the domain of T.
2. T(cu) = cT(u) for all u and all scalars c.
3. T(0) = 0 (a consequence of 1 and 2).



Every matrix transformation is also…

a linear transformation



STANDARD MATRIX FOR A LINEAR TRANSFORMATION T

Let T: Rn → Rm be a linear transformation. Then there exists a unique matrix A such that T(x) = Ax for all x ∈ Rn. A is the standard matrix: the m x n matrix whose jth column is the vector T(ej), where ej is the jth column of the n x n identity matrix. A = [T(e1) ... T(en)]


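The formula A = [T(e1) ... T(en)] is itself an algorithm: apply T to each standard basis vector and use the results as columns. A Python sketch (function name and the rotation example are mine, for illustration):

```python
# Build the standard matrix of a linear T: R^n -> R^m by applying T to e_1, ..., e_n.
def standard_matrix(T, n):
    cols = [T([1 if i == j else 0 for i in range(n)]) for j in range(n)]  # T(e_j)
    m = len(cols[0])
    return [[cols[j][i] for j in range(n)] for i in range(m)]  # place columns side by side

# Example: rotation of R^2 by 90 degrees counterclockwise, T(x, y) = (-y, x).
rot90 = lambda v: [-v[1], v[0]]
print(standard_matrix(rot90, 2))
```

The printed matrix has T(e1) = (0, 1) and T(e2) = (-1, 0) as its columns, exactly as the card prescribes.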

What is the conceptual difference between a linear transformation and a matrix transformation?

The term linear transformation focuses on a property of a mapping, while matrix transformation describes how such a mapping is implemented



ONTO

A mapping T: Rn → Rm is said to be onto Rm if each b ∈ Rm is the image of at least one x ∈ Rn, or equivalently, when the range of T is all of the codomain Rm. "Does T map Rn onto Rm?" is an existence question!



ONE-TO-ONE

A mapping T: Rn → Rm is said to be one-to-one if each b ∈ Rm is the image of at most one x ∈ Rn; equivalently, T is one-to-one if for each b ∈ Rm, the equation T(x) = b has either a unique solution or none at all. "Is T one-to-one?" is a uniqueness question!



Let T: Rn → Rm be a linear transformation. Then T is one-to-one iff…

the equation T(x) = 0 has only the trivial solution.



Let T: Rn→Rm be a linear transformation and let A be the standard matrix for T. Then:

1. T maps Rn onto Rm iff the columns of A span Rm.
2. T is one-to-one iff the columns of A are linearly independent.

