Exam 2

Mapping

Theorem

Suppose T: \mathbb{R}^n \to \mathbb{R}^m is the linear transformation T(\mathbf{x}) = A\mathbf{x}, where A is an m \times n matrix. Then:

One-to-one

T is one-to-one if and only if the columns of A are linearly independent, which happens precisely when A has a pivot position in every column.

Onto

T is onto if and only if the span of the columns of A is \mathbb{R}^m, which happens precisely when A has a pivot position in every row.
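For instance (a made-up matrix, not one from class):

A=\begin{bmatrix} 1 & 0\\ 0 & 1\\ 0 & 0 \end{bmatrix}

has a pivot in every column, so T is one-to-one; but there is no pivot in the third row, so the columns span only a plane inside \mathbb{R}^3 and T is not onto.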

Matrices

Markov Chains

Some notes on probability matrices

All 2 \times 2 probability matrices whose columns sum to one have two different eigenvectors. One will always lie on the line y = -x; the other line is what we're actually solving for.

Also, a matrix is a probability matrix just when each column adds to one. The span of the other eigenvector is a line, but a vector on it is only a probability vector when its tip lies on the line x + y = 1. This is what we're solving for in probability matrices.

Methods with stochastic (probability) matrix

If Rows Add to 1

We haven't encountered one of these in class yet, so this most likely shouldn't be used.

This video has a good explanation of this type

Use row vector format:

vP=v\\
vP-v=0\\
v(P-1\cdot I)=0

Subbing in an example:

[x\ \ y]\left(\begin{bmatrix} .6 & .4\\ .15 & .85 \end{bmatrix}- \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}\right)=0

[x\ \ y]\begin{bmatrix} -.4 & .4\\ .15 & -.15 \end{bmatrix}=0

Make equations by distributing:

-.4x+.15y=0\\
.4x-.15y=0

Notice one is redundant, so we'll get rid of it. We'll also notice that x and y have to add to 1, so:

x+y=1

This means the system is:

x+y=1\\
.4x-.15y=0

Put into a matrix and reduce:

\begin{bmatrix} 1 & 1 & | & 1\\ .4 & -.15 & | & 0 \end{bmatrix}
\downarrow
\begin{bmatrix} 1 & 0 & | & \frac{3}{11}\\ 0 & 1 & | & \frac{8}{11} \end{bmatrix}

v=[\frac{3}{11}\ \ \frac{8}{11}]
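A quick way to sanity-check that answer, sketched with Python's fractions module so the arithmetic is exact (the matrix is the one from the example above):

```python
from fractions import Fraction as F

# Row-stochastic matrix from the example above (each row sums to 1).
P = [[F(6, 10), F(4, 10)],
     [F(15, 100), F(85, 100)]]

# From -.4x + .15y = 0 we get y = (.4/.15)x; substitute into x + y = 1.
ratio = F(4, 10) / F(15, 100)      # y/x
x = 1 / (1 + ratio)
v = [x, ratio * x]
print(v)                           # [Fraction(3, 11), Fraction(8, 11)]

# Check the fixed point: vP = v (row vector times matrix).
vP = [v[0] * P[0][0] + v[1] * P[1][0],
      v[0] * P[0][1] + v[1] * P[1][1]]
assert vP == v
```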

If Columns Add to 1

This video has a good explanation of this type

Find the steady-state distribution vector for the regular Markov chain whose transition matrix is:

Notice that with this example the columns add to one, as opposed to the method above, which used rows.

The method given in class is better on an exam for finding the steady state; this one just skips many steps.

Given in class

Notice that the final vector can be solved for by solving for \lambda and then creating an eigenvector. Here's an example with the last \lambda:

given above

Now

This is probably the easiest way to solve for the spanning vector.
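The same idea can be checked numerically: when columns add to one, solve (P - I)q = 0 together with x + y = 1. A sketch with a made-up column-stochastic matrix (the transpose of the earlier row example, since the matrix from class wasn't preserved in these notes):

```python
from fractions import Fraction as F

# Made-up column-stochastic matrix (each column sums to 1).
P = [[F(6, 10), F(15, 100)],
     [F(4, 10), F(85, 100)]]

# Row 1 of (P - I)q = 0 gives (P[0][0] - 1)x + P[0][1]y = 0,
# so y = (1 - P[0][0]) / P[0][1] * x; substitute into x + y = 1.
ratio = (1 - P[0][0]) / P[0][1]    # y/x
x = 1 / (1 + ratio)
q = [x, ratio * x]
print(q)                           # [Fraction(3, 11), Fraction(8, 11)]

# Check: Pq = q (matrix times column vector).
Pq = [P[0][0] * q[0] + P[0][1] * q[1],
      P[1][0] * q[0] + P[1][1] * q[1]]
assert Pq == q
```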

Reducing a Complex Number Matrix

This normally involves multiplying by the complex conjugate to simplify:

Now simplification is easy, just .
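A minimal sketch of the conjugate trick on a made-up row with complex pivot 1 + i (not an example from class):

```python
# Made-up row with a complex pivot; multiplying through by the pivot's
# conjugate makes the pivot real, after which normal row reduction applies.
row = [1 + 1j, 2 + 0j]
conj = row[0].conjugate()              # 1 - 1j
row = [conj * entry for entry in row]  # pivot becomes (1+1j)(1-1j) = 2
print(row)                             # [(2+0j), (2-2j)]

pivot = row[0]
row = [entry / pivot for entry in row]  # scale so the pivot is 1
print(row)                              # [(1+0j), (1-1j)]
```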

Interpreting Markov Chains

States can be represented the other way around from how we do it here, but in general for this class we follow this format:

This matrix is stating four things:

State A → State A is 0.8

State A → State B is 0.2

State B → State A is 0.4

State B → State B is 0.6

stateDiagram
    A --> A: 0.8
    A --> B: 0.2
    B --> A: 0.4
    B --> B: 0.6
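The chain in the diagram above can be sketched numerically: repeatedly applying the matrix to any starting distribution approaches the steady state. This assumes the class's columns-sum-to-one convention (entry [to][from]):

```python
# Transition matrix built from the diagram above, assuming the class
# convention that columns sum to one (entry [to][from]).
P = [[0.8, 0.4],   # to A: from A, from B
     [0.2, 0.6]]   # to B: from A, from B

q = [0.5, 0.5]       # any starting distribution
for _ in range(60):  # q_{k+1} = P q_k approaches the steady state
    q = [P[0][0] * q[0] + P[0][1] * q[1],
         P[1][0] * q[0] + P[1][1] * q[1]]
print(q)             # close to [2/3, 1/3]
```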

Inverses

Properties

  • If A is invertible and Ax = b, then x = A^{-1}b is a solution
  • (AB)^{-1} = B^{-1}A^{-1}, but in general (AB)^{-1} \neq A^{-1}B^{-1}
  • If a matrix is invertible, so is its transpose: (A^T)^{-1} = (A^{-1})^T
  • A^{-1} is invertible if A is invertible, and (A^{-1})^{-1} = A
  • Invertible if the RREF is the identity matrix
  • You can solve Ax = b by doing x = A^{-1}b, because A^{-1}Ax = A^{-1}b and A^{-1}A = I, so x = A^{-1}b
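A sketch of that last bullet on a made-up 2 \times 2 system, using the 2 \times 2 inverse formula A^{-1} = \frac{1}{ad-bc}\begin{bmatrix} d & -b\\ -c & a \end{bmatrix}:

```python
from fractions import Fraction as F

# Made-up invertible matrix A = [[2, 1], [1, 1]] and right-hand side b.
a, b_, c, d = F(2), F(1), F(1), F(1)
det = a * d - b_ * c                  # det = 1, so A is invertible
Ainv = [[d / det, -b_ / det],
        [-c / det, a / det]]

rhs = [F(3), F(2)]
x = [Ainv[0][0] * rhs[0] + Ainv[0][1] * rhs[1],   # x = A^{-1} b
     Ainv[1][0] * rhs[0] + Ainv[1][1] * rhs[1]]
print(x)                              # [Fraction(1, 1), Fraction(1, 1)]

# Check: Ax should give back b.
assert [a * x[0] + b_ * x[1], c * x[0] + d * x[1]] == rhs
```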

Subspaces

Properties

  • Has to be a "space": a span is always the smallest subspace containing a set of vectors.
  • Must include the zero vector
  • In general, a line or plane in \mathbb{R}^n is a subspace if and only if it passes through the origin
  • A line through the origin in \mathbb{R}^2 is a subspace of \mathbb{R}^2; a plane through the origin in \mathbb{R}^3 is a subspace of \mathbb{R}^3

Rank-Nullity

  • For an m \times n matrix A, \text{rank}(A) + \text{nullity}(A) = n, i.e. the number of columns is the same as the nullity + rank.
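A quick numeric illustration on a made-up 2 \times 3 matrix whose second row is a multiple of the first (a small Gaussian-elimination sketch, not the class's hand method):

```python
# Made-up 2x3 matrix whose second row is twice the first, so rank = 1.
A = [[1.0, 2.0, 3.0],
     [2.0, 4.0, 6.0]]

rows, cols = len(A), len(A[0])
rank = 0
for col in range(cols):
    # Find a usable pivot at or below the current pivot row.
    pivot = next((r for r in range(rank, rows) if abs(A[r][col]) > 1e-12), None)
    if pivot is None:
        continue
    A[rank], A[pivot] = A[pivot], A[rank]
    pivot_val = A[rank][col]
    A[rank] = [v / pivot_val for v in A[rank]]
    for r in range(rows):
        if r != rank:
            factor = A[r][col]
            A[r] = [v - factor * p for v, p in zip(A[r], A[rank])]
    rank += 1

nullity = cols - rank
print(rank, nullity)   # 1 2, and rank + nullity = 3 = number of columns
```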