# Properties of Matrix Arithmetic

I've given examples which illustrate how you can do arithmetic with matrices. Now I'll give precise definitions of the various matrix operations. This will allow me to prove some useful properties of these operations.

If $A$ is a matrix, the element in the $i$-th row and $j$-th column will be denoted $A_{ij}$. (Sometimes I'll switch to lower-case letters and use $a_{ij}$ instead of $A_{ij}$.) Thus,

$A_{11}$ is the $(1, 1)$ entry of $A$.

$A_{23}$ is the $(2, 3)$ entry of $A$.

$A_{ij}$ is the $(i, j)$ entry of $A$.

Remark. To avoid confusion, use a comma between the indices where appropriate. "$A_{24}$" clearly means the entry in row 2, column 4. However, for the entry in row 13, column 6, write "$A_{13,6}$", not "$A_{136}$".
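If you want to experiment with these conventions on a computer, here's a small sketch using NumPy (my choice of library, not something the notes assume). Watch the off-by-one: the notes use 1-based subscripts, while NumPy arrays are 0-indexed.

```python
import numpy as np

# The notes index entries starting from 1 (A_{24} is row 2, column 4),
# while NumPy arrays are 0-indexed, so that entry is A[1, 3] in code.
A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8]])

entry_2_4 = A[1, 3]  # row 2, column 4 in the notation of the notes
```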

Here are the formal definitions of the matrix operations. When I write something like "for all i and j", you should take this to mean "for all $i$ such that $1 \le i \le m$, where $m$ is the number of rows, and for all $j$ such that $1 \le j \le n$, where $n$ is the number of columns".

Definition. (Equality) Let $A$ and $B$ be matrices. Then $A = B$ if and only if $A$ and $B$ have the same dimensions and $A_{ij} = B_{ij}$ for all $i$ and $j$.

This definition says that two matrices are equal if they have the same dimensions and corresponding entries are equal.

Definition. (Sums and Differences) Let $A$ and $B$ be matrices. If $A$ and $B$ have the same dimensions, then the sum $A + B$ and the difference $A - B$ are defined, and their entries are given by

$$(A + B)_{ij} = A_{ij} + B_{ij} \quad\text{and}\quad (A - B)_{ij} = A_{ij} - B_{ij}.$$

This definition says that if two matrices have the same dimensions, you can add or subtract them by adding or subtracting corresponding entries.
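The entrywise definition is exactly what NumPy's `+` and `-` operators do on arrays of the same shape (a quick sketch, not part of the notes):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entrywise, exactly as the definition says:
S = A + B  # S[i, j] == A[i, j] + B[i, j]
D = A - B  # D[i, j] == A[i, j] - B[i, j]
```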

Definition. The zero matrix $0$ is the matrix whose $(i, j)$ entry is given by

$$0_{ij} = 0.$$

Proposition. Let $A$, $B$, and $C$ be matrices with the same dimensions, and let $0$ denote the zero matrix of those dimensions. Then:

(a) (Associativity of Addition) $(A + B) + C = A + (B + C)$.

(b) (Commutativity of Addition) $A + B = B + A$.

(c) (Zero is the Identity for Addition) $A + 0 = A = 0 + A$.

Proof. Each of the properties is a matrix equation. The definition of matrix equality says that I can prove that two matrices are equal by proving that their corresponding entries are equal. I'll follow this strategy in each of the proofs that follow.

(a) To prove that $(A + B) + C = A + (B + C)$, I have to show that their corresponding entries are equal:

$$[(A + B) + C]_{ij} = [A + (B + C)]_{ij}.$$

(Do you understand what this says? $[(A + B) + C]_{ij}$ is the $(i, j)$ entry of $(A + B) + C$, while $[A + (B + C)]_{ij}$ is the $(i, j)$ entry of $A + (B + C)$.)

Since this is the first proof of this kind that I've done, I'll show the justification for each step.

$$\begin{aligned}
[(A + B) + C]_{ij} &= (A + B)_{ij} + C_{ij} && \text{(Definition of matrix addition)} \\
&= (A_{ij} + B_{ij}) + C_{ij} && \text{(Definition of matrix addition)} \\
&= A_{ij} + (B_{ij} + C_{ij}) && \text{(Associativity)} \\
&= A_{ij} + (B + C)_{ij} && \text{(Definition of matrix addition)} \\
&= [A + (B + C)]_{ij} && \text{(Definition of matrix addition)}
\end{aligned}$$

"Associativity" refers to associativity of addition for numbers.

Therefore, $(A + B) + C = A + (B + C)$, because their corresponding elements are equal.

(b) To prove that $A + B = B + A$, I have to show that their corresponding entries are equal:

$$(A + B)_{ij} = (B + A)_{ij}.$$

By definition of matrix addition,

$$(A + B)_{ij} = A_{ij} + B_{ij} = B_{ij} + A_{ij} = (B + A)_{ij}.$$

"Commutativity" refers to commutativity of addition of numbers.

Therefore, $A + B = B + A$.

(c) To prove that $A + 0 = A$, I have to show that their corresponding entries are equal:

$$(A + 0)_{ij} = A_{ij}.$$

By definition of matrix addition and the zero matrix,

$$(A + 0)_{ij} = A_{ij} + 0_{ij} = A_{ij} + 0 = A_{ij}.$$

"Arithmetic" refers to the fact that if $x$ is a number, then $x + 0 = x$.

Therefore, $A + 0 = A$.

In part (b), I showed that addition is commutative. Therefore,

$$0 + A = A + 0 = A. $$

Remark. You can see that the idea in many of these proofs for matrices is to reduce the proof to a known property of numbers (such as associativity or commutativity) by looking at the entries of the matrices. Since most of these proofs are fairly simple, I won't write out all of them, and I won't do them in step-by-step detail like the ones above. You should try working through some of them yourself to ensure that you get the idea.
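One way to build intuition for these identities is to test them numerically on random matrices. Here's a quick NumPy sketch (a spot check on one random example, not a proof):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.integers(-9, 10, size=(2, 3))
B = rng.integers(-9, 10, size=(2, 3))
C = rng.integers(-9, 10, size=(2, 3))
Z = np.zeros((2, 3), dtype=int)  # the 2 x 3 zero matrix

assoc = ((A + B) + C == A + (B + C)).all()  # part (a)
comm = (A + B == B + A).all()               # part (b)
ident = (A + Z == A).all()                  # part (c)
```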

Definition. (Multiplication by Numbers) If $A$ is a matrix and $k$ is a number, then $kA$ is the matrix having the same dimensions as $A$, and whose entries are given by

$$(kA)_{ij} = k \cdot A_{ij}.$$

(It's considered ugly to write a number on the right side of a matrix if you want to multiply. For the record, I'll define $Ak$ to be the same as $kA$.)

This definition says that to multiply a matrix by a number, multiply each entry by the number.
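In NumPy, multiplying an array by a scalar does exactly this entrywise scaling (again, a sketch for experimentation):

```python
import numpy as np

A = np.array([[1, -2],
              [0, 5]])
k = 3

kA = k * A  # each entry of A is multiplied by k: (kA)[i, j] == k * A[i, j]
```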

Definition. If $A$ is a matrix, then $-A$ is the matrix having the same dimensions as $A$, and whose entries are given by

$$(-A)_{ij} = -A_{ij}.$$

Proposition. Let $A$ and $B$ be matrices with the same dimensions, and let $k$ be a number. Then:

(a) $k(A + B) = kA + kB$ and $k(A - B) = kA - kB$.

(b) $0 \cdot A = 0$.

(c) $1 \cdot A = A$.

(d) $(-1) \cdot A = -A$.

(e) $A + (-A) = 0$.

Note that in (b), the 0 on the left is the number 0, while the 0 on the right is the zero matrix.

Proof. I'll prove (a) by way of example and leave the proofs of the other parts to you.

First, I want to show that $k(A + B) = kA + kB$. I have to show that corresponding entries are equal, i.e.

$$[k(A + B)]_{ij} = (kA + kB)_{ij}.$$

I apply the definitions of matrix addition and multiplication of a matrix by a number:

$$[k(A + B)]_{ij} = k \cdot (A + B)_{ij} = k(A_{ij} + B_{ij}) = k \cdot A_{ij} + k \cdot B_{ij} = (kA)_{ij} + (kB)_{ij} = (kA + kB)_{ij}.$$

Therefore, $[k(A + B)]_{ij} = (kA + kB)_{ij}$, so $k(A + B) = kA + kB$.

Next, I want to show that $k(A - B) = kA - kB$. I just repeat the last proof with "$-$" in place of "$+$".

I have to show that corresponding entries are equal, i.e.

$$[k(A - B)]_{ij} = (kA - kB)_{ij}.$$

I apply the definitions of matrix subtraction and multiplication of a matrix by a number:

$$[k(A - B)]_{ij} = k \cdot (A - B)_{ij} = k(A_{ij} - B_{ij}) = k \cdot A_{ij} - k \cdot B_{ij} = (kA)_{ij} - (kB)_{ij} = (kA - kB)_{ij}.$$

Therefore, $[k(A - B)]_{ij} = (kA - kB)_{ij}$, so $k(A - B) = kA - kB$.

Suppose $A$ and $B$ are matrices with compatible dimensions for multiplication, say $A$ is $m \times n$ and $B$ is $n \times p$. Where does the $(i, j)$ entry of $AB$ come from? It comes from multiplying the $i$-th row of $A$ with the $j$-th column of $B$:

$$\begin{bmatrix} A_{i1} & A_{i2} & \cdots & A_{in} \end{bmatrix}
\begin{bmatrix} B_{1j} \\ B_{2j} \\ \vdots \\ B_{nj} \end{bmatrix}$$

Corresponding elements are multiplied, and then the products are summed. In equation form, this means that the $(i, j)$ entry of the product $AB$ is

$$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}.$$

Definition. (Multiplication) Let $A$ be an $m \times n$ matrix and let $B$ be an $n \times p$ matrix. The product $AB$ is the $m \times p$ matrix whose $(i, j)$ entry is given by

$$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}.$$
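The sum-over-$k$ formula translates directly into three nested loops. Here's a sketch that writes the formula out explicitly and checks it against NumPy's built-in matrix product (the function name is my own):

```python
import numpy as np

def matmul_by_definition(A, B):
    """Compute AB from the formula (AB)_{ij} = sum_k A_{ik} B_{kj}."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, p), dtype=A.dtype)
    for i in range(m):
        for j in range(p):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])     # 3 x 2
same = (matmul_by_definition(A, B) == A @ B).all()  # agrees with NumPy's @
```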

It's often useful to have a symbol which you can use to compare two quantities $i$ and $j$ --- specifically, a symbol which equals 1 when $i = j$ and equals 0 when $i \ne j$.

Definition. The Kronecker delta is defined by

$$\delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \ne j \end{cases}$$

Example. $\delta_{23} = 0$, since $2 \ne 3$, while $\delta_{44} = 1$, since $4 = 4$.

Lemma. $\displaystyle \sum_{j=1}^{n} \delta_{ij} x_j = x_i$.

Proof. To see what's happening, write out the sum:

$$\sum_{j=1}^{n} \delta_{ij} x_j = \delta_{i1} x_1 + \delta_{i2} x_2 + \cdots + \delta_{in} x_n.$$

By definition, each $\delta$ with unequal subscripts is 0. The only $\delta$ that is not 0 is the one with equal subscripts. Since $i$ is fixed, the $\delta$ that is not 0 is $\delta_{ii}$, which equals 1. Thus,

$$\sum_{j=1}^{n} \delta_{ij} x_j = \delta_{ii} x_i = x_i.$$
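The sifting behavior of the delta is easy to watch in code. Here's a sketch (0-indexed, unlike the notes): only the term where the indices match survives the sum.

```python
def delta(i, j):
    """Kronecker delta: 1 if i == j, 0 otherwise."""
    return 1 if i == j else 0

# The lemma: in sum_j delta_{ij} x_j, only the j = i term survives.
x = [10, 20, 30, 40]
i = 2
sifted = sum(delta(i, j) * x[j] for j in range(len(x)))  # equals x[i]
```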

Definition. The $n \times n$ identity matrix $I_n$ (or just $I$, if there's no risk of confusion) is the matrix whose $(i, j)$ entry is given by

$$(I_n)_{ij} = \delta_{ij}.$$

Proposition.

(a) (Associativity of Matrix Multiplication) If $A$, $B$, and $C$ are matrices which are compatible for multiplication, then

$$(AB)C = A(BC).$$

(b) (Distributivity of Multiplication over Addition) If $A$, $B$, $C$, $D$, $E$, and $F$ are matrices compatible for addition and multiplication, then

$$A(B + C) = AB + AC \quad\text{and}\quad (D + E)F = DF + EF.$$

(c) If $j$ and $k$ are numbers and $A$ and $B$ are matrices which are compatible for multiplication, then

$$(jA)(kB) = (jk)(AB).$$

(d) (Identity for Multiplication) If $A$ is an $m \times n$ matrix, then

$$I_m A = A = A I_n.$$

The "compatible for addition" and "compatible for multiplication" assumptions mean that the matrices should have dimensions which make the operations in the equations legal --- but otherwise, there are no restrictions on what the dimensions can be.

Proof. I'll prove (a) and part of (d) by way of example, and leave the proofs of the other parts to you.

Before starting, I should say that this proof is rather technical, but try to follow along as best you can. I'll use i, j, k, and l as subscripts.

Suppose that $A$ is an $m \times n$ matrix, $B$ is an $n \times p$ matrix, and $C$ is a $p \times q$ matrix. I want to prove that $(AB)C = A(BC)$. I have to show that corresponding entries are equal, i.e.

$$[(AB)C]_{ij} = [A(BC)]_{ij}.$$

By definition of matrix multiplication,

$$[(AB)C]_{ij} = \sum_{l=1}^{p} (AB)_{il} C_{lj} = \sum_{l=1}^{p} \left( \sum_{k=1}^{n} A_{ik} B_{kl} \right) C_{lj}$$

and

$$[A(BC)]_{ij} = \sum_{k=1}^{n} A_{ik} (BC)_{kj} = \sum_{k=1}^{n} A_{ik} \left( \sum_{l=1}^{p} B_{kl} C_{lj} \right).$$

If you stare at those two terrible double sums for a while, you can see that they involve the same $A$, $B$, and $C$ terms, and they involve the same summations --- but in different orders. I'm allowed to convert one into the other by interchanging the order of summation, and using the distributive law:

$$\sum_{l=1}^{p} \left( \sum_{k=1}^{n} A_{ik} B_{kl} \right) C_{lj} = \sum_{l=1}^{p} \sum_{k=1}^{n} A_{ik} B_{kl} C_{lj} = \sum_{k=1}^{n} \sum_{l=1}^{p} A_{ik} B_{kl} C_{lj} = \sum_{k=1}^{n} A_{ik} \left( \sum_{l=1}^{p} B_{kl} C_{lj} \right).$$

Therefore, $[(AB)C]_{ij} = [A(BC)]_{ij}$, and so $(AB)C = A(BC)$. Wow!

Next, I'll prove the second part of (d), namely that $A I_n = A$. As usual, I must show that corresponding entries are equal:

$$(A I_n)_{ij} = A_{ij}.$$

By definition of matrix multiplication and the identity matrix,

$$(A I_n)_{ij} = \sum_{k=1}^{n} A_{ik} (I_n)_{kj} = \sum_{k=1}^{n} A_{ik} \delta_{kj}.$$

Using the lemma I proved on the Kronecker delta, I get

$$\sum_{k=1}^{n} A_{ik} \delta_{kj} = A_{ij}.$$

Thus, $(A I_n)_{ij} = A_{ij}$, and so $A I_n = A$.
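Note that for a non-square matrix the left and right identities have different sizes. Here's a quick NumPy check of both halves of part (d) on a sample $2 \times 3$ matrix:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])     # 2 x 3
I2 = np.eye(2, dtype=int)    # I_2, the identity that multiplies on the left
I3 = np.eye(3, dtype=int)    # I_3, the identity that multiplies on the right

left = (I2 @ A == A).all()   # I_2 A = A
right = (A @ I3 == A).all()  # A I_3 = A
```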

Definition. Let $A$ be an $m \times n$ matrix. The transpose $A^T$ of $A$ is the $n \times m$ matrix whose $(i, j)$ entry is given by

$$(A^T)_{ij} = A_{ji}.$$

Proposition. Let $A$ and $B$ be matrices of the same dimension, and let $k$ be a number. Then:

(a) $(A^T)^T = A$.

(b) $(A + B)^T = A^T + B^T$.

(c) $(kA)^T = k A^T$.

Proof. I'll prove (b) by way of example and leave the proofs of the other parts for you.

I want to show that $(A + B)^T = A^T + B^T$. I have to show the corresponding entries are equal:

$$[(A + B)^T]_{ij} = (A^T + B^T)_{ij}.$$

Now

$$[(A + B)^T]_{ij} = (A + B)_{ji} = A_{ji} + B_{ji} = (A^T)_{ij} + (B^T)_{ij} = (A^T + B^T)_{ij}.$$

Thus, $[(A + B)^T]_{ij} = (A^T + B^T)_{ij}$, so $(A + B)^T = A^T + B^T$.

Proposition. Suppose $A$ and $B$ are matrices which are compatible for multiplication. Then

$$(AB)^T = B^T A^T.$$

Proof. I'll derive this using the matrix multiplication formula.

Suppose $A$ is $m \times n$ and $B$ is $n \times p$. Then

$$[(AB)^T]_{ij} = (AB)_{ji} = \sum_{k=1}^{n} A_{jk} B_{ki} = \sum_{k=1}^{n} (B^T)_{ik} (A^T)_{kj}.$$

The last sum is the $(i, j)$ entry of $B^T A^T$, while $[(AB)^T]_{ij}$ is the $(i, j)$ entry of $(AB)^T$. Therefore, $(AB)^T = B^T A^T$, since their corresponding entries are equal.
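The reversal of order is the part people forget, and it's easy to check numerically (a sketch on sample matrices):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[1, 0],
              [2, 1],
              [0, 3]])       # 3 x 2

# The order reverses: (AB)^T equals B^T A^T.
# (A^T B^T would be 3 x 3 here, not even the right size.)
reversed_ok = ((A @ B).T == B.T @ A.T).all()
```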

Definition.

(a) A matrix $X$ is symmetric if $X^T = X$.

(b) A matrix $X$ is skew symmetric if $X^T = -X$.

Remarks. Both definitions imply that X is a square matrix.

In terms of elements, $X$ is symmetric if

$$X_{ij} = X_{ji} \quad\text{for all } i \text{ and } j.$$

$X$ is skew symmetric if

$$X_{ij} = -X_{ji} \quad\text{for all } i \text{ and } j.$$

Example. A symmetric matrix is symmetric across its main diagonal (the diagonal running from northwest to southeast). For example,

$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 5 & 6 \\ 3 & 6 & 9 \end{bmatrix}$$

is symmetric.

Here's a skew symmetric matrix:

$$\begin{bmatrix} 0 & 2 & -3 \\ -2 & 0 & 4 \\ 3 & -4 & 0 \end{bmatrix}$$

Entries which are symmetrically located across the main diagonal are negatives of one another. The entries on the main diagonal must be 0, since they must be equal to their negatives.
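Both conditions amount to a single comparison between a matrix and its transpose, which makes them easy to test in code (a sketch on one sample symmetric and one sample skew symmetric matrix):

```python
import numpy as np

S = np.array([[1, 2, 3],
              [2, 5, 6],
              [3, 6, 9]])     # symmetric: S equals its transpose

K = np.array([[0, 2, -3],
              [-2, 0, 4],
              [3, -4, 0]])    # skew symmetric: K^T equals -K

is_symmetric = (S.T == S).all()
is_skew = (K.T == -K).all()
```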

The next result is pretty easy, but it illustrates how you can use the definitions of symmetry and skew symmetry in writing proofs. Notice that I'm not writing out the entries of the matrices!

Proposition.

(a) The sum of symmetric matrices is symmetric.

(b) The sum of skew symmetric matrices is skew symmetric.

Proof. (a) Let $A$ and $B$ be symmetric. I must show that $A + B$ is symmetric. Now

$$(A + B)^T = A^T + B^T = A + B.$$

The first equality follows from a property I proved for transposes. The second equality follows from the fact that $A$ is symmetric (so $A^T = A$) and $B$ is symmetric (so $B^T = B$).

Since $(A + B)^T = A + B$, it follows that $A + B$ is symmetric.

(b) Let $A$ and $B$ be skew symmetric, so $A^T = -A$ and $B^T = -B$. I must show that $A + B$ is skew symmetric. Now

$$(A + B)^T = A^T + B^T = -A + (-B) = -(A + B).$$

Therefore, $A + B$ is skew symmetric.
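Part (b) can also be spot-checked numerically, entirely without entries of the proof (a sketch on two sample skew symmetric matrices):

```python
import numpy as np

# Two skew symmetric matrices (each equals minus its transpose):
A = np.array([[0, 1],
              [-1, 0]])
B = np.array([[0, -5],
              [5, 0]])

C = A + B
sum_is_skew = (C.T == -C).all()  # the sum is skew symmetric too
```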