In this section we'll learn the basic terminology and definitions used in matrix algebra — how to speak "matrix-ese." You'll develop a few other, deeper concepts as you continue to study the subject, and these will have their own terms and definitions, but these are the absolute foundational ideas. Learn them well to lay a good base of knowledge for later. *The basics always matter the most!*

We'll start with **matrices** (plural of **matrix**). Take a look at the diagram below. A matrix is just a rectangular array of numbers. We refer to each number in a matrix as a **matrix element**.

We locate each matrix element by the **row** and **column** (in that order) in which it resides in the matrix. Rows are written horizontally, and columns vertically. For example, the element in the 4th row and 5th column of a matrix $X$ would be written $x_{45}$.

We also describe a matrix by the number of rows and columns (in that order) that it contains. A 7×8 matrix, for example, would contain 7 rows of 8 numbers, or 8 columns of 7 numbers. Here's the anatomy of a 3×3 matrix:
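The row-then-column indexing convention above can be sketched in code. This is a minimal Python illustration (not from the original text), representing a matrix as a list of rows; note that Python counts from 0 while matrix notation counts from 1:

```python
# A 3x3 matrix stored as a list of rows.
# Matrix element a_{ij} lives at A[i - 1][j - 1],
# because Python indexes from 0 while matrix notation counts from 1.
A = [
    [1, 2, 3],   # row 1
    [4, 5, 6],   # row 2
    [7, 8, 9],   # row 3
]

rows, cols = len(A), len(A[0])   # number of rows, number of columns
a_23 = A[2 - 1][3 - 1]           # the element in row 2, column 3

print(rows, cols)  # 3 3
print(a_23)        # 6
```

The same off-by-one translation applies to any "row i, column j" lookup in a 0-indexed language.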

Here are some other examples of matrices. Often the most useful matrices are **square matrices**, in which the two dimensions are the same: $2 \times 2$, $3 \times 3$, and so on.

But we can also have rectangular matrices with dimensions $m \times n$, where $m \ne n$. The matrix below uses some shorthand notation (the dots . . . ) that you'll probably see in some proofs, theorems or procedures about matrices. It's worth taking some time to get used to what it means. You don't want to be writing all 100 elements of a 10×10 matrix out by hand, for example.

You will probably do very little in matrix algebra (or physics) without using **vectors**, so it's essential to understand them. A vector is just a **one-dimensional matrix**, and it can be written as a single **row** or a single **column** of numbers.
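The row-or-column idea can be made concrete with a small Python sketch (my own illustration, not part of the text), treating each form as a matrix of the appropriate shape:

```python
# The same 3-dimensional vector written two ways as a matrix:
row = [[1, 2, 3]]        # a 1 x 3 matrix: a single row of three numbers
col = [[1], [2], [3]]    # a 3 x 1 matrix: three rows of one number each

# Both carry the same three components in the same order.
row_components = row[0]
col_components = [entry[0] for entry in col]
print(row_components == col_components)  # True
```

Only the arrangement differs; the components (and hence the point they locate) are identical.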

A vector is often written to locate a point in some space. I say "some space," because we're moving toward generalizing from 2- and 3-dimensional space to even more dimensions. It's very difficult to conceive of 4, 5 or 100 dimensional spaces in our heads, but that doesn't mean they aren't mathematically valid — and very useful in very practical ways.

Any ordered pair $(x, y)$ of coordinates on the Cartesian plane is actually just a row vector pointing from the origin (0, 0) to the point of interest.

Likewise, the location of a 3-D point can be written as a 3-D vector (x, y, z), and so on. (Sometimes we call 3-D space "xyz-space.") Here is an example of a point plotted on a 3-D coordinate system.

Whether a vector is written as a row or a column is a matter of which one is more convenient, as you will see in what's ahead.

Bear in mind that *dimensions* might take on many forms. For example, we could consider the motion of the three atoms of a water molecule in nine dimensions: three for the translation of the molecule along the x-, y- and z-axes; three for the rotation of the molecule about each of those axes (any complicated rotational motion can be expressed as a little bit of rotation about each of the main axes, as long as they pass through the center of mass); and three coordinates to represent the vibration (stretching and bending) of the bonds of the molecule. We'd represent that with a nine-dimensional row or column vector.

Vectors can be written as **row vectors** or **column vectors**. Neither is in any way special, but we do make the distinction from time to time for our own convenience, as you will see later.

An **n**-dimensional **vector** is a $(1 \times n)$ or $(n \times 1)$ matrix. The same vector can be written as a column $(n \times 1)$ or a row $(1 \times n)$ with the same meaning. A vector is the location of a point in some $n$-dimensional space, but its direction can also be important.

We can now say that any $m \times n$ matrix is just a list of $n$ $m$-dimensional column vectors, or a stack of $m$ $n$-dimensional row vectors, like this:
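That decomposition is easy to demonstrate in code. This hedged Python sketch (my own, using the list-of-rows representation) pulls the column vectors out of a $2 \times 3$ matrix:

```python
# A 2 x 3 matrix stored as a list of rows.
A = [
    [1, 2, 3],
    [4, 5, 6],
]

# Viewed as a stack of m row vectors, the matrix is just its rows:
row_vectors = A

# Viewed as a list of n column vectors, we collect the j-th entry of every row:
col_vectors = [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

print(col_vectors)  # [[1, 4], [2, 5], [3, 6]]
```

Either view contains the same six numbers; which one is useful depends on the operation at hand.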

Here is a list of a few special matrices with which you should be familiar.

A diagonal matrix is one that has non-zero elements only on the main diagonal, which extends from element $a_{11}$ to element $a_{nn}$ (upper-left to lower-right), where $n$ is the dimension of the square matrix. We don't define diagonality for non-square matrices. As you will see later, a diagonal matrix represents a solved system of linear equations.
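Building a diagonal matrix from its diagonal entries is a one-liner worth seeing. A small Python sketch (the function name `diagonal` is my own invention):

```python
def diagonal(entries):
    """Build an n x n diagonal matrix whose main diagonal holds `entries`;
    every off-diagonal element is zero."""
    n = len(entries)
    return [[entries[i] if i == j else 0 for j in range(n)]
            for i in range(n)]

D = diagonal([2, 5, 7])
print(D)  # [[2, 0, 0], [0, 5, 0], [0, 0, 7]]
```
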

An acceptable shorthand when dealing with matrices like this is just to write large zeros to represent all of the off-diagonal zero elements, like this:

An upper triangular matrix has all zeros below the diagonal. It may or may not have a few zeros on or above the diagonal. It can also be written using the "big zero" notation above. Manipulating a matrix into upper-triangular or lower-triangular form (below) is often a goal.
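The defining condition — every element *below* the main diagonal is zero — translates directly into a check. A minimal Python sketch (the helper name is my own):

```python
def is_upper_triangular(A):
    """Return True when every element below the main diagonal
    (row index i > column index j) is zero."""
    n = len(A)
    return all(A[i][j] == 0 for i in range(n) for j in range(i))

U = [[1, 2, 3],
     [0, 4, 5],
     [0, 0, 6]]
print(is_upper_triangular(U))  # True
```

Swapping the condition to check elements *above* the diagonal (j > i) gives the corresponding lower-triangular test.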

A lower-triangular matrix is just the opposite of an upper-triangular matrix.

Block-diagonal matrices are those in which the only non-zero elements lie within square blocks arranged along the main diagonal. In the sample here, **a** could be any number (even zero).

A matrix with many zeros like the one above, regardless of whether it's diagonal, is sometimes referred to as a **sparse matrix**. Sparse matrices are nice to use in calculations that involve very large matrices (dimensions in the thousands or higher) because a computer can skip the zero elements entirely instead of multiplying and adding them.

There are many other kinds of matrices and much more matrix terminology, but much of it wouldn't make any sense here without context, so we'll cover that as we go into matrix algebra in more detail in other sections.

A sparse matrix is one in which most of the elements are zeros. When the mathematical operations we perform using matrices become very complicated and time-consuming in terms of computer time, we often seek to eliminate (or approximate as zero) as many of the elements as we can, usually those far from the diagonal, in order to speed things up.
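"Most of the elements are zeros" can be quantified as a sparsity fraction. A small Python sketch (the function name `sparsity` is my own):

```python
def sparsity(A):
    """Fraction of a matrix's elements that are exactly zero."""
    total = sum(len(row) for row in A)
    zeros = sum(row.count(0) for row in A)
    return zeros / total

S = [[3, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 2]]
print(sparsity(S))  # 0.75
```

Specialized sparse formats store only the nonzero entries, which is where the large savings in memory and arithmetic come from.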

The **transpose** of a matrix is just a rewriting that converts the first row into the first column of the new matrix, the second row to the second column, and so on.

Here is an example using a 3×4 matrix. Notice that even after this pretty drastic rearrangement, the diagonal elements don't change. They are said to be **invariant** under the transpose operation.
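Both the row-to-column rewriting and the invariance of the diagonal are easy to verify in code. A minimal Python sketch (my own `transpose` helper, using the list-of-rows representation) on a 3×4 matrix:

```python
def transpose(A):
    """Row i of A becomes column i of the result (an n x m matrix
    from an m x n input)."""
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1,  2,  3,  4],
     [5,  6,  7,  8],
     [9, 10, 11, 12]]
T = transpose(A)   # a 4 x 3 matrix

# The diagonal elements a_{ii} are unchanged by the transpose.
print([A[i][i] for i in range(3)])  # [1, 6, 11]
print([T[i][i] for i in range(3)])  # [1, 6, 11]
```
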

### Cartesian coordinates

Cartesian coordinates are the normal 2-dimensional (2D) or 3-dimensional (3D) coordinate systems we most frequently use. In two dimensions, we draw x- and y-axes at 90˚ angles to each other, and in 3D we add a third axis, usually the z-axis, perpendicular to the x-y plane.

The location or direction of a point or particle can be described using Cartesian coordinates (x, y) in the 2D plane, or (x, y, z) in 3D.

**xaktly.com** by Dr. Jeff Cruzan is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. © 2016, Jeff Cruzan. All text and images on this website not specifically attributed to another source were created by me and I reserve all rights as to their use. Any opinions expressed on this website are entirely mine, and do not necessarily reflect the views of any of my employers. Please feel free to send any questions or comments to jeff.cruzan@verizon.net.