Gram-Schmidt Calculator
Welcome to the Gram-Schmidt calculator, where you'll have the opportunity to learn all about Gram-Schmidt orthogonalization. This simple algorithm is a way to read out an orthonormal basis of the space spanned by a bunch of vectors. If you're not too sure what orthonormal means, don't worry! It's just an orthogonal basis whose elements are only one unit long. And what does orthogonal mean? Well, we'll cover that one soon enough!
So, just sit back comfortably at your desk, and let's venture into the world of orthogonal vectors!
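If you already know a little Python and would like a sneak peek at where we're headed, below is a minimal sketch of the whole procedure, assuming NumPy and the dot product we'll introduce later in this article. Treat it as an illustration of the idea, not as the calculator's actual implementation.

```python
import numpy as np

# A minimal Gram-Schmidt sketch: turn a list of vectors into an
# orthonormal basis for the space they span.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        # Strip away v's components along the directions found so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        # If anything is left, v pointed in a genuinely new direction.
        if np.linalg.norm(w) > 1e-10:
            basis.append(w / np.linalg.norm(w))
    return basis

# Two vectors in the plane give two perpendicular unit vectors:
print(gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])]))
```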
What is a vector?
One of the first topics in physics classes at school is velocity. Once you learn the magical formula of v = s / t (velocity is distance over time), you open up the exercise book and start drawing cars or bikes with an arrow showing their direction parallel to the road. The teacher calls this arrow the velocity vector and interprets it more or less as "the car goes that way."
You can find similar drawings throughout all of physics, and the arrows always show the direction in which a force acts on an object and how large that force is. The scenario can be anything from buoyancy in a swimming pool to the free fall of a bowling ball, but one thing stays the same: whatever the arrow represents, we call it a vector.
In full (mathematical) generality, we define a vector to be an element of a vector space. In turn, we say that a vector space is a set of elements with two operations that satisfy some natural properties. Those elements can be quite funky, like sequences, functions, or permutations. Fortunately, for our purposes, regular numbers are funky enough.
Cartesian vector spaces
A Cartesian space is an example of a vector space. This means that the real numbers, as we know them, form a (1-dimensional) vector space. The plane (anything we draw on a piece of paper), i.e., the space that pairs of numbers occupy, is a vector space as well. And lastly, so is the 3-dimensional space of the world we live in, interpreted as the set of all triples of real numbers.
When dealing with vector spaces, it's important to keep in mind the operations that come with the definition: addition and multiplication by a scalar (a real or complex number). Let's look at some examples of how they work in the Cartesian space.
In one dimension (a line), vectors are just regular numbers, so adding, say, the vector 2 to the vector 3 is just regular addition: 2 + 3 = 5.
Similarly, multiplying the vector 2 by a scalar, say, by 3, is just regular multiplication: 3 · 2 = 6.
In two dimensions, vectors are points on a plane, which are described by pairs of numbers, and we define the operations coordinate-wise. For instance, if v = (2, 3) and w = (1, -1), then

v + w = (2 + 1, 3 + (-1)) = (3, 2).
Similarly, if we want to multiply v by, say, 1/2, then

(1/2) · v = ((1/2) · 2, (1/2) · 3) = (1, 3/2).
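To make the coordinate-wise rules concrete, here's a short Python sketch of our own (the function names are just illustrative, not anything standard) implementing addition and scalar multiplication for pairs of numbers:

```python
# Coordinate-wise operations on vectors represented as tuples of numbers.
def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def scale(c, v):
    return tuple(c * a for a in v)

v = (2.0, 3.0)
w = (1.0, -1.0)

print(add(v, w))      # (3.0, 2.0)
print(scale(0.5, v))  # (1.0, 1.5)
```

The same two functions work unchanged in three (or more) dimensions, since zip pairs up however many coordinates the tuples have.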
As a general rule, the operations described above behave the same way as their corresponding operations on matrices. After all, vectors here are just one-row matrices. Additionally, there are quite a few other useful operations defined on Cartesian vector spaces, like the cross product. Fortunately, we don't need that for this article, so we're happy to leave it for some other time, aren't we?
Now, let's distinguish some very special sets of vectors, namely orthogonal vectors and orthogonal bases.
What does orthogonal mean?
Intuitively, to define orthogonal is the same as to define perpendicular. This suggests that the meaning of orthogonal is somehow related to the 90-degree angle between objects. And this intuitive definition does work: in two- and three-dimensional spaces, orthogonal vectors are simply vectors whose directions form a right angle with each other.
But does this mean that whenever we want to check if we have orthogonal vectors, we have to draw out the lines, grab a protractor, and read out the angle? That would be troublesome... And what about 1-dimensional spaces? How would we define orthogonal elements there? Not to mention the spaces of sequences: what does orthogonal mean in such cases? For that, we'll need a new tool.
The dot product (also called the scalar product) of two vectors v = (v₁, v₂, ..., vₙ) and w = (w₁, w₂, ..., wₙ) is the number v · w given by:

v · w = v₁·w₁ + v₂·w₂ + ... + vₙ·wₙ.
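In code, this formula is a one-liner. Here's an illustrative Python helper of our own (and a small spoiler: as we're about to see, a dot product of zero is exactly what signals orthogonality):

```python
# Dot product of two vectors given as equal-length sequences of numbers.
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

print(dot((1, 2), (3, -4)))  # 1*3 + 2*(-4) = -5
print(dot((1, 2), (-2, 1)))  # 0: these two vectors are orthogonal
```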