The dot product of a and b, written a.b, is defined by
a.b = a b cos θ
where a and b are the magnitudes of a and b and θ is the angle between the two vectors that satisfies
0 ≤ θ ≤ π
The dot product is distributive:
a.(b + c) = a.b + a.c
and commutative:
a.b = b.a
Knowing that the angle between any two of the vectors i, j, and k is π/2 radians (90 degrees), and that cos(π/2) = 0, we can derive a handy alternative definition. Let
u = ai + bj + ck
v = xi + yj + zk
then,
u.v = (ai + bj + ck).(xi + yj + zk)
=> u.v = (ai + bj + ck).xi +
         (ai + bj + ck).yj +
         (ai + bj + ck).zk
The angle between any nonzero vector and itself is 0, and cos 0 = 1, so i.i = j.j = k.k = 1, while all the cross terms like i.j and j.k are 0. Hence,
u.v = ax + by + cz
This means that for any vector a,
a.a = a²
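The component formula and the magnitude corollary can be sketched in a few lines of Python (the tuples and function names here are illustrative, not part of the original article):

```python
import math

def dot(u, v):
    """Dot product of two 3D vectors given as (x, y, z) tuples: ax + by + cz."""
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def magnitude(u):
    """|u| = sqrt(u.u), rearranging the corollary a.a = a^2."""
    return math.sqrt(dot(u, u))

u = (1, 2, 3)
v = (4, -5, 6)
print(dot(u, v))             # 1*4 + 2*(-5) + 3*6 = 12
print(magnitude((3, 4, 0)))  # sqrt(9 + 16) = 5.0
```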
We can now, given the coordinates of any two nonzero vectors u and v, find the angle θ between them:
u = ai + bj + ck
v = xi + yj + zk
u.v = u v cos θ
u.v = ax + by + cz
=> u v cos θ = ax + by + cz
=> θ = cos⁻¹[ (ax + by + cz) / (u v) ]
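A minimal sketch of this rearrangement in Python, using the component formula for the numerator and a.a = a² for the magnitudes (helper names are illustrative):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def angle(u, v):
    """Angle between nonzero u and v: acos( u.v / (|u||v|) )."""
    return math.acos(dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v))))

# i and j are perpendicular, so the angle should be pi/2 (about 1.5708).
print(angle((1, 0, 0), (0, 1, 0)))
```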
To get used to this method, check out this applet.
What would happen if one of the vectors were the null vector 0, from (0,0,0) to (0,0,0)? This is the only vector without a direction, so it isn't meaningful to ask for the angle between it and another vector. How does our method fail if we try?
One of the main uses of the dot product is to determine whether two vectors, a and b, are orthogonal (perpendicular).
If a . b = 0, then either,
a is orthogonal to b, or
a = 0, or
b = 0.
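The orthogonality test is a one-liner on top of the component formula (a sketch with illustrative names; for floating-point components you would compare against a small tolerance rather than exact zero):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal(u, v):
    """True when u.v = 0: u and v are perpendicular, or one is the null vector."""
    return dot(u, v) == 0

print(is_orthogonal((1, 2, 0), (-2, 1, 0)))  # True: (1)(-2) + (2)(1) + 0 = 0
print(is_orthogonal((1, 2, 0), (1, 1, 0)))   # False: 1 + 2 + 0 = 3
```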
It will often be useful to find the component of one vector in the direction of another:
We have a given vector a, and we want to see how far it extends in a direction given by the unit vector n. The distance is d, which, from simple trigonometry, we can calculate as
d = a cos θ
=> d = n a cos θ    (since n is a unit vector, its magnitude n = 1)
=> d = a . n
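A short sketch of this projection in Python: normalise the direction to get the unit vector n, then take a.n (the names are illustrative):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def component_along(a, direction):
    """Component of a in the given direction: normalise, then take a.n."""
    mag = math.sqrt(dot(direction, direction))
    n = tuple(x / mag for x in direction)  # unit vector n
    return dot(a, n)

# How far does (3, 4, 0) extend along the x-axis? Its x-component, 3.
print(component_along((3, 4, 0), (2, 0, 0)))  # 3.0
```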
You have two sides of a triangle, a and b, and the angle between them, C; the problem is to find the remaining side c. You kill the problem by recalling the cosine formula:
c² = a² + b² - 2ab cos C
but have you ever seen a proof? The proof by geometry isn't very friendly, but with vectors it takes all of three lines (using the second triangle above):
c.c = (b - a).(b - a)
=> c² = b.b + a.a - 2a.b
=> c² = a² + b² - 2ab cos C
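The identity is easy to check numerically. A sketch with a 3-4-5 right triangle (sides chosen for illustration), comparing c.c against a.a + b.b - 2a.b:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Sides a and b as vectors from a common vertex; c = b - a is the third side.
a = (3, 0, 0)
b = (0, 4, 0)
c = tuple(bi - ai for ai, bi in zip(a, b))

# Both sides of the identity should agree: here a.b = 0, so c^2 = 9 + 16 = 25.
print(dot(c, c))                              # 25
print(dot(a, a) + dot(b, b) - 2 * dot(a, b))  # 25
```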
From the latitudes and longitudes of two places on the Earth, together with the radius of the Earth, we can determine the position vectors of the two places with the origin at the centre of the Earth. If you have two points on the circumference of a circle, then the radius of the circle times the angle (in radians) subtended by the two points at the centre gives the arc distance between them. Using the dot product we can find the angle subtended by our two position vectors, multiply by the radius of the Earth, and hey presto, we have the great circle distance.
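That recipe can be sketched in Python. The conversion from latitude/longitude to a unit position vector and the radius value (a commonly quoted mean of 6371 km) are assumptions of this sketch, not details from the article:

```python
import math

EARTH_RADIUS_KM = 6371  # assumed mean radius of the Earth

def position_vector(lat_deg, lon_deg):
    """Unit position vector from the Earth's centre for a latitude/longitude."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def great_circle_km(lat1, lon1, lat2, lon2):
    u = position_vector(lat1, lon1)
    v = position_vector(lat2, lon2)
    d = sum(a * b for a, b in zip(u, v))  # u.v = cos(angle) for unit vectors
    d = max(-1.0, min(1.0, d))            # guard acos against rounding drift
    return EARTH_RADIUS_KM * math.acos(d)  # radius times subtended angle

# Pole to equator is a quarter of a great circle: pi/2 * 6371, about 10008 km.
print(great_circle_km(90, 0, 0, 0))
```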
Find out the distance between us using this applet (I'm at latitude 53, longitude 0).