## Tensor Calculus – Part 1

May 12, 2008

Firstly, I am sorry for the long gap between posts. I could not find the time to write up anything worth a really good post. I have received some verbal requests for a new post, and I appreciate all such requests! So without further ado, let me get down to business.

Today’s post will be about Tensor Calculus. In fact I intend to write much more about it in subsequent posts, but we will see how that goes. Before I give an introduction, let me state my motivation for studying tensors. While some familiarity with Physics is needed to understand the next paragraph, the tensor calculus itself is a purely mathematical topic.

The motivation comes from my long-standing wish to be able to understand the General Theory of Relativity. The GTR is a geometrical theory of gravity. It explains gravity as a result of the deformation of spacetime. Mass and energy cause spacetime to be distorted and curved, and this distortion in turn acts back on mass to determine its motion. The perceived effects of gravitation on a particle are a direct result of its motion in the distorted spacetime, without any mysterious force acting on the particle. That is quite as much as I know about the GTR for now. But the basic mathematical tool for studying the GTR is Riemannian Geometry. Riemannian Geometry is the appropriate generalization, to a general $n$-dimensional space, of the study of things like curves, surfaces, and curvature, so common from everyday experience in two and three dimensional spaces. Many basic entities in Riemannian Geometry are best described by tensors. Tensors are also used in many other places, and in order to understand current physical theories and their mathematical formulations, one needs to know about tensors and basic differential geometry.

So, now let's get down to an introduction. Consider an $n$-dimensional space $V_n$. A coordinate system is an assignment of an $n$-tuple of numbers $x^i, i = 1 \cdots n$ to each point of $V_n$. In a different coordinate system the same point will have a different set of coordinates $\bar{x}^i, i = 1 \cdots n$. We also assume that the $\bar{x}^i$ are given by $n$ functions $\varphi_i(x^j)$ of the $x^i$. The functions are assumed to be such that the Jacobian matrix $\frac{\partial{\bar{x}^i}}{\partial{x^j}}$ is invertible.

Example. In 3 dimensional space, a standard Cartesian coordinate system can be the first coordinate system, while the spherical polar coordinates $(r,\theta,\phi)$ can be the barred coordinate system. Or the barred coordinate system can be one with the 3 axes not necessarily at right angles to each other.
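As a quick illustrative sketch (not from the original discussion), we can compute the Jacobian of the map from spherical coordinates $(r,\theta,\phi)$ to Cartesian coordinates numerically at one point, and check that it is invertible there — its determinant is the familiar $r^2 \sin\theta$:

```python
# Sketch: numerical Jacobian of the spherical -> Cartesian coordinate map,
# checked for invertibility at a sample point. The point chosen is made up.
import numpy as np

def spherical_to_cartesian(q):
    r, theta, phi = q
    return np.array([
        r * np.sin(theta) * np.cos(phi),
        r * np.sin(theta) * np.sin(phi),
        r * np.cos(theta),
    ])

def jacobian(f, q, h=1e-6):
    """Numerical Jacobian J[i, j] = d f_i / d q_j via central differences."""
    n = len(q)
    J = np.zeros((n, n))
    for j in range(n):
        dq = np.zeros(n)
        dq[j] = h
        J[:, j] = (f(q + dq) - f(q - dq)) / (2 * h)
    return J

q = np.array([2.0, 0.7, 1.2])   # a point with r > 0 and 0 < theta < pi
J = jacobian(spherical_to_cartesian, q)
print(np.linalg.det(J))         # nonzero here, so the Jacobian is invertible
```

Note that the determinant vanishes at $r = 0$ and $\theta = 0, \pi$, which is why the spherical coordinate system degenerates on the polar axis.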

Definition. A contravariant tensor of rank 1 is a set of $n$ functions $A^i$ of the coordinates such that they transform according to the following law

$\bar{A}^i = A^j \frac{\partial{\bar{x}^i}}{\partial{x^j}}$

Now in the above expression you might notice that the index $j$ is repeated on the right-hand side of the equation. That implies a summation. Therefore the expression actually means the following sum

$\sum^n_{j=1} A^j \frac{\partial{\bar{x}^i}}{\partial{x^j}}$.

In general, in writing tensorial expressions, the above so-called summation convention is widely used: whenever an index occurs once as a superscript and once as a subscript, or once in the numerator and once in the denominator of a partial derivative, it is to be summed over all the coordinates.
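The summation convention is exactly the contraction that `numpy.einsum` performs, so the transformation law above can be checked numerically. Here is a small sketch with made-up component and Jacobian values:

```python
# Sketch: the contravariant transformation law A_bar^i = A^j dxbar^i/dx^j
# as a single einsum contraction over the repeated index j.
# The component and Jacobian values below are made up for illustration.
import numpy as np

A = np.array([1.0, 2.0, 3.0])       # components A^j in the old coordinates
J = np.array([[2.0, 0.0, 0.0],      # J[i, j] = d xbar^i / d x^j
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

A_bar = np.einsum('ij,j->i', J, A)  # sum over the repeated index j
assert np.allclose(A_bar, J @ A)    # same thing as a matrix-vector product
print(A_bar)                        # -> [2. 9. 4.]
```

The repeated index `j` in the einsum string is summed over, while the free index `i` survives — precisely the convention described above.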

The above definition was for a contravariant tensor of rank 1. In fact a contravariant tensor of rank 1 is just a vector; the tensorial transformation law simply generalizes the transformation of vector components. To see this, consider 3 dimensional space with basis vectors $e_1, e_2, e_3$, and suppose a vector $v = a_1 e_1 + a_2 e_2 + a_3 e_3$. Consider a different basis with vectors $e'_i = \sum_j a_{ij} e_j$. In this basis let $v = a'_1 e'_1 + a'_2 e'_2 + a'_3 e'_3$. Now what are the $a'_i$ as functions of the $a_i$? A quick calculation shows that $a' = B a$, where $a'$ is the column vector of components $a'_i$, $a$ is the column vector of components $a_i$, and $B = (A^T)^{-1}$ is the inverse transpose of the matrix $A = (a_{ij})$. Also, as one can verify, $b_{ij} = \frac{\partial{\bar{x}^i}}{\partial{x^j}}$ for the corresponding linear change of coordinates, so the components transform exactly according to the contravariant law above.
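This basis-change calculation can be verified numerically. In the sketch below the matrix $A$ and the components $a_i$ are made up; writing components as column vectors, the new components come out as $a' = (A^T)^{-1} a$, and we check that both component sets describe the same vector:

```python
# Numerical check of the basis-change calculation, with a made-up
# invertible matrix A and made-up components a. Convention: components
# are column vectors, so the new components are a' = (A^T)^{-1} a.
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])   # row i holds the new basis vector e'_i
a = np.array([4.0, -1.0, 2.0])    # components a_i of v in the old basis

B = np.linalg.inv(A.T)            # inverse transpose of A
a_prime = B @ a                   # components a'_i in the new basis

# Same vector either way: sum_i a'_i e'_i equals sum_i a_i e_i,
# i.e. A^T a' == a when the old basis is the standard one.
assert np.allclose(A.T @ a_prime, a)
```

The point of the check is that the vector $v$ itself is unchanged; only its components transform, and they do so with the inverse transpose of the matrix that transforms the basis vectors — this opposite behaviour is what "contravariant" refers to.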

Example. Consider the differentials $dx^i$. By the chain rule, the differentials in the barred coordinate system are given by

$d\bar{x}^i= dx^j \frac{\partial{\bar{x}^i}}{\partial{x^j}}$

Next, we define a covariant tensor of rank 1.

Definition. A covariant tensor of rank 1 is a set of $n$ functions $A_i$ that transform according to

$\bar{A}_i = A_j \frac{\partial{x^j}}{\partial{\bar{x}^i}}$.
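A standard example of a covariant tensor of rank 1 (not worked out in the post above) is the gradient $\partial f / \partial x^i$ of a scalar function $f$. The sketch below checks the covariant law numerically in the plane, taking the unbarred coordinates to be Cartesian $(x, y)$ and the barred ones to be polar $(r, \theta)$; the scalar field and the sample point are made up:

```python
# Sketch: the gradient of a scalar field transforms covariantly.
# Barred coordinates: polar (r, theta); unbarred: Cartesian (x, y).
# Covariant law: fbar_i = f_j dx^j/dxbar^i, i.e. grad_polar = J^T grad_cart
# with J[j, i] = d x^j / d xbar^i. Field and sample point are made up.
import numpy as np

def f_cart(x, y):
    return x**2 * y + y**3

def grad(f, p, h=1e-6):
    """Numerical gradient of f at point p via central differences."""
    p = np.asarray(p, dtype=float)
    g = np.zeros(len(p))
    for i in range(len(p)):
        dp = np.zeros(len(p))
        dp[i] = h
        g[i] = (f(*(p + dp)) - f(*(p - dp))) / (2 * h)
    return g

r, theta = 2.0, 0.6
x, y = r * np.cos(theta), r * np.sin(theta)

# J[j, i] = d x^j / d xbar^i for (x, y) as functions of (r, theta)
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])

g_cart = grad(f_cart, (x, y))
g_polar = grad(lambda r_, t_: f_cart(r_ * np.cos(t_), r_ * np.sin(t_)),
               (r, theta))

assert np.allclose(g_polar, J.T @ g_cart, atol=1e-4)
```

Notice that the covariant law uses $\partial x^j / \partial \bar{x}^i$, the Jacobian of the inverse coordinate change, whereas the contravariant law earlier used $\partial \bar{x}^i / \partial x^j$ — the two kinds of tensor transform with mutually inverse matrices.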

In the next post we will move on to the definition of more general tensors, called mixed tensors. Such tensors are contravariant or covariant in more than one index, or have a mix of such indices. We will see the transformation law for such tensors, and we will also start with some simple operations on tensors.