In mathematics, a tensor is (in an informal sense) a generalized linear 'quantity' or 'geometrical entity' that can be expressed as a multi-dimensional array relative to a choice of basis for the particular space on which it is defined. The intuition underlying the tensor concept is inherently geometrical: as an object in and of itself, a tensor is independent of any chosen frame of reference. In the modern treatment, however, tensor theory is best regarded as a topic in multilinear algebra. Engineering applications do not usually require the full, general theory, but theoretical physics now does.

Tensors are important in physics and engineering. In diffusion tensor imaging, for instance, a tensor quantity expressing the differential permeability of organs to water in varying directions is used to produce scans of the brain; in this technique tensors are in effect made visible. Perhaps the most important engineering examples are the stress tensor and strain tensor, which are both 2nd-rank tensors and are related in a general linear material by a 4th-rank elasticity tensor.

Specifically, a 2nd-rank tensor quantifying stress in a 3-dimensional solid object has components that can be conveniently represented as a 3 x 3 array. Each of the three Cartesian faces of a cube-shaped infinitesimal volume segment of the solid is subject to some given force, and each force vector has three components (being in three-space). Thus 3 x 3 = 9 components are required to describe the stress at this infinitesimal segment, which may now be treated as a point. Throughout the solid the stress varies from point to point, each point requiring 9 quantities to describe it; hence the need for a 2nd-rank tensor.

While tensors can be represented by multi-dimensional arrays of components, the point of having a tensor theory is to explain the further implications of saying that a quantity is a tensor, beyond specifying that it requires a number of indexed components. In particular, tensors behave in specific ways under coordinate transformations. The abstract theory of tensors is a branch of linear algebra, now called multilinear algebra.

_From_ [_Tensor_|http://en.wikipedia.org/wiki/Tensor]
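As a concrete illustration of the 9-component description above (a hypothetical example, not from the cited article), the stress components at a single point can be collected into a 3 x 3 array. The sketch below uses Python with NumPy; the numeric values are invented:

{code:python}
import numpy as np

# Hypothetical stress state at one point of a solid.
# Row j holds the three force components acting on the j-th
# Cartesian face of the infinitesimal cube: 3 x 3 = 9 components.
stress = np.array([
    [50.0, 10.0,  0.0],   # components on the x-face
    [10.0, 20.0,  5.0],   # components on the y-face
    [ 0.0,  5.0, 30.0],   # components on the z-face
])

print(stress.shape)  # (3, 3) -- the nine components of a 2nd-rank tensor
{code}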
Orthogonal transformations are transformations *y* = *Ax* with *A* an orthogonal matrix. Such a transformation assigns to each vector *x* in R^n^ a vector *y* in R^n^.
_From Kreyszig, "Advanced Engineering Mathematics", p. 382_
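A minimal sketch of such a transformation, assuming Python with NumPy (the rotation angle and vector below are hypothetical; a plane rotation is one standard example of an orthogonal matrix):

{code:python}
import numpy as np

theta = np.pi / 4                                  # hypothetical angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # plane rotation: orthogonal

x = np.array([1.0, 0.0])    # a vector in R^2
y = A @ x                   # the assigned vector y = Ax

# An orthogonal transformation preserves lengths:
print(np.linalg.norm(x), np.linalg.norm(y))        # both 1.0
{code}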
A real square matrix *A* = \[a~jk~\] is called *orthogonal* if transposition gives the inverse of *A*:

*A*^T^ = *A*^-1^
_From Kreyszig, "Advanced Engineering Mathematics", p. 381_
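This defining property can be checked numerically. In the sketch below (again Python with NumPy, with a hypothetical rotation matrix), the transpose of *A* agrees with its inverse:

{code:python}
import numpy as np

theta = np.pi / 6                                  # hypothetical angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A.T, np.linalg.inv(A)))   # True: A^T = A^-1
print(np.allclose(A.T @ A, np.eye(2)))      # equivalently, A^T A = I
{code}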
It is practical to define *transposition* for any matrix. The *transpose* *A*^T^ of an _m_ x _n_ matrix *A* = \[a~jk~\] is the _n_ x _m_ matrix that has the first row of *A* as its first column, the second row of *A* as its second column, and so on.
_From Kreyszig, "Advanced Engineering Mathematics", p. 307_
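For example (a hypothetical 2 x 3 matrix, not taken from Kreyszig), transposition turns an _m_ x _n_ matrix into an _n_ x _m_ one:

{code:python}
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # a 2 x 3 matrix

# Rows of A become columns of A^T:
print(A.T)
# [[1 4]
#  [2 5]
#  [3 6]]  -- a 3 x 2 matrix
{code}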
Say that *S'* = *AS*, where *A* is an orthogonal matrix.
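Taking the relation *S'* = *AS* exactly as stated, a sketch of the computation (Python with NumPy; both *A* and *S* below are hypothetical):

{code:python}
import numpy as np

theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # rotation about z: orthogonal

S = np.array([[50.0, 10.0,  0.0],
              [10.0, 20.0,  5.0],
              [ 0.0,  5.0, 30.0]])   # hypothetical components of S

S_prime = A @ S    # S' = AS, exactly as stated in the text
print(S_prime)
{code}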