
What does "isomorphic" mean in linear algebra?

Writer Matthew Martinez

My professor keeps mentioning the word "isomorphic" in class, but has yet to define it... I've asked him and his response is that something that is isomorphic to something else means that they have the same vector structure. I'm not sure what that means, so I was hoping anyone could explain its meaning to me, using knowledge from elementary linear algebra only. He started discussing it in the current section of our textbook: General Vector Spaces.

I've also heard that this is an abstract algebra term, so I'm not sure if isomorphic means the same thing in both subjects. I know absolutely no abstract algebra, so if your definition either keeps abstract algebra out completely or uses only very basic abstract algebra, that would be appreciated.


3 Answers


Isomorphisms are defined in many different contexts, but they all share a common thread.

Given two objects $G$ and $H$ (which are of the same type; maybe groups, or rings, or vector spaces... etc.), an isomorphism from $G$ to $H$ is a bijection $\phi:G\rightarrow H$ which, in some sense, respects the structure of the objects. In other words, an isomorphism identifies the two objects as actually being the same object, after a renaming of the elements.

In the example that you mention (vector spaces), an isomorphism between $V$ and $W$ is a bijection $\phi:V\rightarrow W$ which respects scalar multiplication, in that $\phi(\alpha\vec{v})=\alpha\phi(\vec{v})$ for all $\vec{v}\in V$ and $\alpha\in K$, and also respects addition in that $\phi(\vec{v}+\vec{u})=\phi(\vec{v})+\phi(\vec{u})$ for all $\vec{v},\vec{u}\in V$. (Here, we've assumed that $V$ and $W$ are both vector spaces over the same base field $K$.)
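As a concrete sketch of these two properties, here is a numerical check using NumPy; the matrix $A$ and the test vectors are arbitrary choices for illustration, with $\phi$ realized as multiplication by an invertible matrix:

```python
import numpy as np

# Hypothetical isomorphism phi: R^2 -> R^2 given by an invertible matrix A
# (A and the test vectors below are arbitrary choices for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # det(A) = 1, so the map is a bijection

def phi(v):
    return A @ v

v = np.array([3.0, -1.0])
u = np.array([0.5, 4.0])
alpha = 2.5

# phi respects scalar multiplication: phi(alpha*v) = alpha*phi(v)
assert np.allclose(phi(alpha * v), alpha * phi(v))

# phi respects addition: phi(v + u) = phi(v) + phi(u)
assert np.allclose(phi(v + u), phi(v) + phi(u))
```

Any invertible matrix would work here; invertibility gives the bijection, and matrix multiplication gives the two structure-respecting properties.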


Two vector spaces $V$ and $W$ are said to be isomorphic if there exists an invertible linear transformation (aka an isomorphism) $T$ from $V$ to $W$.

The idea of a homomorphism is a transformation of an algebraic structure (e.g. a vector space) that preserves its algebraic properties. So a homomorphism of a vector space should preserve the basic algebraic properties of the vector space, in the following sense:

$1$. Scalar multiplication and vector addition in $V$ are carried over to scalar multiplication and vector addition in $W$:

For any vectors $x,y$ in $V$ and scalars $a,b$ from the underlying field, $T(ax+by)=aT(x)+bT(y)$.

$2$. The identity element of $V$ is carried over to the identity element of $W$:

If $0_V$ is the identity vector in $V$, then $T(0_V)$ is the identity vector in $W$.

$3$. Vector inversion in $V$ is carried over to vector inversion in $W$:

$T(-v)=-T(v)$ for all $v$ in $V$.

$1$ is precisely the property that defines linear transformations, and $2$ and $3$ are redundant (they follow from $1$). So linear transformations are the homomorphisms of vector spaces.
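To see the redundancy explicitly, each of $2$ and $3$ follows from property $1$ in one line (taking $a=b=0$ and $a=-1$, $b=0$ respectively):

```latex
T(0_V) = T(0 \cdot 0_V) = 0 \cdot T(0_V) = 0_W,
\qquad
T(-v) = T((-1)\,v) = (-1)\,T(v) = -T(v).
```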

An isomorphism is a homomorphism that can be reversed; that is, an invertible homomorphism. So a vector space isomorphism is an invertible linear transformation. The idea of an invertible transformation is that it transforms spaces of a particular "size" into spaces of the same "size." Since dimension is the analogue of the "size" of a vector space, an isomorphism must preserve the dimension of the vector space.

So this is the idea of the (finite-dimensional) vector space isomorphism: a linear (i.e. structure-preserving) dimension-preserving (i.e. size-preserving, invertible) transformation.
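For a finite-dimensional sketch of this, one can check invertibility and dimension preservation numerically; the matrix below is an arbitrary invertible example on $\mathbb{R}^3$:

```python
import numpy as np

# Hypothetical invertible linear map on R^3 (matrix chosen arbitrarily).
T = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 0.0],
              [3.0, 0.0, 1.0]])

assert abs(np.linalg.det(T)) > 1e-12     # nonzero determinant: invertible
assert np.linalg.matrix_rank(T) == 3     # the image has full dimension 3
```

A nonzero determinant means the map is a bijection, and the full rank confirms the image has the same dimension as the domain.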

Because isomorphic vector spaces are the same size and have the same algebraic properties, mathematicians think of them as "the same, for all intents and purposes."


Isomorphism is a rather general notion that occurs in lots of contexts.

Essentially, it means "the same."

In linear algebra, we call two vector spaces $V$ and $W$ isomorphic if there exist linear maps $\alpha: V\to W$ and $\beta: W\to V$ such that $\alpha \circ \beta = \text{id}_W$ and $\beta \circ \alpha = \text{id}_V$. When you have these maps, you can then, using $\alpha$, associate to every vector $v\in V$ a vector $\alpha(v) \in W$. The fact that $\alpha$ is linear means that this map respects the structure of $V$ as a vector space (e.g. for any two vectors $v,w\in V$, $\alpha(v) + \alpha(w) = \alpha(v+w)$), and $\beta$, the inverse to $\alpha$, ensures that you can do the same thing in reverse, from $W$ to $V$. This is probably what your professor meant by having "the same vector structure."
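A minimal sketch of this pair of maps, realizing $\alpha$ and $\beta$ as mutually inverse matrices (the particular matrix is an arbitrary invertible example):

```python
import numpy as np

# alpha and beta realized as mutually inverse 2x2 matrices
# (alpha_mat is an arbitrary invertible example).
alpha_mat = np.array([[1.0, 2.0],
                      [0.0, 1.0]])
beta_mat = np.linalg.inv(alpha_mat)   # beta = alpha^{-1}

# alpha ∘ beta = id_W and beta ∘ alpha = id_V:
assert np.allclose(alpha_mat @ beta_mat, np.eye(2))
assert np.allclose(beta_mat @ alpha_mat, np.eye(2))
```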

