Math

Eigenvalues and Eigenvectors of general linear functions

Take a vector space $V$ over a field $F$.
Let $T: V \to V$ be a linear function, that is, $T\left(\alpha v + \beta w \right) = \alpha T\left( v \right) + \beta T\left( w \right)$ for all $\alpha, \beta \in F$ and $v,w \in V$.
An eigenvector is a nonzero vector $v \in V$ and an eigenvalue is a scalar $\lambda \in F$ such that $T(v) = \lambda v$.
The nonzero fixed points of a linear function are exactly the eigenvectors corresponding to the eigenvalue $\lambda = 1$.
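To make the definitions concrete, here is a minimal numerical sketch (assuming NumPy is available; the matrix below is a hypothetical example, not taken from the text) of a linear map $T: \mathbb{R}^2 \to \mathbb{R}^2$ with eigenvalues $2$ and $1$; the eigenvector for $\lambda = 1$ is a fixed point.

```python
import numpy as np

# Hypothetical linear map T: R^2 -> R^2, represented by a matrix.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# np.linalg.eig returns eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Check the defining relation T(v) = lambda * v.
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam}: eigenvector {v}")

# The eigenvector with lambda = 1 satisfies A @ v == v, i.e. it is a fixed point.
```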

Example: eigenvectors/eigenvalues of the derivative operator
The derivative operator $D: C^{\infty} \to C^{\infty}$ on the vector space of smooth functions over the field of real numbers is a linear function:
$D\left(\alpha f + \beta g \right) = \alpha D\left( f \right) + \beta D\left( g \right)$ for $\alpha, \beta \in \mathbb{R}, f,g \in C^{\infty}$ (verify this).
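As a quick check of this linearity, here is a sketch using SymPy (an assumption here, not part of the original text), with symbolic $f$, $g$, $\alpha$, $\beta$:

```python
import sympy as sp

t, alpha, beta = sp.symbols('t alpha beta')
f = sp.Function('f')(t)
g = sp.Function('g')(t)

# Compare D(alpha*f + beta*g) with alpha*D(f) + beta*D(g).
lhs = sp.diff(alpha * f + beta * g, t)
rhs = alpha * sp.diff(f, t) + beta * sp.diff(g, t)

assert sp.simplify(lhs - rhs) == 0  # the derivative operator is linear
```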
Proposition: the set of eigenvalues of $D$ is all of $\mathbb{R}$, and for each $\lambda \in \mathbb{R}$ the corresponding eigenvectors of $D$ are the functions $f(t) = f_0 e^{\lambda t}$ with $f_0 \in \mathbb{R}$, $f_0 \neq 0$.
Proof:
$\Rightarrow$ It is straightforward to verify that $D(f) = \lambda f$ for $\lambda \in \mathbb{R}$ and $f(t)=f_0 e^{\lambda t}$:
$D(f) = f'(t) = \lambda f_0 e^{\lambda t} = \lambda f(t)$ $\blacksquare$.
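As a side check of this computation, a symbolic sketch (again assuming SymPy):

```python
import sympy as sp

t, lam, f0 = sp.symbols('t lambda f_0', real=True)
f = f0 * sp.exp(lam * t)

# D(f) = lambda * f for f(t) = f_0 * exp(lambda * t)
assert sp.simplify(sp.diff(f, t) - lam * f) == 0
```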
$\Leftarrow$ Suppose, for contradiction, that there exists a function $g$ that is not of the form $f_0 e^{\lambda t}$ for any $f_0 \in \mathbb{R}$, yet $D(g) = \lambda g$.
Then $g'(t) = \lambda g(t)$.
Then:
\[
\begin{aligned}
D\left( g(t) e^{-\lambda t} \right) &= g'(t) e^{-\lambda t} + (-\lambda)g(t) e^{-\lambda t} && \text{product rule} \\
&= \lambda g(t) e^{-\lambda t} + (-\lambda)g(t) e^{-\lambda t} && \text{we assumed } g'(t) = \lambda g(t) \\
&= 0 && \text{algebra}
\end{aligned}
\]
$\Rightarrow$ $D\left( g(t) e^{-\lambda t} \right) = 0$
$\Rightarrow$ $g(t) e^{-\lambda t} = c$ for some constant $c \in \mathbb{R}$, since the only smooth functions with zero derivative are the constants
$\Rightarrow$ $g(t) = c e^{\lambda t}$, which contradicts our assumption that $g$ is not of the form $f_0 e^{\lambda t}$ $\blacksquare$.
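The uniqueness argument can also be illustrated by solving the ODE $g'(t) = \lambda g(t)$ directly (a sketch assuming SymPy; `dsolve` returns the general solution):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', real=True)
g = sp.Function('g')

# Solve g'(t) = lambda * g(t); the general solution is C1 * exp(lambda * t),
# matching the claimed form f_0 * exp(lambda * t).
solution = sp.dsolve(sp.Eq(g(t).diff(t), lam * g(t)), g(t))
print(solution)
```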
It follows that the fixed points of the derivative operator are exactly the functions of the form $f(t)=f_0 e^{t}$.

More examples at this link.