misc2

Just for Working Sessions

Nov 27, 2013 and Dec 6, 2013.

We consider an observation operator $H: X\rightarrow Y$ defined on the space $X$. Our goal is to solve an operator equation of the type \begin{equation} \label{eq org} H \varphi = f \end{equation} with $f \in Y$ given. Here, we think of $X$ as a finite dimensional space; it is then isomorphic to $\mathbb{R}^n$ for some $n \in \mathbb{N}$. In the same way, the finite dimensional space $Y$ is isomorphic to $\mathbb{R}^m$ for some $m \in \mathbb{N}$. A linear operator $H$ then maps each output variable to a sum of multiples of the elements of the vector $\varphi \in X$, i.e. $(H\varphi)_{\ell} = \sum_{j=1}^{n} h_{\ell j} \varphi_{j}$ for $\ell = 1,\dots,m$. If we consider each element $\varphi_j$ of $$ \varphi = \left( \begin{array}{c} \varphi_1 \\ \vdots \\ \varphi_n \end{array} \right) $$ for $j=1,\dots,n$ to belong to a particular point $x_j$ in physical space $\mathbb{R}^d$ of dimension $d \in \{1,2,3\}$, then in general the operator $H$ is not local in the sense that each of its output variables belongs to an individual point in space and depends only on the input at this point. Likewise, the space $Y$ will in general not be local in the sense that each of its variables belongs to one and only one point in the physical space $\mathbb{R}^d$.

Definition. We call an operator $H$ local if for each output variable $f_{\xi}$, $\xi=1,\dots,m$, there is at most one point $x_{j}$ in $\mathbb{R}^d$ such that under the operation of $H$ the variable $f_{\xi}$ is influenced only by variables $\varphi_j$ which belong to the point $x_{j}$.

Examples. Consider the matrix \begin{equation} A = \left( \begin{array}{cc} a_{11} & a_{12} \\ a_{21} & a_{22} \end{array} \right) := \left( \begin{array}{cc} 1 & 1 \\ 1 & -1 \end{array} \right) \end{equation} where $\varphi_{1}$ and $\varphi_{2}$ belong to two different points $x_1$ and $x_2$ in physical space $\mathbb{R}^2$. Then $A$ is not a local matrix, since the first component of $A\varphi$ is influenced by both $\varphi_1$ and $\varphi_2$, i.e. by variables located at two different points $x_1$ and $x_2$ in space. The matrix \begin{equation} B = \left( \begin{array}{cc} b_{11} & b_{12} \\ b_{21} & b_{22} \end{array} \right) := \left( \begin{array}{cc} 0 & 1 \\ 3 & 0 \end{array} \right) \end{equation} however is local, since the output $f_1$ is influenced only by $\varphi_2$, which is located at $x_2$, and $f_{2}$ is influenced only by $\varphi_1$, located at $x_1$. The matrix \begin{equation} \label{C example} C = \left( \begin{array}{cc} c_{11} & c_{12} \\ c_{21} & c_{22} \end{array} \right) := \left( \begin{array}{cc} 1 & 0 \\ 3 & 0 \end{array} \right) \end{equation} is local as well, since both output variables $f_{1}$ and $f_{2}$ are influenced by $\varphi_1$ only.

Lemma. If we have a local operator for which each measurement is influenced by a different point $x_{j} \in \mathbb{R}^d$, then by a reordering of the variables it can be transformed into a diagonal operator. In general, when a point influences two or more output variables, diagonalization by reordering is not possible.

Remark. A reordering operation is equivalent to the application of a permutation matrix $P$, i.e. a matrix which has exactly one element 1 in each row and column, with all other elements zero.

Proof. We first assume that in the state space $X = \mathbb{R}^n$ each element belongs to one and only one point $x_{j}$, $j=1,\dots,n$, with $x_{j}\in \mathbb{R}^d$, where all $x_{j}$ are different. Then, each column of the matrix is multiplied by a variable which belongs to a different point $x_j \in \mathbb{R}^d$. The output variable $f_{\ell}$, $\ell=1,\dots,m$, is influenced by the entries in the $\ell$-th row of the matrix $H$. Thus, $H$ is local if and only if in each row at most one of the entries is non-zero. By the assumption that different measurements are influenced by different points, there can be at most one non-zero entry in each column as well. But this means that the operator $H$ looks like a scaled version of a permutation matrix $P$, with scaling $0$ allowed. Clearly, by reordering we can transform this into a diagonal matrix. For the general case, the matrix (\ref{C example}) serves as a counterexample: both of its outputs are influenced by the single point $x_1$, and no reordering of rows or columns can make $C$ diagonal. This completes the proof. $\Box$
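To illustrate the first part of the lemma, a small numerical sketch with the example matrix $B$ from above: a single row swap, i.e. left-multiplication by a permutation matrix $P$, already yields a diagonal operator.

<code python>
import numpy as np

# B is local and each measurement is influenced by a different point,
# so a reordering (permutation) of the outputs diagonalizes it.
B = np.array([[0.0, 1.0],
              [3.0, 0.0]])
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # permutation matrix: swaps the two rows

print(P @ B)  # [[3. 0.]
              #  [0. 1.]]  -- diagonal
</code>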

Question. Can we find transformations $T: X \rightarrow X$ of the state space and $S: Y \rightarrow Y$ of the observation space, such that $\tilde{H} := S H T^{-1}$ is local?

An approach using Singular Value Decomposition. By the singular value decomposition (SVD) we have \begin{equation} \label{svd} H = U \Lambda V^{T}, \end{equation} where $\Lambda$ is a diagonal matrix and the matrices $U$ and $V$ consist of orthonormal columns. When $V^T$ and $U$ are invertible, we can proceed as follows. We define $$ T := V^{T} $$ and $$ S := U^{-1}. $$ Then, we have \begin{equation} S H T^{-1} = U^{-1} H (V^{T})^{-1} = U^{-1} U \Lambda V^{T} (V^{T})^{-1} = \Lambda. \label{work1} \end{equation} Since the diagonal operator $\Lambda$ is local, we have found the desired transformation by SVD. $\Box$
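A minimal numerical sketch of this construction (assuming a generic square $H$ with random entries; note that `numpy.linalg.svd` returns $V^T$ directly as `Vt`):

<code python>
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((3, 3))   # a generic, non-local operator

U, s, Vt = np.linalg.svd(H)       # H = U diag(s) V^T

T = Vt                            # transformation of the state space X
S = U.T                           # = U^{-1}, since U is orthogonal

H_tilde = S @ H @ np.linalg.inv(T)
print(np.allclose(H_tilde, np.diag(s)))  # True: S H T^{-1} = Lambda
</code>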



Finally, with the transformed state $$ \tilde{\varphi} := T\varphi $$ we can write $$ S H \varphi = S H T^{-1} T \varphi = \tilde{H} T \varphi = \tilde{H} \tilde{\varphi}, $$ such that $\tilde{H} = S H T^{-1}$ is local.
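And a self-contained check of this last identity, under the same assumptions as the SVD sketch above (generic random square $H$):

<code python>
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(H)
T, S = Vt, U.T                        # as in the SVD approach

phi = rng.standard_normal(3)
phi_tilde = T @ phi                   # transformed state
H_tilde = S @ H @ np.linalg.inv(T)    # = Lambda, which is local

# S H phi = H~ phi~, as in the display above.
print(np.allclose(S @ (H @ phi), H_tilde @ phi_tilde))  # True
</code>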
