This is close to what I am thinking of. If X is an n×n matrix in a certain vector space, with entries representing some coefficients, it would be wonderful if we could reduce this representation to some form of 2×2 blocks of its states, even if it is in some other space. For example, in the Markov case...
Thank you for the comprehensive analysis. But should there be any closed form for that function of the eigenvalues of A and the eigenvalues of B?
For example, suppose X represents the transition probabilities in a Markov chain process; will I be able to represent it, using a certain transformation, by a...
But the matrices in the last example have rows and columns which are linear combinations of each other. What if we have a matrix with random entries, like the one in your post #4? Again, will there be any relationship between the eigenvalues of the matrix and those of its block matrices?
I used an online matrix calculator and I found no obvious correlation between the eigenvectors or eigenvalues of ##\begin{bmatrix}\mathbf{A} & 0 \\ 0 & \mathbf{B}\end{bmatrix}## and ##\begin{bmatrix}0 & \mathbf{A} \\ \mathbf{B} & 0\end{bmatrix}##. There is of course an obvious similarity, as...
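A numerical sketch of two relationships that do hold (the 2×2 blocks here are made up for illustration): the eigenvalues of the block-diagonal matrix are the union of the eigenvalues of A and B, while the anti-diagonal block matrix M satisfies ##M^2 = \begin{bmatrix}\mathbf{AB} & 0 \\ 0 & \mathbf{BA}\end{bmatrix}##, so its eigenvalues are the ± square roots of the eigenvalues of AB.

```python
import numpy as np

# Hypothetical example blocks (any square A, B of the same size work).
A = np.array([[2.0, 1.0], [0.0, 3.0]])
B = np.array([[1.0, 4.0], [2.0, 1.0]])
Z = np.zeros((2, 2))

# Block-diagonal: eigenvalues are the union of eig(A) and eig(B).
M_diag = np.block([[A, Z], [Z, B]])
ev_diag = np.sort_complex(np.linalg.eigvals(M_diag))
ev_union = np.sort_complex(np.concatenate([np.linalg.eigvals(A),
                                           np.linalg.eigvals(B)]))
assert np.allclose(ev_diag, ev_union)

# Anti-diagonal: M = [[0, A], [B, 0]] satisfies M^2 = [[AB, 0], [0, BA]],
# so the eigenvalues of M are the +/- square roots of those of AB.
M_anti = np.block([[Z, A], [B, Z]])
ev_sq = np.sort_complex(np.linalg.eigvals(M_anti) ** 2)
ev_ab = np.sort_complex(np.concatenate([np.linalg.eigvals(A @ B),
                                        np.linalg.eigvals(B @ A)]))
assert np.allclose(ev_sq, ev_ab)
```

So the eigenvectors need not look related at all, but the eigenvalues of the anti-diagonal form are tied to the product AB rather than to A and B separately, which may be why no correlation shows up in a direct comparison.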
This is clear if C is a diagonal matrix whose entries are real numbers; in that case, the eigenvectors of C are ##(1,0,0,\dots,0)^T##, ..., ##(0,0,0,\dots,1)^T## and the eigenvalues are the entries themselves. The eigenvectors and eigenvalues of each block will follow the same pattern.
What if C has...
If there is a matrix formed from 2×2 blocks, what will be the relation between the eigenvalues and eigenvectors of that matrix and the eigenvalues and eigenvectors of its sub-matrices?
I appreciate your contribution, but frankly I have no clear idea what you mean. My question has been clear from the beginning and I still have no answer to it: will I be able to decompose UA into U and A, where U is a unique unitary matrix and A a unique diagonal matrix, or not?
U and A are square matrices of the same rank, so UA is a square matrix too, and it should be invertible. But even in this case, how do we find A and its inverse from UA? In other words, can we decompose UA into U and A?
Suppose we have a product formed by multiplying a unitary matrix U and a diagonal matrix A; can we retrieve the inverse of A without knowing either U or A?
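One observation that can be checked numerically (a sketch with a made-up U and A, not a full answer): if M = UA with U unitary and A diagonal, then ##M^H M = A^H U^H U A = A^H A##, so the magnitudes of A's diagonal entries are recoverable from the product alone, but their signs/phases are not, which is why the factorization is not unique.

```python
import numpy as np

# Hypothetical unitary U (from the QR of a random matrix) and diagonal A.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = np.diag([2.0, -0.5, 3.0])
M = U @ A

# M^H M = A^H U^H U A = A^H A: a diagonal matrix of |a_ii|^2,
# so the magnitudes of A's entries are recoverable from M alone.
gram = M.conj().T @ M
mags = np.sqrt(np.diag(gram))
assert np.allclose(gram, np.diag(mags ** 2))
assert np.allclose(mags, np.abs(np.diag(A)))

# The signs are not: U' = U D and A' = D A with D = diag(+/-1)
# give the same product M, so U and A are not uniquely determined.
D = np.diag([1.0, -1.0, 1.0])
assert np.allclose((U @ D) @ (D @ A), M)
```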
In Latent semantic analysis, the truncated singular value decomposition (SVD) of a term-document matrix ##A_{mn}## is
$$A=U_rS_rV^T_r$$
In many references, including Wikipedia, the new reduced document column vector in r-space is scaled by the singular values ##S## before comparing it with other...
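A small sketch of the fold-in step with a made-up term-document matrix: a new document column ##d## is mapped into r-space as ##\hat{d} = S_r^{-1} U_r^T d##. Since ##U_r^T U = [\,I_r \; 0\,]##, folding in an existing column of A recovers the corresponding column of ##V_r^T## exactly; for a genuinely new document it is the analogous least-squares projection.

```python
import numpy as np

# Made-up 4-term x 3-document count matrix.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [3.0, 0.0, 0.0],
              [1.0, 1.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 2                                   # truncation rank
Ur, sr, Vtr = U[:, :r], s[:r], Vt[:r, :]

# Fold a document column d into r-space: d_hat = S_r^{-1} U_r^T d.
def fold_in(d):
    return (Ur.T @ d) / sr

# Folding in an existing column of A reproduces the corresponding
# column of V_r^T, i.e. the document's coordinates in r-space.
d_hat = fold_in(A[:, 0])
assert np.allclose(d_hat, Vtr[:, 0])
```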
Ok, here is an attached image of the trajectory matrix X, whose column vectors have length L, the window length of the series. Now suppose that the time series represented by this matrix has a period which is exactly equal to the time between two successive X values. For example, the...
Let's take a time series with period L. Suppose we arbitrarily choose the window length of the trajectory matrix to be L, which is also the period of the time series. Then the second column of the matrix will also start with the same entry as the first column, because all...
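A quick numerical check of this setup, using a hypothetical periodic series and the standard Hankel trajectory matrix ##X_{ij} = x_{i+j}##: when the series has period P, trajectory-matrix columns that are P lags apart are identical (since ##x_{j+P+i} = x_{j+i}##), so the matrix ends up with low rank regardless of the window length chosen.

```python
import numpy as np

# Hypothetical series: a sinusoid with period P = 4, length N = 20.
P, N = 4, 20
n = np.arange(N)
x = np.sin(2 * np.pi * n / P)

# Trajectory (Hankel) matrix with window length L: column j is x[j : j+L].
L = P                          # window length chosen equal to the period
K = N - L + 1
X = np.column_stack([x[j:j + L] for j in range(K)])

# Columns P lags apart are identical, since x[j + P + i] = x[j + i].
assert np.allclose(X[:, 0], X[:, P])

# A single sinusoid of period P gives a trajectory matrix of rank 2.
assert np.linalg.matrix_rank(X, tol=1e-10) == 2
```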
By "efficient", I meant the ability of the algorithm to extract all possible information about the spectral components of the time series. Real-time analysis is an important concern too.
So I have two concerns here:
1) Does that mean we can construct an infinite number of matrices U and V' by arbitrarily choosing orthonormal column vectors of U and row vectors of V'?
2) Back to my question: U and V' are constructed from AA' = USSU' and A'A = VSSV', but if this is applied to the...
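A sketch of the construction in question (random A for illustration): the eigenvectors of AA' give U and the square roots of its eigenvalues give S, but V' then follows from ##V' = S^{-1}U'A##, so the columns of U and rows of V' cannot be chosen independently of each other.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Eigendecomposition of A A' (symmetric, so eigh applies).
w, U = np.linalg.eigh(A @ A.T)
order = np.argsort(w)[::-1]        # sort singular values descending
w, U = w[order], U[:, order]
s = np.sqrt(w)

# Each column of U fixes the corresponding row of V':
# from A = U S V' we get V' = S^{-1} U' A.
Vt = (U.T @ A) / s[:, None]

# The construction reproduces A, and the rows of V' come out orthonormal.
assert np.allclose(U @ np.diag(s) @ Vt, A)
assert np.allclose(Vt @ Vt.T, np.eye(3), atol=1e-8)
```

The remaining freedom is only a sign (or, for repeated singular values, a rotation) applied jointly to a column of U and the matching row of V', not an arbitrary independent choice.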
For a given matrix A, the singular value decomposition (SVD) yields A = USV'. Now let's perform a dimension reduction of the matrix by keeping only one column vector from U, one singular value from S, and one row vector from V'. Then do another SVD of the resulting rank-reduced matrix Ar.
Now, if Ar is...
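This can be tested directly (random A for illustration): a second SVD of the rank-1 matrix Ar returns the same leading singular value, with the remaining singular values numerically zero, and the leading singular vectors recovered up to an overall sign.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

U, s, Vt = np.linalg.svd(A)

# Rank-1 reduction: keep only the leading singular triplet.
Ar = s[0] * np.outer(U[:, 0], Vt[0, :])

# SVD of the reduced matrix.
U2, s2, Vt2 = np.linalg.svd(Ar)

assert np.isclose(s2[0], s[0])                 # leading singular value survives
assert np.allclose(s2[1:], 0.0, atol=1e-10)    # the rest vanish
# Leading singular vectors agree up to an overall sign.
assert np.isclose(abs(U2[:, 0] @ U[:, 0]), 1.0)
assert np.isclose(abs(Vt2[0, :] @ Vt[0, :]), 1.0)
```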
But in linear regression, we seek to calculate the regressors β0 and β1 by using the different xij as the matrix χ of independent variables. In my example, I am doing the opposite: seeking to calculate the system response function, represented by the matrix χ, in analogy with linear...
Do you mean multivariate linear regression like Y = XB, where Y is a random vector, B is a vector of regressors, and X is a matrix? Can you explain more, please?
This is a good and simple method to calculate all the entries of A, but it still cannot guarantee that the rows and columns of A are not linearly dependent.
Suppose, for simplicity, that A is a 2×2 matrix. For A to be invertible, i.e. for its rows and columns to be linearly independent, the following condition must be satisfied: ##a_{21}/a_{11} \neq a_{22}/a_{12}##.
So no matter which...
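The condition can be checked numerically (made-up 2×2 entries): when ##a_{21}/a_{11} = a_{22}/a_{12}## the rows are proportional and the matrix is singular; when the ratios differ, the determinant is nonzero and the matrix has full rank.

```python
import numpy as np

# Ratios equal: a21/a11 = 3/1 = a22/a12 = 6/2, so the rows are
# proportional and det = 0.
A_dep = np.array([[1.0, 2.0],
                  [3.0, 6.0]])
assert np.isclose(np.linalg.det(A_dep), 0.0)
assert np.linalg.matrix_rank(A_dep) == 1

# Ratios differ (3 vs 3.5): nonzero determinant, full rank, invertible.
A_ind = np.array([[1.0, 2.0],
                  [3.0, 7.0]])
assert not np.isclose(np.linalg.det(A_ind), 0.0)
assert np.linalg.matrix_rank(A_ind) == 2
```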
In my mind, the linear dependency depends on the values of the matrix, because the inputs can be chosen arbitrarily. But the values of the matrix cells are themselves unknown, so how can we make sure that the matrix coefficients give linearly independent sets of equations before they are calculated?
Suppose we represent the input information as an (n×1) column vector, the output information as another (n×1) column vector, and the system response function as an (n×n) matrix. My question: is it possible to calculate the values of the cells of the matrix, knowing the input and the output?
For...
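A sketch of how this could work, under one assumption (that n linearly independent input vectors are available; all names here are made up): stacking the inputs as the columns of a matrix P and the matching outputs as the columns of Q gives Q = XP, so X = QP⁻¹. A single input/output pair gives only n equations for the n² unknowns, which is why n independent pairs are needed.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

# Hypothetical "true" system matrix, used only to generate the data.
X_true = rng.standard_normal((n, n))

# n input column vectors, stacked as the columns of P; they must be
# linearly independent for P to be invertible.
P = rng.standard_normal((n, n))
Q = X_true @ P                    # the corresponding output columns

# One pair gives n equations for n^2 unknowns; n independent pairs
# determine the matrix completely: X = Q P^{-1}.
X_est = Q @ np.linalg.inv(P)
assert np.allclose(X_est, X_true)
```

This ties back to the linear-dependence worry above: the recoverability condition is on the chosen *inputs* (P invertible), not on the unknown matrix itself.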