First post here so please be easy on me.
I have an application where each data point is high-dimensional and is represented by an $m \times n$ matrix. Call one of them $X_1$. I have to compare this data point to many other data points of the same dimensions, call them $X_k$ with $k\in \{2,3,\dots,z\}$, give each comparison a score, and then choose the $X_k$ that is most similar (closest) to $X_1$. The way I am currently implementing this score is just the Euclidean distance, i.e. the Frobenius norm of the difference. Let $s_k$ be the score,
$$s_k = ||X_1-X_k||_F$$
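To make this concrete, here is a minimal sketch of the current scoring (written in Python/NumPy for illustration rather than my actual code; `candidates` is just a placeholder name for the list of $X_k$):

```python
import numpy as np

def frobenius_scores(X1, candidates):
    """Score each candidate matrix by its Frobenius-norm distance to X1."""
    return [np.linalg.norm(X1 - Xk, ord='fro') for Xk in candidates]

# pick the candidate with the smallest score, i.e. the one closest to X1
# scores = frobenius_scores(X1, candidates)
# best = candidates[int(np.argmin(scores))]
```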
Now I am taking a class where we are looking at many different distance measures. I know my data set has some covariance structure (it is not IID), so it has come up that perhaps a better way of measuring distance is the Mahalanobis distance, which normalizes the distance from the mean using the covariance matrix, so that distances are measured relative to the principal components. Here $\hat{x}$ and $\hat{\mu}$ are row vectors representing the data point and the mean of the distribution, and $C$ is the covariance matrix.
$$D_M=\sqrt{(\hat{x}-\hat{\mu})\,C^{-1}\,(\hat{x}-\hat{\mu})^T}$$
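As a sanity check on the definition, here is a small sketch of $D_M$ for a single vector (again Python/NumPy, purely for illustration; estimating $C$ with `np.cov` from some sample is my assumption, not part of the definition):

```python
import numpy as np

def mahalanobis(x, mu, C):
    """Mahalanobis distance of vector x from mean mu, given covariance C."""
    d = x - mu
    # solve C y = d instead of forming C^{-1} explicitly (more stable)
    return np.sqrt(d @ np.linalg.solve(C, d))

# example with made-up 2-D data
# rng = np.random.default_rng(0)
# sample = rng.normal(size=(500, 2))              # rows are observations
# mu, C = sample.mean(axis=0), np.cov(sample, rowvar=False)
# print(mahalanobis(sample[0], mu, C))
```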
My question here is: can I use the Mahalanobis distance as my scoring function instead of the Frobenius norm?
The issues I see are that the Mahalanobis distance is not a pairwise measure but rather the distance from a point to the mean of a distribution, and that my data points are 2-dimensional matrices while the Mahalanobis distance is defined on vectors.
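To make the question concrete, the kind of adaptation I have in mind is: vectorize each matrix into an $mn$-vector, estimate a covariance matrix from the vectorized $X_k$, and plug the pairwise difference $X_1 - X_k$ into the Mahalanobis quadratic form in place of $\hat{x}-\hat{\mu}$. A sketch (Python/NumPy; the covariance estimate from the candidates and the ridge regularization are my own assumptions, and with few samples in $mn$ dimensions the estimated covariance would be singular without it):

```python
import numpy as np

def pairwise_mahalanobis_scores(X1, candidates, eps=1e-6):
    """Mahalanobis-style score between X1 and each candidate matrix.

    Each matrix is flattened to an mn-vector; the covariance is estimated
    from the flattened candidates and ridge-regularized so it is invertible.
    """
    V = np.array([Xk.ravel() for Xk in candidates])        # (z-1) x mn
    C = np.cov(V, rowvar=False) + eps * np.eye(V.shape[1])  # mn x mn
    Cinv = np.linalg.inv(C)
    x1 = X1.ravel()
    return [np.sqrt((x1 - v) @ Cinv @ (x1 - v)) for v in V]
```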
Thanks
Comment: You can vectorize each matrix using the reshape function in MATLAB, so each data point will be an $mn \times 1$ vector. – Maxtron Nov 15 '19 at 15:52