The Gabor transform (the Short-Time Fourier Transform with a Gaussian window) is represented as $U(t,\omega)$, where $t$ is time and $\omega$ is angular frequency. The author of the monograph *Seismic Inverse Q Filtering* (p. 190) asks for the Gabor spectrum to be transformed from 2D to 1D in the following fashion:
- Define $\chi = \omega t$
- Map the 2D amplitude spectrum $|U(t,\omega)|$ to a 1D amplitude spectrum $|U(\chi)|$
The issue that I am struggling with is that $\chi$ is the product of $\omega$ and $t$.
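The way I read it, a fixed value of $\chi$ does not correspond to a single point of the spectrum: the level set $\omega t = \chi$ is the hyperbola $\omega = \chi/t$ in the $(t,\omega)$ plane, so many different time-frequency samples share the same $\chi$.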
The Matlab code below demonstrates that $\chi$ does not increase monotonically when the $(\omega, t)$ grid is flattened into a single vector; note the "sawtooth" appearance of $\chi$ in the resulting plot.
So how do I turn the 2D spectrum into a 1D signal? For each value of $\chi$, there should be exactly one $|U(\chi)|$.
```matlab
N0 = 51;                  % number of frequency samples
N1 = 1604;                % number of time samples
fs = 1.0e3;               % sampling frequency [Hz]

f     = (fs/2) * linspace(0, 1, N0);   % frequency axis, 0 .. fs/2 [Hz]
omega = 2 * pi * f;                    % angular frequency [rad/s]

dt = 1 / fs;
T  = (N1 - 1) * dt;
t  = linspace(0, T, N1);               % time axis [s]

% Flatten the (omega, t) grid into one vector of chi = omega * t values
chi = zeros(1, length(omega) * length(t));   % preallocate
cnt = 1;
for i = 1:length(omega)
    for j = 1:length(t)
        chi(cnt) = omega(i) * t(j);
        cnt = cnt + 1;
    end
end

figure;
plot(chi);
xlabel('Flattened (\omega, t) sample index');
ylabel('\chi = \omega t');
```
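Just to make the goal concrete, here is one naive interpretation I can think of (I am not at all sure this is what the author intends): flatten the grid, bin the $\chi$ values, and keep one averaged amplitude per bin. In the sketch below, `omega` and `t` are reused from the snippet above, the amplitude matrix `A` is random placeholder data, and the bin count is arbitrary.

```matlab
% Naive sketch, NOT necessarily what the book intends: bin the flattened
% chi values so that every chi bin ends up with exactly one amplitude.
% omega and t are reused from the snippet above; A is placeholder data.
A = rand(length(omega), length(t));          % stand-in for |U(t,omega)|

[OMEGA, TT] = ndgrid(omega, t);              % 2D grids matching A
chi = OMEGA(:) .* TT(:);                     % flattened chi = omega * t
amp = A(:);                                  % flattened amplitudes

nbins = 500;                                 % arbitrary bin count
edges = linspace(0, max(chi), nbins + 1);
bin   = discretize(chi, edges);              % chi bin for every (omega,t) pair

chi1d = 0.5 * (edges(1:end-1) + edges(2:end));        % bin centres
U1d   = accumarray(bin, amp, [nbins 1], @mean, NaN);  % one |U| per chi bin

figure;
plot(chi1d, U1d);
xlabel('\chi = \omega t');
ylabel('|U(\chi)|');
```

Whether averaging over each $\chi$ bin is the right way to collapse the hyperbolas of constant $\chi$ is exactly what I am unsure about.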

The Gabor spectrum itself, however, is a 2D matrix, with $\omega$ on one axis and $t$ on the other axis.

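For reference, here is a minimal sketch of how such a 2D matrix might be produced in Matlab, using `spectrogram` with a Gaussian window; the chirp test signal, window length, overlap, and FFT size are arbitrary placeholders rather than anything from the book.

```matlab
% Minimal sketch (arbitrary test data): a Gabor-style spectrogram, i.e. an
% STFT with a Gaussian window, giving a 2D amplitude matrix |U(t, omega)|.
fs = 1.0e3;                      % sampling frequency [Hz]
tx = 0:1/fs:1.6;                 % time axis for a test signal
x  = chirp(tx, 20, 1.6, 200);    % arbitrary test signal (linear chirp)

win      = gausswin(128);        % Gaussian window -> Gabor transform
noverlap = 120;                  % samples of overlap between windows
nfft     = 256;                  % FFT length

[U, F, T] = spectrogram(x, win, noverlap, nfft, fs);
A = abs(U);                      % 2D amplitude spectrum |U(t, omega)|

% Rows of A correspond to frequency (omega = 2*pi*F), columns to time T.
figure;
imagesc(T, F, A); axis xy;
xlabel('t [s]');
ylabel('f [Hz]');
```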