I am trying to create a chirp signal in the frequency domain by setting the magnitude of each frequency bin to one and then choosing the phase delay function so that the resulting time delay increases with frequency. I am trying to calculate the length of the resulting chirp from the phase delay function.
If the time delay for each frequency, $td(f)$, equals the period multiplied by the number of cycles introduced by the phase delay, and $\phi(f)$ is the phase delay in radians as a function of frequency, then
$$ td(f) = \frac{\phi(f)}{2\pi}\frac{1}{f} $$
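As a quick sanity check of this relationship (with illustrative numbers of my own): a phase delay of $\pi$ radians at $100$ Hz corresponds to half a period of delay,
$$ td(100) = \frac{\pi}{2\pi}\frac{1}{100} = 5\,\text{ms} $$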
And if we define $\phi(f)$ as,
$$ \phi(f) = 2\pi nf^2 $$
Then it follows that the time delay, $td(f)$, is given by,
$$ td(f) = \frac{2\pi nf^2}{2\pi}\frac{1}{f} = nf $$
And it seems obvious to me that the maximum time delay would occur when $f$ is at its maximum, $\frac{f_s}{2}$, where $f_s$ is the sampling frequency.
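As a concrete check with the numbers I use in the code below ($f_s = 44100$ Hz and $n = 0.25/(f_s/2)$), the maximum delay should then be
$$ td_{\max} = n\,\frac{f_s}{2} = \frac{0.25}{f_s/2}\cdot\frac{f_s}{2} = 0.25\ \text{s} $$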
However, when I try to synthesise the signal (in Matlab):
% Create frequency axis (0 Hz up to just below Nyquist, 1 Hz spacing)
fs = 44100;
fAx = 0:1:(fs/2 - 1);
% Define n so that the maximum td = n*(fs/2) = 0.25 s, and define the phases
n = 0.25/(fs/2);
phi = -2*pi*n*fAx.^2;
% Calculate the maximum time delay
td = n*(fs/2);
% Build the unit-magnitude spectrum from the phase of each frequency up to Nyquist
chirp_fft = exp(1i*phi);
% Create Hermitian symmetric signal and ifft
chirp_fft = [chirp_fft(1:end-1), 0, flip(conj(chirp_fft(2:end-1)))];
chirp = real(ifft(chirp_fft)); % spectrum is Hermitian, so discard the numerical imaginary residue
% Create time axis and plot chirp
tAx = 0:1/fs:(length(chirp) - 1)/fs;
plot(tAx, chirp)
So by my calculation, the chirp should have a maximum delay of 0.25 seconds. However, when I plot it, the chirp is 0.5 s long. When I try this for other values, the resulting chirp is consistently twice as long as my calculation says it should be. Practically this is not a problem, but I'd really like to understand where I've gone wrong! Where is my missing factor of two?
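In case it is useful, here is a minimal numerical check I can run after the code above (a sketch reusing fAx, phi, n and fs; the names grpDelay, maxGrpDelay and predictedTd are just illustrative): it differentiates the phase I defined to estimate the delay of each frequency as a group delay, $-\frac{1}{2\pi}\frac{d\phi}{df}$, and compares its maximum with my predicted $n\,\frac{f_s}{2}$.
% Numerically estimate the delay per frequency as a group delay, -(1/(2*pi))*dphi/df,
% and compare its maximum with my predicted maximum delay n*(fs/2)
grpDelay = -(1/(2*pi)) * diff(phi) ./ diff(fAx); % seconds, one value per bin gap
maxGrpDelay = max(grpDelay)                      % comes out near 0.5 s for these values
predictedTd = n*(fs/2)                           % my prediction from td(f) = nf, i.e. 0.25 s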
$$ \phi(f) = 2\pi\int_{0}^{f} \tau(f')\, df' $$
which, taking the time delay $\tau(f) = nf$ as the group delay, gives $\phi(f) = \pi n f^2$.
So in this case, I still seem to get the same answer for my time delay calculation.
– Leccy PW Oct 26 '20 at 09:40