The temperature is not the average kinetic energy.
This is a bad habit that most chemists and physicists pick up because it happens to be true for an ideal gas, and for most systems in ordinary regions of the phase diagram. In thermodynamics, however, temperature is defined in terms of the thermodynamic beta,
$$
\frac{1}{k_b}\left(\frac{\partial S}{\partial E}\right)_{V,N}=\beta
$$
where $\beta=1/(k_bT)$.
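As a quick check of that ideal-gas claim, the monatomic Sackur–Tetrode entropy (a standard textbook result, quoted here only as a sketch) gives
$$
S=Nk_b\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi mE}{3Nh^2}\right)^{\!3/2}\right)+\frac{5}{2}\right]
\quad\Longrightarrow\quad
\frac{1}{k_b}\frac{\partial S}{\partial E}=\frac{3N}{2E}=\frac{1}{k_bT}
\quad\Longrightarrow\quad
E=\tfrac{3}{2}Nk_bT,
$$
and since all of an ideal gas's energy is kinetic, this is exactly the familiar $\langle\text{KE}\rangle=\tfrac{3}{2}k_bT$ per particle.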
Returning to the general definition: during a phase change the temperature is constant at the transition temperature $T_m$, so the ratio of the change in entropy to the change in total energy (kinetic plus potential) is fixed at $\Delta S/\Delta E = 1/T_m$. Between the moment the phase change begins and the moment it ends, entropy and total energy therefore change in lockstep, and unless the phase change is quite unusual, both must have changed from their initial values; it wouldn't be a phase change if neither entropy nor energy changed. Because temperature is defined in this way, and not as $T\propto\langle \text{KE}\rangle$, it is perfectly reasonable for the kinetic energy to either increase or decrease during a phase change.
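If it helps to see this numerically, here is a small toy sketch (all numbers are invented for illustration; `T_m`, `E1`, `E2` are arbitrary): build a caloric curve $T(E)$ that is flat over a coexistence window, integrate $dS=dE/T$ to get $S(E)$, and then recover a constant temperature from $1/T=\partial S/\partial E$ even though both $S$ and $E$ change across the window.

```python
import numpy as np

# Toy sketch (invented numbers, not any real substance): a microcanonical
# caloric curve T(E) that is flat at T_m over the coexistence window [E1, E2].
T_m = 2.0                    # transition temperature, in units where k_b = 1
E1, E2 = 5.0, 9.0            # energies where the phase change begins / ends

E = np.linspace(1.0, 14.0, 2001)
T_of_E = np.where(E < E1, T_m * E / E1,                  # "solid" branch
         np.where(E > E2, T_m * (1.0 + (E - E2) / E2),   # "liquid" branch
                  T_m))                                   # coexistence: T constant

# Entropy from dS = dE / T (trapezoidal integration), then recover the
# temperature from the definition 1/T = dS/dE by finite differences.
dE = E[1] - E[0]
S = np.concatenate(([0.0],
                    np.cumsum(0.5 * (1.0 / T_of_E[1:] + 1.0 / T_of_E[:-1]) * dE)))
T_recovered = 1.0 / np.gradient(S, E)

inside = (E > E1 + 0.5) & (E < E2 - 0.5)
dS_coex = np.interp(E2, E, S) - np.interp(E1, E, S)
print("T across the coexistence window:", T_recovered[inside].min(), "to", T_recovered[inside].max())
print("Delta E =", E2 - E1, "  T_m * Delta S =", T_m * dS_coex)  # these agree, up to grid error
```

The point of the toy is only that a flat stretch of $T(E)$ forces $\Delta E = T_m\,\Delta S$ over the transition; it says nothing about how that $\Delta E$ is split between kinetic and potential parts.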
For instance, in normal systems one would expect the entropy of the liquid to be larger than the entropy of the solid. By the relation above, the total energy must then increase by an amount $T_m\Delta S$ on melting, which makes sense: one expects the potential energy to rise as the ordered, tightly bound structure of the solid is broken up. Nothing in this bookkeeping constrains the kinetic energy on its own; the definition of temperature only ties the change in entropy to the change in total energy, so the extra energy can go entirely into the potential part, and the average kinetic energy is free to stay put or even to drop. In other words, the entropic gain is what pays for the energetic cost of melting. With this picture, it is not too surprising that the average kinetic energy can decrease during a phase change.
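To put a rough number on it, using the familiar textbook values for ice at its normal melting point ($T_m\approx 273\ \mathrm{K}$, $\Delta S_{\text{fus}}\approx 22\ \mathrm{J\,mol^{-1}\,K^{-1}}$) and neglecting the small $p\Delta V$ term,
$$
\Delta E\approx T_m\,\Delta S\approx(273\ \mathrm{K})\times(22\ \mathrm{J\,mol^{-1}\,K^{-1}})\approx 6\ \mathrm{kJ\,mol^{-1}},
$$
which is the latent heat of fusion; it is absorbed without the temperature rising, and it goes into the configurational (potential) energy of the molecules rather than being forced into $\langle\text{KE}\rangle$.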